This report isn't saying anything new; the PC market has been "dying" for some time now. I just don't understand why. In a sense I do - there has been a shift toward tablet and mobile computing. But I don't understand how either of these emergent devices takes away from the PC market. Obviously there are people who have effectively replaced their PC with mobile devices and/or tablets, but I can't quite wrap my mind around their situations. I can't imagine a tablet replacing my PC, and I'm not even a particularly tech-y person. There are still a fair number of programs I use that I could not use on a tablet, ChemDraw for one. I guess I'm looking for insight into what sorts of individuals/employees have effectively replaced their PC with a tablet.
It makes me sad to see desktops begin to go by the wayside. It's not so much a matter of what one can and can't do on tablets or smartphones; it's that people rarely OWN their device like I OWN my PC. I built it and I can do whatever I want with it and no walls stand in my way. Buy a smartphone on contract and suddenly there are walled gardens everywhere: locked bootloaders, locked stores, and apps where communities wall themselves off. Even if you buy a tablet off contract, where you can load whatever you want, the device itself will still be obsolete much faster than a PC, which I can keep upgrading for years. It's incredibly frustrating to watch 3D printing, Arduino, etc., which promise to free us from typical manufacturing paradigms, come along, and at the same time see electronics (and the internet), once a bastion of intellectual freedom, begin slipping under new regulations, contracts, walled gardens, etc. I want to really own the things I own. I want to be able to run the programs I want to run on the OS I want on the hardware I choose! Sigh...
You probably have no memory of the day Gateway 2000 started selling complete computers and opened the floodgates to a hundred million idiots who had seen an ad for CompuServe in the back of Omni. In the land of PC/DOS, you bought IBM, you bought Compaq, or - if you weren't a major multinational corporation - you rolled your own. First Gateway, then Dell, then EVERYBODY attempted to sell PCs to people who had no reason for them. The result was a glut of PCs with no resale value whatsoever. It's taken twenty years for the component manufacturers to figure out how to make their margins - and their answer has been to go after the gamers. The effect has been to winnow the field down to component manufacturers who can make modestly priced, high-performance components using much smaller production runs than Acer or Dell or Lenovo or whoever can manage. So in the end, we're back where we started - with PC makers being a market segment rather than a market force and with most serious people rolling their own. Do not mourn the passing of the mainstream desktop. It is an anomaly in the flow of time, a side effect of every household running Windows 98 because it was the only way to check your AOL. We are better for its absence.
While I agree that most serious people will continue to "roll their own", the fall of traditional desktops does have an impact on those who will always build their own computer as well as on traditional consumers. From a practical standpoint I imagine it will mean more expensive components as fewer people buy them. It could also mean less competition as manufacturers leave the desktop market for mobile computing, and with less competition, less innovation. Already, most important new display technologies, like OLED, are found almost entirely on mobile platforms. It will be interesting to see, when graphene or photonic or other next-generation chips become available, where they appear first: desktops or mobile.

Honestly, though, I'm not too worried about the practical hardware considerations; desktops will always have at the very least a market among gamers (like you pointed out). I'm more concerned about the general trend towards convenience and simplicity over freedom and deep content creation, of which mobile computing is both a symptom and a promoter. Cellphones especially, but mobile computing in general, want to present information as quickly and as succinctly as possible. Consumers have bought into this, since instant gratification and convenience are very nice. But we have bought it at a price, at least thus far: devices we intend to replace every year or two, lock-in to disparate ecosystems, sound bites instead of the reality of complexity. There are certainly many apps and new OSes that try to combat this, but I can't shake the feeling that a society on mobile technology maybe sees more, but grasps less. This is all very cynical of me, of course, and being a PC gamer makes me (probably very) biased.
Well, here's the thing. Back in the day, computers were things nerds had. Then they were things everybody had. Smartphones were things nerds had. Then they were things everybody had. Tablets were things nerds had. Then they became things everybody had. What we're seeing is device optimization for the tasks people use them for - my dad used to have a 6,000-node network and by his statistics, 90% of the computer usage was Outlook and Internet Explorer. You just don't need a desktop for web and email.

At the same time, the barrier to entry for manufacturing has gotten a lot lower. You look at something like the Raspberry Pi or the Pebble and you see ASICs and custom components on what are essentially boutique items. A run of a hundred video cards suddenly becomes economically feasible. Rapid prototyping has hit the experimenter shed. You can do things with an Arduino that used to require an ASIC. CPUs are the province of four companies - Motorola, Apple, Intel and AMD - but CPUs used to be the province of Motorola, Cyrix, Intel and AMD, so have we really lost anything? Alienware and Rosewill were never selling to Asus or Dell. The gamer gear is the gamer gear and shall be the gamer gear forever and ever amen. OLED is found in mobile platforms because there's no advantage to OLED on the desktop - lemme tell ya, I used to be a member of SID and the problems inherent in OLED are such that you want to avoid the technology unless there simply isn't a better choice.

The kids are all right, man. I promise. When 95% of the world doesn't use anything outside the browser, we're better off not optimizing computers to run Internet Explorer. If an iPad will do everything your girlfriend needs to do, there's no reason to saddle her with an XPS.
> The gamer gear is the gamer gear and shall be the gamer gear forever and ever amen.

While it might still hold true for accessories, I rather disagree with this sentiment in the context of PC hardware. Long, long ago, I built my own computer. It was "pretty good" for the time. It could run games, pretty good games. It also cost well upwards of a thousand dollars. It has lasted many years, but nowadays, if I wanted to buy something similar, it would cost me half the price. And that's not even taking into account the horizontal explosion of games - breadth of titles rather than ever-steeper graphics requirements. It's much less common that one requires "high end" hardware to play a given game on the market. Those toys are now targeted towards those with benchmarking fetishes (and, I suppose, professional digital artists).
Not sure what your point is. That you don't need the high end for gaming? Fine. I've always needed the high end, either for CAD or music. My first CAD machine had a hard drive that cost double what you spent on your "own computer." My current computer has software that costs double what you spent on your "own computer." So the high end will be there for those who need it. Which is my point. The dilution of Acer's desktop marketshare will not change that because for those of us who need a 12-core Mac Pro, Acer's marketshare has never been relevant.
> That you don't need the high end for gaming?

Just that the market for gamers is rapidly shrinking as well. I don't have a particularly good picture of the size of the professional (soft|hard)ware market, other than everything in it being ridiculously overpriced compared to the cost of production. I can, however, imagine a day when even the computing power for video and graphics production (audio, too?) is offloaded to a far-off server, leaving behind just the tablet / keyboard interface as the computer client for the artist.
> I can, however, imagine a day when even the computing power for video and graphics production (audio, too?) is offloaded to a far-off server, leaving behind just the tablet / keyboard interface as the computer client for the artist.

That's because you don't understand the bandwidth required. A feature film at 2K is pretty much 10TB of data. Even offloading that sort of pipeline from your local machine to a machine down the hall invokes a SAN. If the gamers will no longer need high-end computing, then does it matter if the gamers are running on entry level hardware? And if you don't understand my world, how do you know it's "ridiculously overpriced?" Are there not economics of production there, too? Or are you turning opinions into facts in order to maintain your worldview?
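(For anyone wondering where a figure like 10TB comes from, here's a rough back-of-envelope sketch; the frame size, runtime, and element multiplier below are illustrative assumptions, not figures from the thread.)

```python
# Back-of-envelope check on "a feature film at 2K is pretty much 10TB".
# Frame size, runtime, and the element multiplier are assumptions for illustration.

BYTES_PER_FRAME = 2048 * 1556 * 4   # 2K full-aperture DPX: 10-bit RGB packed into 32-bit words
FPS = 24
RUNTIME_MIN = 100

frames = FPS * RUNTIME_MIN * 60
one_pass_tb = frames * BYTES_PER_FRAME / 1e12
print(f"one finished pass of frames: ~{one_pass_tb:.1f} TB")

# Camera originals, VFX plates, intermediate renders and alternate versions
# multiply that single pass several times over (the factor here is a guess).
ELEMENT_MULTIPLIER = 5
print(f"with working elements: ~{one_pass_tb * ELEMENT_MULTIPLIER:.0f} TB")
```

Under those assumptions a single pass of final frames is roughly 1.8 TB, and a working project lands in the high single digits of terabytes, which is consistent with the 10TB figure.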
> A feature film at 2K is pretty much 10TB of data. Even offloading that sort of pipeline from your local machine to a machine down the hall invokes a SAN.

Well yes, that's the point. Instead of building monster desktop machines, you build monster server machines and monster storage arrays, in bulk. 10 TB might be a lot to a desktop, but for AWS it's a drop in the bucket. My last reference for video work (also a few years ago) was being told that you'd want at least 16 GB for compositing, though more was always better. Let's pull numbers out of my ass and quadruple that for today. I can find one of those on EC2 already. Not only that, it's barely $2/hr. Storing 10 TB? Fire up a couple of EBS volumes and stripe your data across them. Note that I Am Not A Network Engineer (professionally, anyway), so AWS might not be the right service for the job, and chances are this is not optimal advice. Additionally, you still need a service that provides a low-latency connection to your artists' thin clients, and I don't have a good reference for EC2 instances' latency over time. But hey, it's a start?

Now to mull through the rest of your questions:

> If the gamers will no longer need high-end computing, then does it matter if the gamers are running on entry level hardware?

Above entry level, below $1k in final price. Costs are shrinking, though still not non-existent.

> And if you don't understand my world, how do you know it's "ridiculously overpriced?" Are there not economics of production there, too? Or are you turning opinions into facts in order to maintain your worldview?

Ridiculously overpriced from the point of view of an amateur. My impression was always that the numbers were set to what the market could bear. What prices make sense in the context of a $10m production still seemed insane to a teenage thundara wanting to toy around with 3D renderers back in the day.
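(A rough cost sketch of the rent-it-from-AWS idea above; every price and the hours figure are assumptions for illustration, not actual AWS rates.)

```python
# Rough monthly cost sketch for the "rent the big iron" idea.
# All prices and sizes are illustrative assumptions, not current AWS rates.

INSTANCE_PER_HOUR = 2.00    # assumed big-memory EC2 instance, $/hr (the "~$2/hr" figure above)
EBS_PER_GB_MONTH = 0.10     # assumed block-storage price, $/GB-month
PROJECT_TB = 10             # the 10 TB figure from the thread
HOURS_PER_MONTH = 160       # one artist working roughly full time

compute = INSTANCE_PER_HOUR * HOURS_PER_MONTH
storage = EBS_PER_GB_MONTH * PROJECT_TB * 1000

print(f"compute: ~${compute:,.0f}/month")
print(f"storage: ~${storage:,.0f}/month")
print(f"total:   ~${compute + storage:,.0f}/month")
```

Under these made-up numbers the storage, not the compute, dominates the bill.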
> Well yes, that's the point. Instead of building monster desktop machines, you build monster server machines and monster storage arrays, in bulk.

The part you're missing is that I need it in my house. I need the throughput. Here's a calculator. RED RAW, at 4k, is 36MB/sec. If I'm editing, I need a minimum of 2, more like 4, possibly 6 or 8 streams. Now we're up to 70, 140, or 280 MB/sec just to pull the media from the drive to my workstation. I'm on a pedestrian movie right now. I've got fifteen tracks of dialog. Each one of those is 48kHz, 24-bit... except I'm working at 32-bit FPU. That's 23Mbit/s before we even get into the beds, the FX, the foley, the music, or any of the rest. My data pathways are such that if I have the video and audio on the same SATA drive I get crunches.

This is why I said SAN - because I expected you to notice the different varieties of exotic file transport protocols necessary to make this shit work over distances. Have you ever had to seriously consider a Fibre Channel card? I'm pretty much there the minute I start moving this shit into another room... and I'm suddenly in the land of SAS drives.

So yes. I know what AWS is. No, it does not work for what we do. "At least 16GB for compositing?" I don't even know what that means.

> Ridiculously overpriced from the point of view of an amateur. My impression was always that the numbers were set to what the market could bear.

We're friends. Which is why I say, with affection, "you're talking out of your ass." I'm not. Don't make us both upset by deliberately misunderstanding things I do for a living to make a point you can't support.
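(For reference, the throughput figures above check out; the per-stream rate and the track/bit-depth numbers are taken from the comment, the rest is straight arithmetic.)

```python
# Straight multiplication of the figures quoted above; no container or protocol overhead.

STREAM_MB_S = 36                # RED RAW at 4k, per stream (figure from the thread)
for streams in (2, 4, 8):
    print(f"{streams} streams: {streams * STREAM_MB_S} MB/s")

# Fifteen dialog tracks at 48 kHz, 32-bit float:
tracks, sample_rate, bits = 15, 48_000, 32
audio_mbit_s = tracks * sample_rate * bits / 1e6
print(f"dialog only: ~{audio_mbit_s:.0f} Mbit/s ({audio_mbit_s / 8:.1f} MB/s)")
```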
> RED RAW, at 4k, is 36MB/sec.

Technically under a gigabit, and the same is true almost up to 4 streams, but that's just copy time. Once it's there, the bandwidth between storage and instances would easily meet that requirement; then all you need transferred back to your end is the current image / audio. Still, as much as I hate to say it, residential internet speeds are still outrageously slow, variable, and expensive, but I'm just speculating about the future for fun right now. (If you're working in a studio, I would think it'd be less outlandish to find those speeds to the wide web, but I'm unsure of your current setup.)

> "At least 16GB for compositing?" I don't even know what that means.

16 GB of RAM for video editing / compositing (not trying to talk down, just unsure if the "video" prefix is applicable or redundant to the latter).

> We're friends. Which is why I say, with affection, "you're talking out of your ass." I'm not. Don't make us both upset by deliberately misunderstanding things I do for a living to make a point you can't support.

Fair enough :P
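(A quick check of the "technically under a gigabit... almost up to 4 streams" claim; the per-stream rate is the 36MB/sec figure from above, and protocol overhead is ignored.)

```python
# Does N streams of RED RAW at ~36 MB/s fit inside a 1 Gbit/s link?
STREAM_MB_S = 36                 # per-stream figure from the thread
GIGABIT_MB_S = 1000 / 8          # 1 Gbit/s expressed in MB/s (125 MB/s, no overhead)

for streams in range(1, 5):
    needed = streams * STREAM_MB_S
    verdict = "fits" if needed <= GIGABIT_MB_S else "does not fit"
    print(f"{streams} stream(s): {needed} MB/s -> {verdict} in 1 Gbit/s")
```

Three streams squeak under 125 MB/s; four do not, which matches the "almost up to 4 streams" wording.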
> Technically under a gigabit, and the same is true almost up to 4 streams, but that's just copy time. Once it's there, the bandwidth between storage and instances would easily meet that requirement; then all you need transferred back to your end is the current image / audio.

...so you're seriously advocating a workflow where I'm reliant on "the cloud" for monitoring and editing 4k video? Now you're just being ludicrous.

> but I'm just speculating about the future for fun right now.

No, you're wishing and presenting it as fact in order to argue that you have a leg to stand on. You don't. If gigabit Ethernet isn't fast enough for data transport for what I do, there will be no WAN fast enough for the foreseeable future. A T3 is 4mbit. You're stating that somehow 1Gbit is going to happen at a pedestrian level... all so that I can put my 10TB per movie on someone else's server.

> 16 GB of RAM for video editing / compositing (not trying to talk down, just unsure if the "video" prefix is applicable or redundant to the latter).

You misunderstand me. I know what compositing is. I'm questioning what 16GB has to do with it. You're now arguing for RAM on an individual machine, while my argument has been (and has been clarified three times now) that even with the skookum fast machine in the sky, the pipe betwixt here and there cannot be made fast enough on an internet backbone. Just to drive home a point: I've got 20GB of RAM and it isn't enough. 64-bit workflows allow RAM caching and my next machine will probably have 64GB or more. And I don't do video.
> No, you're wishing and presenting it as fact in order to argue that you have a leg to stand on.

I've been saying from the start that this is a "one day" thing and not a "today" thing. I'm also pointing out that the initial transfer operation is a one-time thing. Yeah, it's slow. But it's a day's worth of copy time on gigabit (don't know why you brought a T3 into the equation, and your number is off by an order of magnitude).

> You're stating that somehow 1Gbit is going to happen at a pedestrian level... all so that I can put my 10TB per movie on someone else's server.

It's absolutely possible, and the intention in the long run would be to lower costs. Instead of rolling your own storage array, instead of building your own monster machine, you buy a little time on another network that has been optimized to do all these things on a massive scale. This would turn that $5-10k workstation into a cheap front-end plus monthly server costs. Perhaps high-end video editing is too bandwidth-heavy to be worthwhile, but I'm still not convinced either way that this is inapplicable to other digital work.
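(And a check of the "day's worth of copy time on gigabit" estimate for the 10TB figure, again ignoring protocol overhead and assuming the link stays saturated.)

```python
# How long does 10 TB take over a saturated gigabit link?
PROJECT_TB = 10
GIGABIT_BYTES_S = 1e9 / 8        # 1 Gbit/s ~= 125 MB/s, no overhead

seconds = PROJECT_TB * 1e12 / GIGABIT_BYTES_S
print(f"~{seconds / 3600:.0f} hours to push {PROJECT_TB} TB at gigabit speed")
```

That works out to roughly 22 hours, i.e. about a day of copy time under ideal conditions.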
I am glad that hardware experimentation is on the rise with Arduinos/Pis and rapid prototyping. As a current CS major I love my Arduino (still need to pick up some servos for it) for making embedded systems really easy to get into. I hope these open source projects become foundations for all sorts of things. Also, I'm perfectly OK with 99% of users only ever using computers for a few things. I just don't want to see that limiting my ability to use computers however I want :P
You don't really need to replace your PC every few years to keep up with technology anymore. The average user needs to be able to run a modern web browser and play streaming media, and computers from five years ago can do this. Many gamers are still trying to stay on the bleeding edge, but I think even their upgrade pressure has slacked a bit. It seems like we are on the edge of a big revolution in integrated graphics, which will probably be a blow to video card manufacturers, lessening the need to upgrade some components in a way similar to the decline in the need for dedicated sound cards.