$200 and 90% as fast as the $700 Nvidia 1080 cards. Crossfire two together for $400 and you beat a card worth almost twice as much. AMD stock is also starting to climb in response. I'd love to swap out my SLI pair, which is starting to struggle, and may have to fork over some money in July when these cards are actually available. My hope is that AMD makes enough cards so that they don't have the problem Nvidia is having right now: lack of product.
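For the curious, the back-of-envelope math on those numbers looks like this. The prices and the "90% as fast" figure are taken at face value from above; the 2x CrossFire scaling is an idealized assumption that real benchmarks won't hit:

```python
# Rough price/performance comparison using the post's figures.
# NOTE: perfect 2x CrossFire scaling is assumed here purely for
# illustration; real-world multi-GPU scaling is well below 2x.
gtx1080_price, gtx1080_perf = 700, 1.00   # baseline card
rx480_price, rx480_perf = 200, 0.90       # "90% as fast" for $200

dual_price = 2 * rx480_price              # 400
dual_perf = 2 * rx480_perf                # 1.80 under ideal scaling

# Performance per dollar, normalized so one GTX 1080 is 1.0
value_1080 = gtx1080_perf / gtx1080_price
value_dual = dual_perf / dual_price
print(f"Two RX 480s: ${dual_price}, "
      f"{value_dual / value_1080:.1f}x the perf/$ of one GTX 1080")
```

Even if CrossFire only scales to 1.5x in practice, the perf-per-dollar still comes out well ahead of the single $700 card.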
The 8GB version of the 480X is going to retail for $240 and will be able to work with the Vive and Rift at full framerates.
There is nothing that has changed about VR in the past 30 years that will increase its penetration over what has come before. We're still assuming that people want to be completely isolated from their friends in a world without relative perspective. Oculus and all the rest are going to be the Guitar Hero of 2016. Lotsa people will buy it, lotsa people will go ooh aah, lotsa people will realize that they'd much rather play Madden from the couch where their friends can watch.
This is a problem I keep hearing about more and more frequently. Many, many people just can't afford to isolate themselves from the world around them for the long stretches that the setup and fantasy of VR demand. As I get older I find that I spend less and less time marathoning games, or playing at all without needing to get up, check my phone, respond to emails, or finish cooking. Traditional video games have never demanded that we ignore all that, simply because we can easily put down a controller and walk away. I think if VR is going to stay, it needs some real-life integration - some way to view your phone, a way to browse the web easily, maybe even a way to work within the space - instead of being a totally confined, singular gaming experience.
There is a large and growing gaming market for short-ish games (2-3 hours of gameplay) to mid-length games (12-ish hours) that are all about immersion on various levels while being first person. I think VR will head in that direction, since most of those stories are about screwing with your head anyway. Games like SOMA, Firewatch, and Alien: Isolation, for instance. Hell, I'll even throw The Chronicles of Riddick: Escape from Butcher Bay into that mix. I think these types of games will be the real drivers of VR gaming markets, not just another Wii or Guitar Hero like it is now. Those games can also be made with dual compatibility, including flat-screen support, so developers can sell two interfaces to the same game. Much like 3D movies have a 2D version, and those have lasted much longer than I expected. VR could also play off 3D movies and end up as a way to watch them without buying a ridiculous TV and player, which would extend their life. I've always wanted to pause a movie and walk around the set. You could turn your living room into a digital theater, Shakespeare style. Pause, walk around, etc. See live concerts streamed to your living room stage. Obviously I'm not buying one now; they are ridiculously expensive with little payoff at the moment, but if they do things right they could take off.
I definitely don't see 'COD:BLOPS INFINITY OPERATIONS - The Beginning' as the market for VR. I think that looking for that kind of mass-appeal with this tech is dumb and impractical. However, I believe that there is a growing niche for people like myself who want the experience that the tech is able to provide.
You believe it because you've been hyped into believing it. Speaking as a former member of the Society for Information Display I can say with no quaver in my voice that the compelling reasons for VR are no more compelling than they were in 2000, in 1990, in 1980 or 1970. VR is cool. For about half an hour. Then it's wearing, isolating and fatiguing. Always has been, always will be. DARPA pushed this shit as hard as they could back in the early '80s. There was no impediment to their progress. Yet they abandoned it by '86 because no amount of miniaturization or refresh rate addressed the fundamental problem with VR: we don't really see in stereo. We interpolate from mono, with heavily-lossy processing, because binocular vision gives us depth cues only to within what we can reach with our arms. Everything else is head position. Every VR setup you've ever seen (except one) uses binocular vision for the sum total of your stereoscopic experience. As a result, every VR setup you've ever seen forces your brain to do things it can't do. That's not going to change until we ditch the immersive glasses. Period. Full stop.
Eh. I've used a Vive. I've used a DK2 for an afternoon. It's not something I wanna do all day every day, but I can say the same thing about Overwatch or Kerbal Space Program, or basically any other game. Don't forget that for all its one-hit-wonder-ness, Guitar Hero was FUN. (To its target market)
Having played multiplayer Quake in a CAVE, I don't think VR gaming is necessarily an isolating experience, but I agree that it's not likely to become a widespread thing anytime soon. I predict another hype->inflated expectations->letdown->everyone forgets about VR for another decade cycle. The VR experience people imagine and expect is always going to be better than the one the technology can deliver.
Yeah, LANL built one of those back in 2000. Nobody used it, it was a pain in the ass, and they decommissioned it in 2008. Think that sucker was built out of command & control multipanels. Think it cost in the millions. Think the novelty is exciting, but think the utility is marginal at best.
Some friends and I built one with displays dumpster-dived from a theme park and some old SGI boxen dumpster-dived from the engineering department. LANL's was probably much nicer than ours, but they don't have to be that expensive to build; bored college students can figure out how to do it for almost free. You just need to dedicate a lot of space to something you'll rarely have a use for.
Yeah, I'm not really into VR at all. I really don't want to spend another $400 - $800 to upgrade my computer just to use an $800 pair of glasses without wanting to throw up. Plus, I'm lazy when it comes to peripherals, so I'd probably use them once, go "Wow, that's a major hassle to play like 2 games," and never really touch them again. I haven't seen a VR game I'm interested in, nor do I think any game I currently play would be helped by wearing a giant pair of glasses that fills my whole vision. Yet it's supposed to be the next awesome thing that's going to revolutionize gaming. Really, I'm upgrading in hopes of either getting a 1440p monitor or a 1080p/120Hz display.
The reviews are in, and the consensus seems to be that the new AMD cards are underwhelming. They seem equivalent to last generation's mid-grade Nvidia 970s. They are cheaper than a 970 and they use less power. Nice if you are building a system on a budget, but not stunning. Seems like there might be a lot more action from AMD and Nvidia in the near future. Hopefully the AMD offerings will get a bit juicier.
People are idiots. Fanboys are morons. And the jack-holes on Reddit and OCForums are the kings of the shitpile. From the very first announcement of these cards, AMD told EVERYONE that these were mid-range cards with a bit of overclock-ability. One of the first non-PR-flack talking points was that the RX480 8GB card was more in the 980 performance range, but at 1/3 the price. The goal of the 480 series is to grab the 85% of the market that's normal people, and to put out a high-end card sometime later.

EDIT: I'm one of the morons hyping the hype train. Looked at the OP and noticed I said that a single card was 1080 level; I meant to say TWO cards are 1080-level performance, at less power and 2/3 the price. So I'm part of the failure to manage expectations. I hang my head in shame.

The whole point is to lower the cost of entry to VR and PC gaming. The announcement press conference stressed "mid-range" graphics at the best price/performance (performance per watt, in the announcement) on the market. The hype train got wound up, way up, and I think there were too many people expecting a $700 card's performance for $200. The benchmarks look about like what I was expecting. Two cards together will push out about the performance of a GTX 1080, but at 2/3 the price. One card is about twice the output of my SLI system, which I have not upgraded in three years, and I was looking to try out Crossfire versus SLI anyway. As soon as the cards are in stock on Newegg, I am going to pull the trigger and get two of them.

The article from the launch PR has a great pull quote on this: “The Radeon™ RX series efficiency is driven by major architectural improvements and the industry’s first 14nm FinFET process technology for discrete GPUs, and could mark an important inflection point in the growth of virtual reality,” said Patrick Moorhead, principal analyst, Moor Insights & Strategy.
“By lowering the cost of ownership and increasing the VR TAM, Radeon RX Series has the potential to propel VR-ready systems into retail in higher volumes, drive new levels of VR content investment, and even drive down the cost of VR headsets.” via AMD
They seem like nice cards if you are on a budget. I'd go one Nvidia before I'd go two AMD. Sounds like AMD has some more aggressive cards coming in a few months, and Nvidia is going to announce some lower-grade cards this month or next that might pull the shine off AMD's offerings.
Yea, sucks right now to have money to burn. I sort of need a new graphics card, as I am now getting dips into the 35-40 FPS range in my games, but I really need to wait and see if there is going to be a price cut on the 1080. If they cut the 1080 to $500, that is the way I will go. Unless I get 100% fed up with Windows and go full Linux; then I have to go AMD, due to the drivers working much better on Linux and AMD's commitment to FOSS. I still want to pick up two of the cards and at least play with them and tinker with overclocking. I have a group of friends who can always use my hand-me-downs and will pay me half what I shelled out for the cards as well. I'm giving everything a week. If the cards are in stock by the end of next week, I'll post a report on my findings here for you all.
http://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480 A bit about the power draw. We'll know in a week or two if it's really a motherboard-melting problem. Skip to the end of the article to avoid all the technical blather.
Two things. Overclocking benefits were minimal, and you're going to need some decent cooling to get your 1-4% improvement; sounds like the software for overclocking is better than ever. Under certain conditions (overclocking and some other loads), the card pulls more power through the motherboard slot than the 75W the PCIe spec rates it for. Don't buy one until this is cleared up.
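For context, here's the rough power budget involved. The 75W slot limit is the PCIe spec figure; the draw numbers below are illustrative assumptions in line with what the linked review reported, not exact measurements:

```python
# Rough PCIe power-budget check for a single RX 480.
# The 75 W slot limit comes from the PCIe spec; the draw figures
# below are illustrative assumptions, not measured values.
SLOT_LIMIT_W = 75        # max a PCIe x16 slot is specced to deliver
SIX_PIN_LIMIT_W = 75     # max for the card's 6-pin aux connector

board_power_w = 165      # assumed total draw under a heavy overclock
slot_draw_w = 90         # assumed portion pulled through the slot
aux_draw_w = board_power_w - slot_draw_w

over_spec = slot_draw_w - SLOT_LIMIT_W
if over_spec > 0:
    print(f"Slot draw exceeds the PCIe spec by {over_spec} W")
else:
    print("Slot draw is within spec")
```

The worry isn't the total wattage, it's that the excess comes through motherboard traces sized for the 75W spec rather than through the 6-pin connector, which has headroom to spare.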
Benchmarks on cards that aren't even out, run against the last generation of Nvidia cards? People have had the new Nvidia cards for a few weeks to benchmark, and they are blowing the last gen of Nvidia cards out of the water. I'd wait a few months and let people poke at all the new cards for a bit, see what's really what. It's generally better to buy the best single card you can afford now and then SLI/Crossfire later, when prices have come down and you need an upgrade, than to replace both cards later. Not everyone feels that way. I hope the best for AMD; the better they do, the better off we all are. Nvidia and Intel could both do with some more competition.
That's how I feel but there is a hardcore slice of the gaming world that feels differently. It doesn't seem to be a bad upgrade solution if your single card isn't cutting it anymore and a matching card has become dirt cheap because it's a few years old but I've never gone down that path.
The majority of extreme gaming types have more dollars than sense. That said, I'm sitting on two Titan Xs, but I use them primarily for crunching, because they're much cheaper than cards marketed for that kind of work. For that reason I like the computer ricers.
That's nice to hear. I was going to join the dark side with a 1080, but I might just go ahead and get the 480x (maybe two, if I'm feeling cheeky). I'm just trying to max out 1080p, with ideas of moving to 1440p or 1080p/120Hz, and I think these might do it. I have an i5 3570k, which I have been told would not really bottleneck the new line of GPUs, so I might have a really cheap way of getting good frame rates.
Dude I bought a GTX980 like six months ago for like $350. A replacement card for my six-year-old Mac Pro was $450. Frickin' PC stuff is always cheap. Y'all don't know how good you have it. Especially when your year-old Mac Mini has a corrupt hard drive, and Apple has not only slapped a proprietary connector on it, they've slapped proprietary firmware, so you can't even swap the bitch out without bringing it to the genius bar. Apple and I are rapidly parting company. Thinking of building an XPenology box to replace it. Could probably just grab some piece of shit off Woot and install, but I kinda want a decent video card on it so it'll transcode Plex with alacrity (the Mac Mini doesn't transcode worth a shit either).
About every 4-5 years I do a ton of research and drop $2-3K on a game rig. I have an i7 with 32GB of RAM, built in 2012 according to Newegg, that is laughing at everything I throw at it. I'm starting to get less than 60 FPS at 1920x1200, so it is time to slap in a video card or two and another 10TB of storage for the astronomy camera stuff, and I'll be good for at least two more years. Mac hardware is expensive by design, to prevent people from messing with it. One of the reasons Mac machines are good is that they all use the same hardware, making driver support easier, as well as tech support. But as a guy sitting here with both sides off the game rig, playing with the cable runs to make everything look nicer and easier to access for when I get the new video card(s) next month, I'd go bonkers in a closed ecosystem like the Mac's.
There was a good one for a while after the switch to Intel. Mac Pros used nVidia and ATI video cards. There were PCIe soundcards you could throw in there. But yeah. There was about a 6 year period where you could legitimately run Mac hardware as if it were PC hardware, and that era is done.