So... telepresence and Google Maps. Yeah, those are kind of the obvious applications for deep VR goggles. The problem is that without motor control over what's going on, the VR side of things becomes a passive experience... which is so much better on the other side of a screen. The VR peeps figured this out 20 years ago. Google Maps? Yeah, we do just fine with a GPS on our dash, not on our face.

And that, ultimately, is why wearable computing languishes. Head-up displays are, as I mentioned, of limited utility. Here's the cockpit of an F-16: Here's the head-up display of an F-16: craptons of information, but only about 10% of what's readily available elsewhere in the cockpit.

Meanwhile, here's the cockpit of a Corvette C7: Here's the head-up display of said-same: Again, maybe 10% of the instrumentation available... and I'd argue that the overwhelming majority of that is extraneous info (do you really need to know how many lateral Gs you're pulling in real time? It's not like you're going to black out on a hairpin).

Then we get to the cockpit of an everyday human being. There are zero bits of information we need access to instantaneously. I can look down at my watch just about as quickly as I can look down at my speedo. GPS? Yeah, I'd love to have one in my helmet when I'm riding the motorcycle, but that's mostly because throwing an iPhone in the tank bag runs the battery down. And I sure don't need to be able to design satellites in open space. Speaking as a former CAD guy, "hand tracking" is thoroughly trounced by "a good mouse" every time.

So yeah - the laptop is kind of where you end up. There isn't much compelling need for any of this shit, so it becomes all the bells'n'whistles fyoooochur that looks cool... but isn't actually beneficial. 6-axis mice have been available for about 30 years now. I've known what to do with them for 20. Nobody uses them, though, 'cuz the boost you get in exchange for all the twiddliness really isn't worth it for the majority of operators.
Wearable computing in a nutshell, I'm afraid.