http://store.steampowered.com/universe/ Look at them go! Oculus or Valve...? Valve's VR looks quite impressive, not at all sitting focused like the Rift.
From what I've heard from someone who's heard from people (so take this as completely unreliable information), the Valve VR is apparently waaay less of a burden to wear and a generally better experience than the Oculus DK2. I can't speak for -either- mind, but I was getting a fair bit of neck strain after 30 minutes wearing an Oculus DK1.
Interesting. Never really had trouble with the Rift, but then again I haven't ever worn it for more than 15 minutes.
TRULY A MARVEL IN HUMAN ENGINEERING! But yeah, every other player has an offering. Facebook: Oculus Rift. Microsoft: that hologram thing. (Me likes) Valve: SteamVR. Google: Google Glass (?)
Google Glass is augmented reality. That's mapping virtual data onto reality. It's different. AR gives you a HUD, VR gives you a world.
That WILL happen, mind. And everyone will praise it as being the true pioneer of Virtual Reality. It'll cost $700 and be completely closed off as a system
Honestly, the Steam controller is the most interesting part of this whole thing for me. I can't wait to pick one of those up.
I'm not really sure I see the point of the Steam Machine. I mean, I get they'll be more powerful than current consoles, but for the same price you can probably get a PC with better parts. Not to mention that controller looks fugly. At first I thought it would be for splitscreen co-op, but 99% of PC games have abandoned that completely, so yeah, I have no idea why it exists. Though I am interested in the VR thing, looks way more comfortable than Oculus.
Not much other than the answer to console gamers' "hey, what about us?" Valve: "You absolutely NEED to distinguish between console and PC??? Why? Gah! You know what? Here you go."
Google Glass was billed as AR (Augmented Reality), but really it's only a HUD (Heads-Up Display), which is a distinction that often gets lost. AR is supposed to interact with the world, to seemingly exist in it; a HUD just sits on top. In game terms, your health display is usually the HUD, while an outline around your selection or a health bar that tracks above something would be AR. Google Glass simply didn't have the processing power or display tech to pull off AR.

There are a number of examples of AR out there already, most commonly using a physical card with a pattern on it; pointing a camera at it causes something to show in "the world" on the display. Many of Sony's EyeToy games have been this style. One of the original demos for the 3DS did this as well. A non-game example is Word Lens, recently bought by Google. This is the original video, and is an example of AR: This is a later version for Google Glass, which retains some of its AR features, but ends up more HUD-like.

The hard part is getting the AR actually overlaying what you're seeing, and not appearing on a separate display. So far only HoloLens has shown that kind of functionality, though there are others who say they're working on it or have solved it.

edit: Also, HoloLens is in no way holographic; the use of the term is pure marketing. There is research that has created true holograms in real time, but that's not what HoloLens does. The display in HoloLens isn't much different, in effect, from putting a display above you and reflecting it off some glass. The actual tech is more interesting than that, using waveguides, but the result is the same. The big things HoloLens has going for it are using a Kinect 2.0-style camera to scan the world in front of you, and getting the latency from tracked location to display down very low.
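The HUD-versus-AR distinction above boils down to which coordinate space an element lives in. Here's a minimal toy sketch (not any real AR SDK; the simple no-rotation pinhole camera is an assumption for illustration): a HUD element is pinned to screen pixels, while an AR element is a world-space anchor projected through the camera, so it shifts when the camera moves.

```python
# Toy sketch: HUD vs AR. Not based on any real SDK; the pinhole
# camera here assumes the camera looks straight down +z, no rotation.

def hud_position(screen_w, screen_h):
    """A HUD element lives in screen space: it is drawn at the same
    pixel no matter where the camera looks (e.g. a health bar)."""
    return (20, screen_h - 20)

def ar_position(world_point, cam_pos, focal_px, screen_w, screen_h):
    """An AR element lives in world space: project its 3D anchor
    through a pinhole camera so it moves with the world."""
    x = world_point[0] - cam_pos[0]
    y = world_point[1] - cam_pos[1]
    z = world_point[2] - cam_pos[2]
    if z <= 0:
        return None  # behind the camera: an AR label just disappears
    u = screen_w / 2 + focal_px * x / z
    v = screen_h / 2 - focal_px * y / z
    return (u, v)

# The HUD element never moves...
print(hud_position(640, 480))                               # (20, 460)
# ...but the AR label shifts on screen as the camera moves sideways:
print(ar_position((0, 0, 4), (0, 0, 0), 500, 640, 480))     # (320.0, 240.0)
print(ar_position((0, 0, 4), (1, 0, 0), 500, 640, 480))     # (195.0, 240.0)
```

Glass only ever did the first function; the marker-card demos and HoloLens are doing (a far more sophisticated version of) the second.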
So then the Microsoft hologram thing would be classified as AR, while Google Glass is just a silly HUD? And of course we know it's not _really_ holograms, but whatever, they're still pretty fun looking. x3
Google Glass isn't really AR, just a HUD on your eye; it doesn't affect reality at all, doesn't really add anything. The HoloLens actually adds things to reality, changing it in a way.
Yes, HoloLens is AR. It seems to solve, or at least deal with, one of the three remaining issues with AR, which is world scanning / depth sensing. It only solves it part of the way, though, as the Kinect 2.0 (an IR time-of-flight camera) can only see about 20 feet or so. Anything beyond that is unknown, and the range is even shorter in sunlight. Google's Project Tango is trying to solve the same issue, but with Kinect 1.0-style tech (structured-light depth, which is quite different but cheaper). One of the companies Oculus just bought is doing depth sensing that works over much longer distances using structure from motion. The future might be some fusion of the techniques, as time of flight can be much more accurate but structure from motion has near-infinite range.

The other two issues remaining for AR are depth of field and occlusion. Depth of field has been somewhat solved with the various light-field-style displays, but to be sharp those require another order of magnitude of resolution increase over the basic stereoscopic VR we have right now. There are also a couple of displays that solve the issue by actually existing in real-world 3D space, usually by projecting onto a very fast spinning surface, but that's not a solution for AR/VR if displaying something requires a box the size of the thing being displayed.

The last issue is occlusion: blocking out the real world where the AR is being displayed. This is where VR might actually end up overtaking AR. It seems like a simple thing, just mask it out with an old LCD! But it's much more complicated than that because of that pesky depth-of-field issue above. A light field VR display with a light field camera could solve the issue by basically being a pass-through camera that gets masked, rather than trying to mask the real world. This is pure conjecture, but those two issues might be what Magic Leap has solved.
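The time-of-flight principle mentioned above is simple enough to sketch: the sensor times a light pulse out to a surface and back, and halves the round-trip path. The numbers below are illustrative only (not actual HoloLens or Kinect specs); they just show the nanosecond-scale timing precision the sensor needs at the ~20-foot range described.

```python
# Sketch of the time-of-flight depth principle (illustrative numbers,
# not real Kinect 2.0 / HoloLens specifications).

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds):
    """The pulse travels out to the surface and back, so the
    distance is half the total path light covered."""
    return C * round_trip_seconds / 2

# A surface ~6 m away (roughly the ~20 ft limit mentioned above)
# returns the pulse in about 40 nanoseconds:
t = 2 * 6.0 / C
print(f"{t * 1e9:.1f} ns round trip -> {tof_distance(t):.1f} m")
```

Resolving centimetre-scale depth means resolving sub-nanosecond timing differences, which is part of why these sensors are accurate but short-range, and why ambient IR from sunlight shortens the usable range further.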
edit: Some links.

Light field displays:
http://www.holografika.com/ - One of the first; they've been around for over a decade, before "light field" was even a term in common use.
https://research.nvidia.com/publication/near-eye-light-field-displays - NVidia's light field display tech. The main researchers on this are at Oculus now, afaik. It does a decent job of explaining what light fields are, too.

Depth sensing:
http://13thlab.com/ - The company Oculus bought that does structure-from-motion depth. Time-of-flight and structured-light depth you can research on your own.
HoloLens makes things show up in the world as a 3D environment that you can interact with, using a movement-recognition system similar to Kinect. Google Glass is nowhere near that? O . o