VR and AR may be the next big thing in immersive experiences, but so far, their user interfaces have been anything but intuitive. Conventionally, head-mounted displays have operated under the assumption that their users are owls: eyes locked in their skulls, facing forward, requiring them to use their noses as VR cursors. Tobii is working to change that by integrating eye tracking into the next generation of head-mounted displays.
Specifically, Tobii is partnering with Qualcomm to incorporate its EyeCore tracking capabilities into HMDs running Qualcomm's Snapdragon 845 mobile VR platform. This promises to dramatically improve how we interact with the content of, and the other users within, our VR and AR experiences.
For one, the EyeCore system offers foveated rendering: it renders whatever you're looking at in high definition, while the rest of the field of view, which your eyes aren't focused on, is rendered at a lower resolution. This reduces the load on the GPU, improves battery performance and cuts the amount of heat the system as a whole produces. Additionally, the EyeCore system eliminates the need for users to manually tune the HMD's interpupillary distance (i.e., how far apart your eyes are), automatically calibrating the headset's lenses to whoever is wearing it.
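The core idea of foveated rendering can be sketched in a few lines: pick a per-tile resolution scale based on how far the tile sits from the current gaze point. This is a minimal illustration, not Tobii's or Qualcomm's actual pipeline; the function name, thresholds and falloff curve are all assumptions.

```python
def resolution_scale(tile_center, gaze_point, fovea_radius=0.15, falloff=0.45):
    """Return the fraction of full resolution to render a screen tile at.

    Coordinates are normalized screen positions in [0, 1]; the radii are
    illustrative values, not real tuning from any shipping headset.
    """
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= fovea_radius:   # inside the fovea: render at full detail
        return 1.0
    if dist >= falloff:        # far periphery: cheapest render pass
        return 0.25
    # blend linearly between full detail and the peripheral minimum
    t = (dist - fovea_radius) / (falloff - fovea_radius)
    return 1.0 - t * 0.75
```

A tile right under the gaze point gets a scale of 1.0, while a tile in the far corner of the view drops to 0.25, which is where the GPU, battery and thermal savings come from.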
The EyeCore system also offers a number of other advantages over conventional HMD interfaces. In online social interactions, for example, it can make avatars appear more lifelike. Instead of the standard thousand-yard, deer-in-the-headlights stare that VR avatars have, you'll now be able to make eye contact with whomever you're interacting with -- or at least give them some wicked side-eye. In gaming situations, Tobii's system can improve hand-eye coordination, since it can infer the user's intentions based on what they're looking at rather than just where their head is pointed. During a demo on Wednesday, in which I was tasked with throwing rocks at virtual bottles, the ability to look at my target with my eyes rather than my nose dramatically improved my accuracy and made the entire process feel far more natural.
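That gaze-based intent inference can be approximated as a simple nearest-to-gaze test: among the candidate targets, pick whichever one lies at the smallest angle from the gaze direction. This is a toy sketch of the general technique, not Tobii's actual algorithm; the names and the 2D direction math are illustrative.

```python
import math

def infer_target(gaze_dir, targets):
    """Pick the intended target as the one closest to the gaze direction.

    gaze_dir and each target direction are unit 2D vectors (x, y);
    targets is a list of (name, direction) pairs. Hypothetical interface.
    """
    def angle_to(direction):
        dot = gaze_dir[0] * direction[0] + gaze_dir[1] * direction[1]
        return math.acos(max(-1.0, min(1.0, dot)))  # clamp for float safety
    return min(targets, key=lambda pair: angle_to(pair[1]))[0]
```

In the bottle-throwing demo described above, this is roughly why aiming with the eyes works: the system resolves the ambiguous throw toward whichever bottle the gaze ray favors, rather than whichever one the head happens to face.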
Even in more mundane applications like navigating VR menus, eye tracking can vastly improve the user experience. Conventional VR menus work a lot like the PC paradigm: first you look at what you want to select, then you use your hands to guide the mouse or controller cursor atop the item before clicking on it. With eye tracking, the experience is far more intuitive -- more like modern mobile UIs. You just look at the menu item you want and click the controller button. It doesn't sound like a big deal, but it actually reduces the number of clicks needed to fire up your VR Netflix offering by a third.
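The gaze-plus-click interaction boils down to hit-testing the gaze point against menu item bounds at the moment of the click, skipping the cursor-steering step entirely. The sketch below assumes a simple rectangle-based menu layout; the `MenuItem` structure and coordinate convention are made up for illustration, not drawn from Tobii's or Qualcomm's SDKs.

```python
class MenuItem:
    """A rectangular menu entry in normalized screen coordinates."""

    def __init__(self, name, x, y, w, h):
        self.name = name
        self.rect = (x, y, w, h)

    def contains(self, point):
        px, py = point
        x, y, w, h = self.rect
        return x <= px <= x + w and y <= py <= y + h

def on_click(gaze_point, menu):
    """One click activates whichever item the gaze falls on, or None."""
    for item in menu:
        if item.contains(gaze_point):
            return item.name
    return None
```

With a pointer-based menu the same activation takes two user actions (steer, then click); here the look is free, which is where the reduction in clicks comes from.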
Overall, I was very impressed by how much easier VR applications were to use with eye tracking enabled. It also reduced the strain on my neck (since I didn't have to whip my head around to look at everything outside my direct line of sight), lessened the effects of the HMD's weight and generally made the VR experience feel more natural. I can't wait to see what other applications VR and AR developers work this capability into with the next generation of HMDs.
Click here to catch up on the latest news from GDC 2018!