
How VR Will Ultimately Be As Comfortable As Your Home Theater

Virtual reality holds limitless potential for letting people co-experience immersive, simulated worlds. In static scenes where the operator sits in a chair, the latest generation of hardware supports VR experiences that are quite comfortable. At the opposite end of the spectrum is the VR fantasy of sci-fi, where an operator dons a suit, sits in a chair, and is immersed in a highly dynamic VR scene complete with running, jumping, and falling. Experiences like these still cause discomfort for some VR consumers, including nausea and visual fatigue, which has prompted game and experience developers to invent creative workarounds that reduce apparent user acceleration in dynamic VR.

There are two primary reasons why VR users may feel symptoms of motion sickness. First, high latency can induce sickness because the visual output arrives later than the user's brain expects. Humans can detect very small visual delays, and even slight lag compounds into a disconnected, unconvincing experience. Second, users tend to feel discomfort when there is a discrepancy between what the visual and vestibular systems are telling the brain, a phenomenon known as sensory conflict. The conflict occurs when VR provides the visual sensation of movement even though the user's physical body is not moving at all. When the brain receives only the visual half of that movement, the result is discomfort that grows with the frequency and magnitude of the apparent acceleration.
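
To make the latency point concrete, here is a small back-of-the-envelope sketch of how far the image trails the head for a given delay. The rotation speed and latency figures are illustrative assumptions for the sake of the example, not measurements from this article.

```cpp
// Rough estimate of how far the rendered view "lags behind" the head for a
// given motion-to-photon latency. The numbers (a 100 deg/s head turn, 20 ms
// and 50 ms latencies) are assumptions chosen only to illustrate the idea.
#include <cstdio>

int main() {
    const double headTurnDegPerSec = 100.0;     // moderate head rotation speed
    const double latenciesMs[] = {20.0, 50.0};  // candidate motion-to-photon latencies

    for (double latencyMs : latenciesMs) {
        // Angular error = rotation speed * delay: the view trails the true
        // head direction by this many degrees until the next frame lands.
        double lagDegrees = headTurnDegPerSec * (latencyMs / 1000.0);
        std::printf("Latency %.0f ms -> view trails head by ~%.1f degrees\n",
                    latencyMs, lagDegrees);
    }
    return 0;
}
```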

Developers have made considerable progress on the hardware and software side to resolve high motion-to-photon latency. The high-resolution OLED displays and state-of-the-art hardware components developed by Oculus, HTC, and Sony can now hit an ideal latency at or below 20 milliseconds, which is generally imperceptible to most users. At 20 milliseconds, most VR users sitting in a chair and observing a static (non-moving camera) scene are comfortable; anything above 20 milliseconds tends to be less comfortable and less immersive. Another way to compensate for delay is predictive tracking, which renders the scene for where the system anticipates the user will be looking by the time the frame reaches the display. Over time, these components will continue to improve, increasing comfort and immersion for all users.
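
As a rough illustration of how predictive tracking works, the sketch below extrapolates head orientation forward by the expected motion-to-photon delay under a constant-angular-velocity assumption. Orientation is reduced to a single yaw angle for clarity; real runtimes work with quaternions and filtered IMU data, and none of the names here come from an actual SDK.

```cpp
// A minimal sketch of predictive tracking, assuming the head keeps turning at
// its currently measured rate. All names and numbers are illustrative.
struct HeadSample {
    double yawDegrees;       // latest measured head yaw
    double yawRateDegPerSec; // measured angular velocity from the gyro
};

// Extrapolate where the head will be pointing when the frame actually reaches
// the display, so the scene is rendered for that future pose rather than the
// already-stale one that was sampled.
double PredictYaw(const HeadSample& sample, double motionToPhotonSeconds) {
    return sample.yawDegrees + sample.yawRateDegPerSec * motionToPhotonSeconds;
}

// Example: head turning at 90 deg/s with ~18 ms until photons hit the eye.
// PredictYaw({30.0, 90.0}, 0.018) renders for roughly 31.6 degrees of yaw
// instead of the outdated 30 degrees.
```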

Hardware and software must work together to achieve a latency of 20 milliseconds or less. For starters, the rendering performance of the VR game or application must be maintained at the refresh rate of the head-mounted display (HMD). This refresh rate is 90Hz for the Oculus Rift and HTC Vive, whereas Sony's PlayStation VR can render at 60Hz, 90Hz, or 120Hz. The app must also be architected cleanly so there is minimal delay between user input (head tracking) and rendering. Another important layer between the app and the hardware is the operating system and its drivers. NVIDIA, AMD, and Microsoft have worked on reducing graphics driver latency, making it possible to reach sub-20-millisecond latency even when other activity taps the CPU. When the entire stack is working together – HMD tracking hardware and software, USB driver (and operating system USB stack), game update and rendering code, graphics API, graphics driver, operating system scheduler, graphics card, and HMD display – 20 milliseconds can be achieved.
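
That paragraph describes an ordering more than an API, so the sketch below shows one plausible per-frame flow for a 90Hz headset: update the simulation, sample the head pose as late as possible, render, and submit before the next vsync. Every function here is a stand-in, not a call from any real VR SDK.

```cpp
// A simplified sketch of the per-frame ordering described above, assuming a
// 90 Hz HMD (roughly an 11.1 ms budget per frame). All functions are stubs.
#include <cstdio>

struct HeadPose { double yaw = 0.0, pitch = 0.0, roll = 0.0; };

void UpdateSimulation()            { /* input, physics, animation        */ }
HeadPose SampleLatestHeadPose()    { return HeadPose{}; /* freshest pose */ }
void RenderStereo(const HeadPose&) { /* draw left and right eye views    */ }
void SubmitToCompositor()          { /* hand the frame to the HMD runtime */ }

int main() {
    const double frameBudgetMs = 1000.0 / 90.0;  // ~11.1 ms at 90 Hz
    std::printf("Per-frame budget at 90 Hz: %.1f ms\n", frameBudgetMs);

    // One iteration of the loop a VR title runs every frame.
    UpdateSimulation();                      // game logic first
    HeadPose pose = SampleLatestHeadPose();  // sample tracking as late as possible
    RenderStereo(pose);                      // render for that pose
    SubmitToCompositor();                    // submit before the vsync deadline
    return 0;
}
```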

Sensory conflict arises from induced motion, which is typical of seated VR experiences. Seated experiences that are driven by a game controller and involve linear or angular movement are more likely to cause discomfort, because the visuals show acceleration that the user's vestibular system (inner ear) does not feel. Room-scale experiences solve this problem by letting the user move freely through their physical environment. One example is Valve's Lighthouse technology, used with the HTC Vive to accurately track users' movements in real life and mirror them visually in real time, keeping the visual and vestibular systems in sync. More recently, entertainment technology company vMocion introduced a patented electrode system designed to eliminate VR sickness by stimulating the user's inner ear into sensing the physical motion the visuals imply. Over time, most individuals also adapt to virtual environments with repeated exposure: much as sailors get their sea legs after multiple voyages, VR operators acclimate to the system with regular use.
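
A minimal sketch makes the room-scale contrast concrete. In room-scale tracking the virtual camera simply follows the tracked headset one-to-one, so every visual acceleration has a matching vestibular one; with stick-driven locomotion the camera accelerates while the body stays still. Positions are reduced to 2D points here, and nothing below comes from a real tracking API.

```cpp
// Illustrative comparison of room-scale movement and stick locomotion,
// assuming poses are simple 2D positions.
struct Vec2 { double x = 0.0, y = 0.0; };

// Room-scale: the virtual camera simply follows the tracked headset, so the
// eyes and the inner ear report the same motion.
Vec2 RoomScaleCamera(const Vec2& playAreaOrigin, const Vec2& trackedHeadOffset) {
    return { playAreaOrigin.x + trackedHeadOffset.x,
             playAreaOrigin.y + trackedHeadOffset.y };
}

// Stick-driven smooth locomotion: the camera accelerates even though the body
// does not, which is exactly the visual/vestibular mismatch described above.
Vec2 StickLocomotionCamera(Vec2 camera, const Vec2& stickDirection,
                           double speedMetersPerSec, double dtSeconds) {
    camera.x += stickDirection.x * speedMetersPerSec * dtSeconds;
    camera.y += stickDirection.y * speedMetersPerSec * dtSeconds;
    return camera;
}
```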

Software developers and content creators have been making significant headway in preventing sensory conflict. Oculus has published a best practices guide that advises developers on ways to alleviate symptoms, including minimizing the frequency of camera accelerations and allowing the player to teleport from one place to another. Games such as Damaged Core, World of Comenius, and Blink VR use this teleportation mechanic to great effect. Many other titles, like InMind VR, use a fixed visual reference object, such as a cockpit or dashboard, to keep users grounded in the experience. Researchers at Purdue University found that adding a "virtual nose" as a point of reference allowed experiment participants to stay in a simulation longer without feeling adverse effects. Immersive social VR experiences also hold promise for sensitive users: unlike action games that involve jumping and falling (high acceleration), experiences built around walking, chatting, and interacting with the environment tend to involve much lower accelerations.
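
The teleportation mechanic itself is simple to sketch: rather than smoothly accelerating the camera toward the destination, the player is moved in a single step, so no intermediate motion is shown that the inner ear never feels. The snippet below is a generic illustration of that idea, not a description of how any of the titles above implement it.

```cpp
// A hedged sketch of teleport locomotion, assuming positions are 2D points.
struct Vec2 { double x = 0.0, y = 0.0; };

struct Player { Vec2 position; };

void Teleport(Player& player, const Vec2& target) {
    // A shipping title typically fades the view out, moves the player, and
    // fades back in over a frame or two. The key point is that there is no
    // intermediate camera motion, so the user never sees acceleration that
    // the inner ear cannot feel.
    player.position = target;
}
```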

We are well along the path toward perfect VR, and billions of dollars are riding on its success – at least $4 billion since 2010. Hardware and software developers have solved most of the comfort issues that plagued VR in the '90s. As the hardware is perfected, market forces will identify VR's future killer apps, and the first wave of them will be comfortable for the average consumer over long sessions.