NASA's Virtual Visual Environment Display (VIVED)
During the '80s and '90s, the public was gripped by VR fever. Computer scientist, writer and former Atari researcher Jaron Lanier popularized the term "virtual reality" (VR) to describe the immersion of one's body and mind in an artificial, three-dimensional space, and a variety of products aimed at connecting people with this new digital environment hit the consumer market. But it was the federally supported R&D sector that made significant investments in practical applications for the technology. NASA's Ames Research Center played host to a VR research project launched by Michael McGreevy in 1985, and within a year it was ready to show off a working prototype of its Virtual Visual Environment Display (VIVED) helmet at CES.
Key developments in simulated environments can be traced back to 1957, when cinematographer Morton Heilig invented the "Sensorama" booth and "Telesphere Mask," where video, sound, vibration and wind were used to replicate a real-world experience. In 1968, Ivan Sutherland went one step further when he built a head-mounted device using mini CRT displays to produce an immersive graphical simulation. It adjusted the user's view inside a 3D environment according to their head movements in the real world.
In the 1980s, virtual reality experienced a renaissance, as both private and government institutions made a push to advance research in the field. Films like 1983's Brainstorm brought VR into the public eye and game developers were beginning to catch on to its potential in the arcade.
NASA's launch of its VIVED prototype at CES in 1986 was aptly timed, arriving amid a surge of public interest. Built for about $2,000, the video helmet employed a wide-angle, stereoscopic display system (using VR pioneer Eric Howlett's LEEP optics) and incorporated voice control and gesture tracking via a glove-like peripheral. The closed-front visor housed two 2.7-inch medium-resolution, monochromatic LCD screens, which provided a 120-degree effective field of view for each eye. A helmet-mounted sensor tracked head motion in real time, providing full motion parallax and perspective shifts based on the user's movement. To complete the illusion, NASA incorporated surround sound, with spatially localized audio cues enhancing the feeling of a realistic 3D environment. Speech recognition rounded out the package, allowing the user to issue commands to the system in a standard conversational tone.
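The core idea behind head-tracked motion parallax is simple: each frame, the scene is re-expressed in the viewer's coordinate frame, so moving the head shifts every object's apparent position. The sketch below illustrates that principle in miniature (yaw-only rotation, world points as tuples); all names and the simplified math are illustrative assumptions, not anything from NASA's actual VIVED software.

```python
import math

def view_transform(point, head_pos, head_yaw):
    """Express a world-space point in the viewer's head-relative frame.

    Translates by the head position and rotates by the inverse of the
    head's yaw, so objects shift as the head moves (motion parallax).
    Hypothetical sketch; not the VIVED implementation.
    """
    # Translate so the head sits at the origin.
    dx = point[0] - head_pos[0]
    dz = point[2] - head_pos[2]
    # Rotate by -yaw to get head-relative coordinates (x right, -z forward).
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    x = c * dx - s * dz
    z = s * dx + c * dz
    return (x, point[1] - head_pos[1], z)

# A point two meters straight ahead stays centered while the head is still...
print(view_transform((0.0, 0.0, -2.0), (0.0, 0.0, 0.0), 0.0))
# ...and slides to the viewer's left when the head steps right: parallax.
print(view_transform((0.0, 0.0, -2.0), (0.5, 0.0, 0.0), 0.0))
```

A real system would use a full 6-degree-of-freedom pose (position plus orientation quaternion) and feed the result into a stereo projection, one offset per eye.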
One of NASA's primary applications for VIVED was in space telerobotics, giving astronauts the ability to control extender arms, cameras and even humanoid robots to perform dangerous tasks and exploration from a safe location. Pairing the display with tactile interactive devices, such as flex- and motion-sensing gloves, gave the operator enhanced control by providing human-like sensory input, as if experiencing the environment firsthand. VIVED also had applications in computer science, like a Minority Report-style interactive workspace, referred to as the Virtual Interface Environment Workstation (VIEW), where the user employed speech and gestures to interact with devices, objects and data, and could view, reposition and delete files.
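The glove-to-manipulator idea reduces to a mapping problem: normalized flex-sensor readings on each finger are scaled into joint angles for the remote robot hand. The toy sketch below shows that mapping under stated assumptions (readings normalized to 0.0-1.0, a linear scale to a 90-degree maximum bend); the function name, ranges, and linear model are hypothetical, not NASA's documented pipeline.

```python
def glove_to_joint_angles(flex_readings, max_angle_deg=90.0):
    """Map normalized flex-sensor readings to robot joint angles.

    0.0 means a straight finger, 1.0 a full bend. Hypothetical sketch
    of the glove-driven teleoperation concept, not the VIVED code.
    """
    angles = []
    for r in flex_readings:
        # Clamp noisy sensor values into the valid 0.0-1.0 range.
        r = min(max(r, 0.0), 1.0)
        # Linear scale to the joint's maximum angle.
        angles.append(r * max_angle_deg)
    return angles

# Three fingers: straight, half-bent, and an over-range (noisy) reading.
print(glove_to_joint_angles([0.0, 0.5, 1.2]))  # [0.0, 45.0, 90.0]
```

A production teleoperation loop would add per-sensor calibration, smoothing, and rate limiting before commanding the remote joints.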
Virtual reality's initial foray into the retail market was relatively short-lived compared with its longevity in the R&D sector. During the '80s, Lanier's VPL Research became a leading supplier of VR tech, with products ranging from its DataGlove input device to the EyePhone head-mounted display. In 1989, Mattel joined the party with its Power Glove controller for the Nintendo Entertainment System, and full-sized immersive rigs started popping up at malls, running the low-res Dactyl Nightmare video game, which looked like something out of a Dire Straits video. Stephen Ellis, who headed NASA's Advanced Displays and Spatial Perception Laboratory at Ames, observed that "the technology of the '80s was not mature enough," with insufficient graphics, glitchy interfaces and poorly developed tactile feedback. As people realized that the tech didn't live up to the hype, interest in and funding for the products dwindled, and the market-leading VPL Research filed for Chapter 11 protection in the early '90s.
Although public interest in VR cooled off, space was still the perfect deployment arena, and NASA's virtual reality research continued. Its 1997 Robonaut project focused on creating a humanoid robot to perform tasks in place of human astronauts, either autonomously or controlled by a virtual interface. By 2011, the project reached its second stage of development and NASA deployed Robonaut 2 for duty on the International Space Station. The technology used to interact with Robonaut has drastically improved since VIVED was developed, with interface devices like Sensics' piSight display providing up to 6 million pixels per eye (with tiled optics) along with a panoramic field of view up to 166 degrees. Even the consumer market has picked up again, with devices like the $300 Oculus Rift VR display, which offers a 512,000 pixel-per-eye resolution and 90-degree field of view.