If you think using the Leap Motion controller for playing air guitar and typing without a keyboard was cool, try using it to control a NASA rover. Victor Luo and Jeff Norris from NASA's Jet Propulsion Lab got on stage at the Game Developers Conference here in San Francisco to do just that with the ATHLETE (All-Terrain Hex-Limbed Extra-Terrestrial Explorer), which was located 383 miles away in Pasadena. As Luo waved his hand over the sensor, the robot moved in kind, reacting to the subtle movements of his fingers and wrists, wowing the crowd that watched it over a projected Google+ Hangout.
We spoke with Luo and Norris after the panel to gain further insight into the project. As Luo explains, one of JPL's main goals is to build tools to control robots needed for space exploration. Seeing as the gaming industry is already rife with user-friendly controllers ripe for the plucking, it made sense to harness them for the job. "We're very used to the bleeding edge," he said. "From the Kinect to the PlayStation Move, they represent major investments into usability." Hit the jump for our impressions of the simulation software, a look at JPL's grander goal and for video clips of the demo and panel itself.
In the case of ATHLETE, using Leap Motion was an easy decision. Designed to be part of a lunar / Martian exploration system and now slated for a potential asteroid mission, the massive 12-foot-tall robot has half a dozen limbs, each with six degrees of freedom, which lend themselves naturally to gesture-based controls. The crew had already built a hangar along with a series of trusses and pulleys to position ATHLETE on an asteroid test bed in order to simulate low gravity. They then mapped the physical space onto Unity-based software, which was configured for use with a variety of controllers, including the Leap.
We had a chance to try out the software ourselves, though without a robot connected at the other end. This editor hovered a hand over the sensor cautiously, and sure enough, the simulation responded like a giant claw. With a hand balled up in a fist and just one finger extended, we watched as the bot lifted a single limb as well. We were surprised by how responsive it was, though it wasn't quite as precise as we would have liked in our incredibly brief demo. Of course, you can't tilt it too much and the legs can't bend backwards, but that's because the rover isn't supposed to do that either. It was a little unnerving, thinking that this flailing of fingers could translate to the movement of a robot in space.
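To give a rough sense of how a gesture might translate into a limb command, here's a minimal sketch. It assumes a simplified hand-tracking frame and invented command names; it does not reproduce the Leap Motion SDK or JPL's actual Unity-based software, only the behavior described above (an open hand drives the whole body, a fist with one finger extended raises a single limb).

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    """Hypothetical, simplified hand-tracking frame (not the real Leap API)."""
    palm_height_mm: float          # palm height above the sensor
    extended_fingers: list[int]    # indices (0-4) of fingers held extended

def limb_command(frame: HandFrame, num_limbs: int = 6) -> dict:
    """Map a hand pose to a rover command, echoing the demo's behavior."""
    if len(frame.extended_fingers) == 1:
        # A fist with one finger out selects one of the six limbs.
        limb = frame.extended_fingers[0] % num_limbs
        return {"action": "lift_limb", "limb": limb,
                "height": frame.palm_height_mm}
    # An open hand moves the whole body, like a giant claw.
    return {"action": "move_body", "height": frame.palm_height_mm}
```

In practice the real system would track far more state (palm orientation, wrist angle, per-joint limits) and enforce the constraints mentioned above, such as refusing poses the rover physically can't reach.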
The whole thing could easily be seen as a gimmick, but NASA doesn't think it is. "When we take ATHLETE to an asteroid, we have to make it fly," said Norris. "We need to move it as if it were zero gravity... the demos we've seen are actually quite amazing. We can see this enormous rover lazily falling and bouncing off things, as if it were a cloud."
Luo continued, saying that this is just a small subset of what JPL is trying to do. From NASA's first-ever Xbox Live game, Mars Rover Landing, to its use of consumer-grade hardware, it's clear the agency wants to get everyday citizens excited about space travel again. With gaming-inspired projects like this one, that goal certainly seems within reach. To get a glimpse of the software in action with zSpace and Leap Motion sensors, have a look at the NASA-provided video above. For the GDC panel, check out the audience-captured clip below. If you have any additional questions or comments, feel free to contact Luo and Norris via Twitter; their handles are @victorocks and @jeffreynorris, respectively.