As promised a while back, we got to chat with Elliptic Labs here at CES, and CEO Stian Aldrin walked us through the touchless gesture technology his 15-person Norway-based company is developing as a prototype. The whole thing is based on ultrasound, it turns out -- a small speaker kicks out frequencies higher than the ear can hear, and a set of microphones listens in on the reflections, using an algorithm to calculate where your hand is as you wave it through the air. The result is a gesture-based control system for touchscreen devices, but without the actual touch.
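Elliptic Labs hasn't published how its algorithm works, but the underlying echolocation math is simple enough to sketch. Here's a minimal Python illustration of the core idea -- the round-trip timing assumption and the numbers are ours, not the company's:

```python
# Sketch of echo-based ranging: a speaker pings, a hand reflects,
# a microphone hears the reflection a moment later.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def echo_distance(delay_s: float) -> float:
    """Distance to the hand, assuming speaker and mic sit together.

    The sound travels out to the hand and back, so the one-way
    distance is half the round trip.
    """
    return SPEED_OF_SOUND * delay_s / 2.0

# A 2 ms round-trip delay puts the hand about 34 cm from the device.
print(echo_distance(0.002))  # -> 0.343
```

With several microphones hearing the same echo at slightly different delays, the same math yields several distance estimates, which can be combined to pin down the hand's position in 3D -- presumably something along those lines is what Elliptic's algorithm does many times per second.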
Aldrin told us that the system is already in use in a Norwegian hospital, where surgeons control touchscreen tablets without having to take their sanitized gloves off during surgery. Currently, the system only allows for a few simple gestures (swiping up and down, or left and right), but that's just a limitation of the demo units Elliptic Labs has created. Potentially, such a system could not only recognize the placement and speed of your hand passing by (and indeed, one of the demos in the CES booth could monitor both proximity to the screen and speed, flipping on-screen content faster if you pushed your hand by faster), but could also track multiple points of movement, enabling things like multi-touch gestures in the air.
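To give a feel for how those simple swipes plus speed sensitivity might work, here's a hypothetical Python sketch: given a run of tracked hand positions, it picks a swipe direction from the dominant axis of motion and reports the speed that a demo like Elliptic's could use to flip content faster. The sample rate and coordinates are invented for illustration:

```python
def classify_swipe(positions, dt=0.02):
    """Classify a swipe from (x, y) hand samples taken every dt seconds.

    Returns (direction, speed_m_per_s), where direction is one of
    "left", "right", "up", "down" based on the larger displacement axis.
    """
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    dx, dy = x1 - x0, y1 - y0
    elapsed = dt * (len(positions) - 1)
    speed = (dx * dx + dy * dy) ** 0.5 / elapsed
    if abs(dx) > abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "up" if dy > 0 else "down"
    return direction, speed

# Five samples moving 0.2 m rightward over 80 ms -> ("right", ~2.5 m/s)
samples = [(0.0, 0.0), (0.05, 0.0), (0.1, 0.0), (0.15, 0.0), (0.2, 0.0)]
print(classify_swipe(samples))
```

Multi-touch in the air would be the same idea run on several tracked points at once -- considerably harder for the sensing hardware, but the gesture logic scales naturally.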
You do have to be pretty close to the screen to operate the device -- rather than a big cone like a Kinect, the system monitors a sphere around itself, so you've got to have your hand enter that sphere for it to register. But Elliptic (which already plans to be back at CES with an even bigger booth next year) suggests that the system could be used for lots of things, from quick music controls to car controls, or anything else where you need to make a touch-style gesture without actually touching the screen. We've got exclusive video after the break of Aldrin demoing a dock version of the system, connected via Wi-Fi to an off-the-shelf iPad running a custom-made app.
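That sphere-versus-cone distinction boils down to a simple geometric check -- the hand registers whenever its distance from the sensor is under some radius, regardless of direction. A tiny Python sketch (the 30 cm radius is our guess, not a spec from Elliptic):

```python
def in_detection_zone(hand_xyz, radius=0.3):
    """True when the hand is inside a sphere of the given radius (meters)
    centered on the sensor -- a guessed stand-in for Elliptic's real zone."""
    x, y, z = hand_xyz
    return (x * x + y * y + z * z) ** 0.5 <= radius

print(in_detection_zone((0.1, 0.1, 0.1)))  # -> True, hand is close enough
print(in_detection_zone((0.5, 0.0, 0.0)))  # -> False, out of range
```

A Kinect-style cone, by contrast, would also have to check the hand's angle off the sensor's forward axis, which is why you can stand across the room from a Kinect but need to reach in close here.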