iPad as Kinect part 2: Air Touches

What can you do with an iPad 2, its onboard cameras, and its fairly able CPU? Quite a lot, it turns out. When TUAW first looked at Greg Hartstein's Air Touch project, it required a completely dark room and could only handle a single measure of distance based on lighting levels.
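To give a rough idea of why that early version needed darkness, here is a minimal sketch of a lighting-based proximity check: average the luminance of a grayscale camera frame and treat "darker" as "closer," since a hand over the lens blocks ambient light. The function name, thresholds, and approach are illustrative assumptions, not Hartstein's actual code.

```swift
import Foundation

/// Hypothetical brightness-based proximity estimate. Works only when the
/// room is dark enough that the hand is the dominant change in lighting.
func proximityLevel(grayscalePixels: [UInt8]) -> Int {
    guard !grayscalePixels.isEmpty else { return 0 }
    let total = grayscalePixels.reduce(0) { $0 + Int($1) }
    let averageLuma = Double(total) / Double(grayscalePixels.count)
    // A hand hovering over the camera lowers the average luminance.
    switch averageLuma {
    case ..<40:  return 3   // very close
    case ..<90:  return 2   // mid-range
    case ..<150: return 1   // far
    default:     return 0   // nothing detected
    }
}
```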

Things have evolved since then. Hartstein provided this video showing how a tracking element attached to his hand (you can glimpse it in the video when his hand is at an angle to the camera) simplifies the tracking and opens up a far greater range of possible interactions. Unfortunately, the YouTube version is slightly glitchy; the original video we received is not, but uploading it to YouTube introduced that odd pixelation at the beginning.
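One way a physical marker of known size can yield both distance and rotation is the standard pinhole-camera relationship: distance is roughly (real width × focal length in pixels) / apparent width in pixels, and the marker's in-plane angle gives rotation directly. The sketch below assumes the marker has already been located in each frame; the struct, constants, and function are hypothetical, not Hartstein's implementation.

```swift
import CoreGraphics

/// A marker detection for one camera frame, assumed to come from some
/// separate tracking step (e.g. color or shape matching).
struct MarkerObservation {
    let boundingBox: CGRect   // marker's bounding box in pixel coordinates
    let angle: CGFloat        // marker's in-plane rotation, in radians
}

/// Pinhole-model distance estimate from the marker's apparent width.
/// The 3 cm marker size and 600 px focal length are illustrative guesses.
func estimateDistance(of marker: MarkerObservation,
                      realWidthMeters: CGFloat = 0.03,
                      focalLengthPixels: CGFloat = 600) -> CGFloat {
    guard marker.boundingBox.width > 0 else { return .infinity }
    return realWidthMeters * focalLengthPixels / marker.boundingBox.width
}
```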

In addition to tracking distance and rotation, Hartstein has also been working on air-swipes and other motion-based gestures.
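An air-swipe could plausibly be layered on top of that same tracking by watching the marker's recent positions and looking for fast, mostly horizontal travel. This is a guess at one simple approach, not a description of Hartstein's gesture code; the window size and threshold are made up.

```swift
import CoreGraphics

enum AirSwipe { case left, right }

/// Hypothetical swipe detector: given recent marker positions (one per frame),
/// report a swipe when horizontal travel exceeds a threshold and clearly
/// dominates vertical travel over the same window.
func detectSwipe(positions: [CGPoint],
                 minimumTravel: CGFloat = 120) -> AirSwipe? {
    guard let first = positions.first, let last = positions.last else { return nil }
    let dx = last.x - first.x
    let dy = last.y - first.y
    guard abs(dx) > minimumTravel, abs(dx) > abs(dy) * 2 else { return nil }
    return dx > 0 ? .right : .left
}
```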