Speaking of applications, PrimeSense wasn't able to talk specifics about partnerships (aside from that CyberLink integration announced at CES), but they did say that they're targeting a few areas in particular: home control, boardrooms, presenters and gaming -- and that's just to name a few. In the demo hosted up just past the break, you'll notice that the solution recognizes extremely subtle hand movements, and once it locks on to a user, it won't be bothered by hand motions from other members of the party until that user makes the gesture to renounce control; from there, another person can gain control by simply pushing their arm in the direction of the sensor. We even walked in front of the sensor while a user was controlling the screen, and it never lost connection or had any issues with our interference. The reference box also had a pair of speakers (up to six are supported), and we were told that developers could customize software to recognize sound inputs as well as body movements if they wanted.
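The lock-on and handoff behavior described above amounts to a simple state machine: one user holds control, everyone else's gestures are ignored, and an explicit release frees the sensor for the next person who pushes toward it. Here's a minimal sketch of that logic in Python -- the gesture names (`"push"`, `"release"`) and the class itself are our own assumptions for illustration, not PrimeSense's actual SDK.

```python
# Hypothetical sketch of the lock-on / handoff behavior described above.
# Gesture names and this API are assumptions, not PrimeSense's real SDK.

class GestureTracker:
    """Tracks which user currently controls the screen."""

    def __init__(self):
        self.active_user = None  # nobody is locked on yet

    def on_gesture(self, user_id, gesture):
        # Pushing an arm toward the sensor claims control,
        # but only when no one currently holds it.
        if gesture == "push" and self.active_user is None:
            self.active_user = user_id
            return f"user {user_id} locked on"
        # Gestures from anyone else -- even someone walking in
        # front of the sensor -- are simply ignored.
        if user_id != self.active_user:
            return "ignored"
        # The release gesture renounces control, freeing the
        # sensor for the next person who pushes toward it.
        if gesture == "release":
            self.active_user = None
            return f"user {user_id} released control"
        # Any other gesture from the locked-on user is acted upon.
        return f"user {user_id}: {gesture} handled"
```

In use, the sequence from the demo would play out like this: user 1 pushes to lock on, user 2's swipes are ignored, user 1 releases, and only then can user 2 push to take over.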
One of the most obvious uses here would be to manage one's television; channel changing with hand movements would make quite a few couch potatoes happy, and we could definitely see programs such as Windows Media Center taking advantage of hand swipes as a way to navigate through titles. There's also the traveling salesperson, who could probably wow a few clients by navigating through a presentation with nothing but his or her hands. In theory, at least, this could definitely be used in console or PC gaming (again, think EyeToy
if you're having a hard time imagining), but given the amount of possibilities here, we get the feeling that replacing the tried-and-true controller is the least of PrimeSense's worries. Have a peek at the video below to get an idea of how this would work in a living room setting, and feel free to let your imagination run wild soon after. We sure did.