PrimeSense talks full-body motion control at GDC, gives us a video demonstration

PrimeSense was formed in 2005, and unless you're a sickly obsessed silicon junkie, you've probably never heard of it. All that changes today. We sat down with the company at GDC to learn more about the chip that it produces, and we left with an imagination sore from being stretched so severely. Put simply, the company manufactures a microchip that, when paired with off-the-shelf optics, can create a 3D grid that a computer can understand. The purpose here, as you can likely glean, is to enable PlayStation Eye-like interactions, or as the company suggests, a "more natural" way to interface with devices you use every day. Rather than grabbing the remote to switch channels or snapping up that HTPC keyboard in order to flip through your stored DVD library, PrimeSense would rather you kick back on the sofa and gently flick your hands in order to turn to this week's Gossip Girl or sort through those classic horror flicks.

It's important to remember that PrimeSense isn't in the business of creating hardware, but today we were shown a reference design that looks an awful lot like an enlarged webcam. The device is completely USB-powered, and while the unit shown in the images and video here was obviously a standalone device, we were told that it would be possible to integrate the solution into displays and the like in the future. The company also mentioned that the depth calculation -- which enables it to map out a room and detect your entire body -- is done on-chip, with only the associated middleware taxing the CPU. Still, it has had success running this on Atom-level processors, so there's certainly no big horsepower hang-up preventing it from hitting a variety of markets.

Speaking of applications, PrimeSense wasn't able to talk specifics about partnerships (aside from that CyberLink integration announced at CES), but it did say that it's targeting a few areas in particular: home control, boardrooms, presenters and gaming -- and that's just to name a few. In the demo embedded below, you'll notice that the solution recognizes extremely subtle hand movements, and once it locks on to a user, it ignores hand motions from other members of the party until that user makes the gesture to renounce control; from there, another person can gain control by simply pushing their arm in the direction of the sensor. We even walked in front of the sensor while a user was controlling the screen, and it never lost connection or had any issues with our interference. The reference box also had a pair of speakers (up to six are supported), and we were told that developers could customize software to recognize sound inputs as well as body movements if they wanted.
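For the curious, the lock-on and handoff behavior described above amounts to a small state machine. PrimeSense hasn't published its middleware API, so the class and gesture names below ("push", "release", GestureTracker) are purely illustrative -- a minimal sketch of the logic as demonstrated, not the company's actual code:

```python
class GestureTracker:
    """Sketch of the single-user control scheme shown in the demo.

    One user at a time owns control of the screen. Gestures from
    bystanders are ignored until the active user makes the release
    gesture; then anyone can claim control by pushing toward the sensor.
    """

    def __init__(self):
        self.active_user = None  # nobody has control yet

    def handle_gesture(self, user_id, gesture):
        if self.active_user is None:
            # Control is up for grabs: a push toward the sensor claims it.
            if gesture == "push":
                self.active_user = user_id
                return f"user {user_id} now has control"
            return "ignored"
        if user_id != self.active_user:
            # Bystander gestures (or people walking past) don't interrupt.
            return "ignored"
        if gesture == "release":
            self.active_user = None
            return "control released"
        return f"user {user_id}: {gesture} accepted"
```

The key design point, and what impressed us in the demo, is that the "ignored" branch covers everyone but the active user -- which is why walking in front of the sensor didn't break the session.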



One of the most obvious uses here would be to manage one's television; channel changing with hand movements would make quite a few couch potatoes happy, and we could definitely see programs such as Windows Media Center taking advantage of hand swipes as a way to navigate through titles. There's also the traveling salesperson, who could probably wow a few clients by navigating through a presentation with nothing more than a wave of his or her hands. In theory, at least, this could definitely be used in console or PC gaming (again, think EyeToy if you're having a hard time imagining), but given the number of possibilities here, we get the feeling that replacing the tried-and-true controller is the least of PrimeSense's worries. Have a peek at the video below to get an idea of how this would work in a living room setting, and feel free to let your imagination run wild soon after. We sure did.