Gesture Control

Latest

  • Interactive storefront displays show up at Canadian Starbucks, window licking discouraged

    by Michael Gorman
    02.07.2011

    Starbucks has given the caffeinated crowd a new reason, other than the free WiFi, to stop by a couple of locations in Toronto and Vancouver -- interactive window displays! The displays take sidewalk passers-by on a journey to assemble their favorite Tazo teas, with the interactivity coming via a vinyl screen, a projector, and gesture controls. We've already seen an interactive storefront in the US, so it's about time our friends up north got some geekified advertising of their own. Vid's after the break.

  • ASUS Wavi Xtion motion sensing control system demoed at CES (video)

    by Darren Murph
    01.11.2011

    ASUS may not be anywhere close to ready for its Wavi Xtion to hit retail shelves (we're hearing Q2 of 2012), but that didn't stop our brethren over at Engadget Spanish from stopping by for a hands-on demonstration at CES. We'll spare you the details on how it works, but in practice, we learned that it's quite similar to Kinect. That's not shocking considering PrimeSense is behind both boxes, but the primary difference seemed to be reaction time: ASUS' solution wasn't quite as snappy as the Kinect, being slower to recognize and translate motions in our testing. Of course, we wouldn't expect a product that's 18 months out from mass production to be completely on top of its game, but feel free to head on past the break to see exactly what we mean.

  • Philips uWand hands-on

    by Vlad Savov
    01.09.2011

    Philips has been touting its uWand "direct pointer" remote for a good while now, but we've never had the chance to actually use one ourselves. Today that omission has been rectified: we got our mitts around this motion / gesture-based controller and gave it a quick test drive. It works by way of an infrared camera embedded in the front, which detects an IR beacon in your TV and thereby judges the remote's distance, tilt, and orientation relative to the set. That in turn allows for things like motion-based zooming, pointing the remote at the particular thing on the television you want to "click" on, and navigational shortcuts attached to gestures performed with the handheld. There's also a nice, spacious keyboard on the back, assisting in the use of all those Smart TVs CES has been littered with. Philips' intention is to license the technology out to other manufacturers, which could result in consumer products by 2012 -- so yeah, it's not terribly close to your living room yet, but our gallery of images is.
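
    To make the "direct pointer" idea concrete, here's a toy model of the math such a remote could use: treat the IR camera as a pinhole, and turn the beacon's position and apparent size in the frame into a pointing angle and a distance. The resolution, field of view, and beacon size below are illustrative assumptions, not Philips' specs.

        import math

        # Assumed camera parameters for a uWand-style pointer (not Philips' actual numbers).
        FRAME_W, FRAME_H = 640, 480                  # camera resolution in pixels
        FOV_H_DEG = 40.0                             # horizontal field of view
        BEACON_WIDTH_M = 0.10                        # physical width of the TV's IR beacon

        # Pinhole model: focal length in pixels, derived from the horizontal field of view.
        FOCAL_PX = (FRAME_W / 2) / math.tan(math.radians(FOV_H_DEG / 2))

        def pointing_from_beacon(cx, cy, beacon_px_width):
            """Estimate (yaw_deg, pitch_deg, distance_m) of the remote relative to the TV beacon."""
            yaw = math.degrees(math.atan2(cx - FRAME_W / 2, FOCAL_PX))
            pitch = math.degrees(math.atan2(FRAME_H / 2 - cy, FOCAL_PX))
            distance = BEACON_WIDTH_M * FOCAL_PX / beacon_px_width
            return yaw, pitch, distance

        # Beacon seen slightly right of centre and 24 pixels wide in the frame:
        print(pointing_from_beacon(400, 220, 24))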

  • LG's 2011 Smart TVs focus on easy, instant access

    by Richard Lawler
    01.03.2011

    We already got a peek at LG's Smart TV Upgrader box, but now the company is revealing a bit more about why it thinks anyone will opt for its app-laden displays this year. Deriding the QWERTY remotes offered by competitors, LG's big idea is a simple dashboard with four elements, controlled by its Magic Motion gesture control remote. We'll still need some hands-on time with its DLNA sharing and web browser to see if the simple life really is the way to go, but if you were freaked out by all the buttons on Sony's Google TV remote, this may be just what you've been waiting for.

  • PrimeSense and ASUS team up, bring Kinect-like Wavi Xtion to your living room TV (update)

    by Sean Hollister
    01.03.2011

    PrimeSense provides some of the brains behind Microsoft's Kinect, and wants a bigger piece of the pie; ASUS has a reputation for announcing wonderfully wacky peripherals every year. At CES 2011, the Wavi Xtion will check off both boxes nicely. In a nutshell, the Xtion is a PrimeSense 3D depth camera built exclusively for the PC, but with an important twist -- it connects to an ASUS Wavi box, which wirelessly streams its data to your living room HTPC over the 5GHz band. Oh, and should ASUS attract enough developers, it will even pull down applications from an Xtion online store. ASUS says we'll see the package commercially available around the world in Q2 of next year -- with a UI and a selection of apps and games on board -- but it'll release an Xtion PRO developer kit in February to tempt all you Kinect hackers into coding magical things for the platform. No more details for now, but there's an event in Vegas this week where ASUS is all but guaranteed to show it off. PR after the break.

    Update: Did we say HTPC? Turns out it doesn't quite work that way -- the Wavi is actually a pair of boxes that wirelessly sling data between them. You put the Xtion sensor on top of your TV, connect it to Wavi #1, then plug Wavi #2 into a PC up to 25 meters away. Mind you, it looks like the Xtion may not be quite as capable as Microsoft's unit, as there's only infrared hardware inside -- it might be fine for gesture control, but don't expect any augmented reality lightsaber fights. See some mockups below!

  • Elliptic Labs set to save your iPad from smudges with 3D gesture-sensing dock (video)

    by Tim Stevens
    12.22.2010

    The dream of kitchen computing still isn't here, with many chefs forced to read from archaic paper-based recipe lists or, worse yet, memorize the things. Maybe all we need is a way to interact with our gadgets without getting them all messy, and maybe Elliptic Labs can get us there. Finally. The company has been teasing us with its 3D gesture interface for years now, and it looks set to finally show off an actual product: a motion-sensing iPad dock prototype making its debut at CES in a few weeks. The idea is you perch this sucker in your kitchen and it gives you full control whether you're kneading sourdough or mixing meatballs, keeping your tablet streak-free -- and hygienic. That seems of somewhat limited usefulness to us, but check out the video of an earlier prototype below and see if it doesn't make you want to bake some cookies. And, if it does, feel free to bring us some.

  • Kinect admits itself to hospital, treated for gesture control of medical images

    by Darren Murph
    12.21.2010

    At this point, we all have a serious question to ask: is there anything the Kinect can't do? While Microsoft has managed to move quite a few of the camera-laden tubes, a good number of 'em have been put to use in applications not named gaming. Take the Virtopsy Project, for instance. This particular setup uses the Kinect camera bar to control a PACS system (OsiriX, in this case), and it relies on software built on ofxKinect, libfreenect, and openFrameworks. That's a lot of technobabble for sure, but as they say, the proof is in the YouTube video. Ever dreamed of swirling medical images around with hand gestures? Head on down and mash play -- fantasyland awaits.
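
    For the curious, the plumbing in hacks like this tends to be a thin layer over libfreenect: grab depth frames, find the nearest blob (presumably a hand), and watch how it moves. Below is a rough sketch of swipe detection in that spirit using the OpenKinect Python bindings; the thresholds, window size, and the mapping to a PACS action are illustrative assumptions, not the Virtopsy code.

        import numpy as np
        import freenect   # OpenKinect Python wrapper

        def nearest_point_x(depth):
            """Column of the closest valid pixel (raw 11-bit depth; 2047 means no reading)."""
            valid = depth < 2047
            if not valid.any():
                return None
            _, cols = np.where(depth == depth[valid].min())
            return cols.mean()

        history = []
        while True:
            depth, _ = freenect.sync_get_depth()     # 640 x 480 array of raw depth values
            x = nearest_point_x(depth)
            if x is None:
                continue
            history = (history + [x])[-10:]          # keep roughly the last third of a second
            if len(history) == 10 and history[-1] - history[0] > 200:
                print("swipe right -> next image series")   # stand-in for a real OsiriX command
                history = []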

  • Microsoft seeking to quadruple Kinect accuracy?

    by Sean Hollister
    12.19.2010

    Hacked your Kinect recently? Then you probably know something most regular Xbox 360 gamers don't -- namely, that the Kinect's cameras are actually capable of higher resolution than the game console itself supports. Though Microsoft originally told us it ran at 320 x 240, you'll find both the color and depth cameras deliver 640 x 480 images if you hook the peripheral up to a PC, and now an anonymous source tells Eurogamer that Microsoft wants to do the very same in the video game space. Reportedly, Redmond artificially limited the Kinect on the console in order to leave room for other USB peripherals to run at the same time, but if the company can find a way around the limitation, it could issue a firmware update making the Kinect sensitive enough to detect individual finger motions -- and open the door to much finer-grained gesture control. One of multiple ways Microsoft intends to make the world of Minority Report a reality, we're sure.
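
    If you want to see the full-resolution streams for yourself, the check is a few lines with the OpenKinect / libfreenect Python bindings (assuming they're installed and the Kinect is plugged into a PC):

        import freenect

        depth, _ = freenect.sync_get_depth()   # frame from the depth (IR) camera
        video, _ = freenect.sync_get_video()   # frame from the color camera
        # Both report 480 x 640 -- i.e. 640 x 480 images, not the 320 x 240 the console uses.
        print(depth.shape, video.shape)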

  • Gesture-controlled robot arm enables civilization's most meta high five

    by Paul Miller
    12.15.2010

    This video, criminally, doesn't actually show any high fives, but we're sure the students at the University of Tsukuba have sustained endless LOLs over the past few months, pushing their gesture-driven robot arm system to the limits of human-robot high five interaction. The system itself is relatively simple: it uses two cameras to track a hand's movements, including specific finger gestures, which are then processed and translated into robotic movement in real time. The end result is basically the world's most elaborate claw machine game, as demonstrated above.
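
    The "two cameras" part is the interesting bit: seeing the same fingertip at slightly different horizontal positions in each view lets the system recover its depth by triangulation. Here's a toy version of that calculation with made-up calibration numbers -- we don't know the Tsukuba team's actual setup.

        FOCAL_PX = 700.0     # assumed focal length of each camera, in pixels
        BASELINE_M = 0.12    # assumed distance between the two cameras, in metres

        def triangulate(x_left, x_right, y, cx=320.0, cy=240.0):
            """Recover a 3D point (in metres) from a fingertip's pixel position in each view."""
            disparity = x_left - x_right                 # horizontal shift between the two images
            z = FOCAL_PX * BASELINE_M / disparity        # depth: closer objects shift more
            x = (x_left - cx) * z / FOCAL_PX
            y3 = (y - cy) * z / FOCAL_PX
            return x, y3, z

        # Fingertip roughly 2.5 m away, a little right of and below the image centre:
        print(triangulate(352.0, 318.0, 260.0))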

  • Kinect Hacks Daily, Episode 47: Kinect taught to control XBMC through hand gestures

    by Paul Miller
    12.15.2010

    One of our favorite parts of Kinect, at least theoretically, was the idea of controller-free and remote-free control of the dashboard and media playback. Sure, it's never going to be as optimized and snappy as those tried-and-true digital buttons, but it's a great party trick, and we're all about the party tricks. Well, now you can get some of that gesture mojo going on with your XBMC setup -- and we're guessing you'll eventually be able to control just about anything else you'd navigate with basic "left, right, click" actions. Our only suggestion? Get some of that Dance Central-style menu navigation going on here. That goes for you, too, Microsoft. [Thanks, Joshua]
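
    To give a feel for the receiving end: once a hack has classified a gesture as "left," "right," or "click," driving XBMC is just a matter of poking its JSON-RPC interface. A minimal sketch, assuming a build (recent XBMC / Kodi) that exposes the Input namespace and has the web server enabled on localhost:8080:

        import json
        import urllib.request

        XBMC_RPC = "http://localhost:8080/jsonrpc"
        GESTURE_TO_METHOD = {
            "swipe_left": "Input.Left",
            "swipe_right": "Input.Right",
            "push": "Input.Select",
        }

        def send_gesture(gesture):
            """Translate a recognized gesture into the corresponding XBMC navigation call."""
            payload = {"jsonrpc": "2.0", "id": 1, "method": GESTURE_TO_METHOD[gesture]}
            req = urllib.request.Request(
                XBMC_RPC,
                data=json.dumps(payload).encode(),
                headers={"Content-Type": "application/json"},
            )
            return urllib.request.urlopen(req).read()

        send_gesture("swipe_right")   # move one item to the right in the dashboard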

  • Kinect finally fulfills its Minority Report destiny (video)

    by Vlad Savov
    12.09.2010

    Not to denigrate the numerous fine hacks that Kinect's undergone since its launch, but it's always nice to see the professionals come in and shake things up a little. A crew from MIT's brain labs has put together a hand detection system on Microsoft's ultra-versatile cam, which is sophisticated enough to recognize the position of both your palms and fingers. Just as a demonstration, they've tied that good stuff up to a little picture-scrolling UI, and you won't be surprised to hear that it's the closest thing to Minority Report's interactive gesture-based interface that we've seen yet. And it's all achieved with a freaking console peripheral. Video after the break.
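
    We don't have MIT's code, but the general recipe for finger detection from a depth image is well worn: threshold the hand out of the depth map, take its contour, and count the convexity defects (the valleys between fingers). A generic OpenCV 4 sketch of that idea, with assumed depth thresholds:

        import cv2
        import numpy as np

        def count_fingers(depth, near=500, far=700):
            """Rough finger count from one depth frame (near/far in the same units as depth)."""
            hand_mask = ((depth > near) & (depth < far)).astype(np.uint8) * 255
            contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                return 0
            hand = max(contours, key=cv2.contourArea)      # assume the biggest blob is the hand
            hull = cv2.convexHull(hand, returnPoints=False)
            defects = cv2.convexityDefects(hand, hull)
            if defects is None:
                return 0
            # Deep defects correspond to the gaps between extended fingers.
            deep = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10000)
            return deep + 1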

  • Kinect used to control Super Mario on a PC, redefine convergence (video)

    by Vlad Savov
    11.28.2010

    If, like us, you've been waiting to see Kinect in control of a truly marquee game, your wait has now come to an end. The same fella that brought us the Kinect lightsaber has returned with a hack enabling eager nostalgics to enjoy a bout of Super Mario controlled only by their body contortions. OpenKinect was used to get the motion-sensing peripheral -- originally intended exclusively for use with an Xbox 360 -- to communicate with his PC, while a simple NES emulator took care of bringing the 25-year-old plumber to life. The video awaits after the break.
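
    The glue layer in a hack like this is usually tiny: read a depth frame via OpenKinect, work out roughly where the player's body is, and fake the keyboard input the NES emulator already understands. A hedged sketch of that idea -- the depth threshold, lean thresholds, and pyautogui key names are assumptions, not the original hack:

        import time
        import numpy as np
        import freenect    # OpenKinect Python bindings
        import pyautogui   # used to fake the emulator's keyboard input

        while True:
            depth, _ = freenect.sync_get_depth()
            body = depth < 800                    # crude mask in raw depth units, roughly "within a metre or so"
            if body.any():
                centroid_x = np.argwhere(body)[:, 1].mean()   # mean column of the body pixels
                if centroid_x < 240:              # player leaning into the left third of the frame
                    pyautogui.press("left")
                elif centroid_x > 400:            # leaning into the right third
                    pyautogui.press("right")
            time.sleep(0.1)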

  • Microsoft buys Canesta, continues camera-based domination of our interfaces

    by Sean Hollister
    10.30.2010

    It seems that Microsoft's taken the camera to heart following its dismissal of the pen -- the company bought 3DV, collaborated with PrimeSense on Kinect, and today it's apparently finalized a deal to acquire 3D CMOS camera chipmaker Canesta as well. In case you've already forgotten, the latter company is the one that made a paid actor look particularly smug last year by allowing the gent to control his television with a flick of the wrist. Things have progressed a good bit further than that, however, as you'll see in a demo video after the break, and Canesta president and CEO Jim Spare says he expects the company's stuff to "see wide adoption across many applications that embody the full potential of the technology" under Microsoft's reign. Press release after the break.

  • Glowing Pathfinder Bugs installation puts the 'Minority Report' interface to good use - in a sand box (video)

    by Joseph L. Flatley
    07.30.2010

    Nestled among the various booths at SIGGRAPH 2010 was a unique installation called Glowing Pathfinder Bugs. Created by Squidsoup and Anthony Rowe, this interactive art piece uses projectors to place "bugs" made out of light in a sandbox, coupled with a 3D gesture-based interface that allows people to pick up, move, and even breed the creatures. The system even takes the topography of the sand itself into consideration: altering the sand will alter the bugs' paths. It's nice to see someone put an interface technology to good use for a change! Video after the break.

  • Hitachi shows off new gesture-based interface, touts grand plans

    by Donald Melanson
    07.29.2010

    Hitachi's already dipped its toes (or hands, as it were) into the gesture-based waters before, but it looks to have refined things quite a bit for its latest Minority Report-esque demo, which the company is showing off as part of its 100th anniversary celebration. While complete details are a bit light, the system does seem to be reasonably responsive, and appears to rely on a projection-based system and a single camera to track movements. Perhaps what's most interesting, however, is that Hitachi eventually sees systems like this being used in everything from digital signage to medical applications -- and, yes, even TVs and desktop computers (though not before mid-2011 at the earliest). Head on past the break to check it out in action.

  • Fraunhofer FIT touch-free gesture-control for multiple users (video)

    by Joseph L. Flatley
    07.21.2010

    It seems like everyone is cooking up their own touch-free, gesture-based control technology, just as every blogger is destined to refer to it as "Minority Report-like," "Minority Report-esque," or "Tom Cruise-tastic!" The newest such project, from Fraunhofer's FIT institute, has recently appeared on the YouTubes, where we must say it looks pretty darn good. Not only does it not require special gloves or markers, it also works in real time and can support multiple users (and multiple fingers). The researchers hope to use it for working with complex simulation data and in education, although there are some kinks to be worked out: currently, elements like the reflections caused by wristwatches and the orientation of the palm confuse the system. That said, the demo is pretty rad! See for yourself after the break.

  • Microsoft hints at touchless Surface combining camera and transparent OLED (video)

    by Sean Hollister
    06.29.2010

    We've always wondered whether Microsoft's multitouch table would actually ever arrive, dreaming of Minority Report hijinx all the while, but after seeing what the company's Applied Sciences Group is currently cooking up -- a touchless telepresence display -- we'd rather drop that antiquated pinch-to-zoom stuff in favor of what might be Surface's next generation. Starting with one of Samsung's prototype transparent OLED panels, Microsoft dropped a sub-two-inch camera behind the glass, creating a 3D gesture control interface that tracks your every move by literally seeing through the display. Combined with that proprietary wedge-shaped lens we saw earlier this month and some good ol' Johnny Chung Lee headtracking by the man himself, we're looking at one hell of a screen. Don't you dare read another word without seeing the prototype in a trifecta of videos after the break.

  • Kinect tech destined for TV-embedded greatness in 2011, HTPC integration later this year

    by Sean Hollister
    06.23.2010

    From Tel Aviv unknown to Xbox gaming wunderkind, PrimeSense has already had quite a run, but the camera-control tech that powers Kinect is destined for new applications before long. VP Adi Berenson tells us the company's already signed deals to put PrimeSense inside HTPCs by the end of the year, and has at least one cable company ready to launch a gesture-controlled set-top box by summer 2011. The end goal is to provide natural human control over TV-based media consumption, said Berenson, who's working to get cameras into TVs themselves sometime late next year. Like Kinect, these solutions will have a pair of 640 x 480 camera sensors to measure user position in 3D space, but don't expect them to have motorized tilt or voice recognition -- PrimeSense says it won't be able to make those available to manufacturers, as they're Microsoft's ideas. The gesture recognition has reportedly evolved, though, and we're eager to check that out soon. See what it used to look like in our GDC 2010 preview.

    Update: Just to be absolutely clear, it's not Microsoft's Kinect that's slated for an HTPC and set-top box near you, but rather PrimeSense, the 3D camera sensor technology behind it.

  • Fujitsu's motion sensing laptop interface makes no sense (video)

    by Thomas Ricker
    06.09.2010

    We're not sure what Fujitsu is thinking here, but it has to stop. Get a load of its motion control interface running on a 15.6-inch laptop. Yes, a 15-inch laptop. We might be able to understand this if it were plugged into a big flat panel television or projector, but trying to manipulate those itty bitty controls from 10 feet away is, well, silly. The Core i3-350M-powered Fujitsu LifeBook AH700/5A does feature HDMI-out, but you still have to place the laptop in front of you (and the TV) with the display popped open so that the camera can see your movements. On a positive note, it looks like a great way to develop your wax-on / wax-off ninja tuna skills.

  • EyeSight's hand-waving, gesture-based UI now available for Android (video)

    by Tim Stevens
    06.08.2010

    Sure, the Evo's front-facing camera enables you to call your snookums and let them see your mug while you two exchange sweet nothings. But wouldn't it be much better if you could tell your phone to talk to the hand? Now it can... at least in theory, with the availability of eyeSight's libraries for Android. EyeSight's Natural User Interface relies on a phone's camera to detect hand motions, enabling developers to write apps that change tracks, ignore callers, and display text messages with a wave. The downside is that those apps need to be specifically written to work this way, and while the libraries have been available for Nokia handsets since last year, right now we're seeing a whopping four programs that use them (including the hugely important "Fart Control," which turns your phone into a "motion detecting fart machine"). So you should probably not expect a revolution here either. A video demo from the Nokia days is embedded just below.
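
    EyeSight's SDK is proprietary and we haven't seen its API, but the underlying idea -- spotting a hand wave in front of the camera without a touch -- can be approximated with plain frame differencing. A generic OpenCV sketch of that idea (desktop Python here rather than Android, purely for illustration):

        import cv2

        cap = cv2.VideoCapture(0)      # on a laptop, the webcam stands in for the phone's front camera
        ok, prev = cap.read()
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(gray, prev)
            prev = gray
            # If a large fraction of the pixels changed between frames, call it a wave.
            if (diff > 40).mean() > 0.2:
                print("wave detected -> e.g. ignore the incoming call")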