gesturecontrol

Latest

  • Kinect used to control Super Mario on a PC, redefine convergence (video)

    by Vlad Savov
    11.28.2010

    If, like us, you've been waiting to see Kinect in control of a truly marquee game, your wait has now come to an end. The same fella who brought us the Kinect lightsaber has returned with a hack enabling eager nostalgics to enjoy a bout of Super Mario controlled only by their body contortions. OpenKinect was used to get the motion-sensing peripheral -- originally intended exclusively for use with an Xbox 360 -- to communicate with his PC, while a simple NES emulator took care of bringing the 25-year-old plumber to life. The video awaits after the break.
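The pipeline described above -- depth camera in, emulated button presses out -- can be sketched in miniature. Everything here is a hypothetical illustration, not code from the actual hack: a real version would pull frames via libfreenect and inject keystrokes into the emulator, while this toy reads a depth frame as plain lists of millimeter values.

```python
# Hypothetical sketch: turn a depth frame into NES d-pad presses.
# A real hack would read frames via libfreenect and send keys to an
# emulator; here a "frame" is just a list of rows of depth values (mm).

def body_centroid(depth_frame, near_mm=500, far_mm=1500):
    """Centroid (x, y) of pixels inside the player's depth band."""
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(depth_frame):
        for x, d in enumerate(row):
            if near_mm <= d <= far_mm:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None
    return xs / n, ys / n

def buttons_for(centroid, frame_width):
    """Leaning left/right of center maps to LEFT/RIGHT; else no input."""
    if centroid is None:
        return set()
    x, _ = centroid
    if x < frame_width * 0.4:
        return {"LEFT"}
    if x > frame_width * 0.6:
        return {"RIGHT"}
    return set()

# Player standing in the left third of a tiny 6x4 frame (2000 = background).
frame = [[2000, 2000, 2000, 2000, 2000, 2000],
         [800,  800,  2000, 2000, 2000, 2000],
         [800,  800,  2000, 2000, 2000, 2000],
         [2000, 2000, 2000, 2000, 2000, 2000]]
print(buttons_for(body_centroid(frame), 6))  # -> {'LEFT'}
```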

  • Microsoft buys Canesta, continues camera-based domination of our interfaces

    by Sean Hollister
    10.30.2010

    It seems that Microsoft's taken the camera to heart following its dismissal of the pen -- the company bought 3DV, collaborated with PrimeSense on Kinect, and today it's apparently finalized a deal to acquire 3D CMOS camera chipmaker Canesta as well. In case you've already forgotten, the latter company is the one that made a paid actor look particularly smug last year, by allowing the gent to control his television with a flick of the wrist. Things have progressed a good bit further than that, however, as you'll see in a demo video after the break, and Canesta president and CEO Jim Spare says he expects the company's products to "see wide adoption across many applications that embody the full potential of the technology" under Microsoft's reign. Press release after the break.

  • Glowing Pathfinder Bugs installation puts the 'Minority Report' interface to good use - in a sandbox (video)

    by Joseph L. Flatley
    07.30.2010

    Nestled among the various booths at SIGGRAPH 2010 was a unique installation called Glowing Pathfinder Bugs. Created by Squidsoup and Anthony Rowe, this interactive art piece uses projectors to place "bugs" made out of light in a sandbox, coupled with a 3D gesture-based interface that allows people to pick up, move, and even breed the creatures. The system even takes the topography of the sand itself into consideration: altering the sand will alter the bugs' paths. It's nice to see someone put an interface technology to good use for a change! Video after the break.

  • Hitachi shows off new gesture-based interface, touts grand plans

    by Donald Melanson
    07.29.2010

    Hitachi's already dipped its toes (or hands, as it were) into the gesture-based waters before, but it looks to have refined things quite a bit for its latest Minority Report-esque demo, which the company is showing off as part of its 100th anniversary celebration. While complete details are a bit light, the system does seem to be reasonably responsive, and appears to rely on a projection-based system and a single camera to track movements. Perhaps what's most interesting, however, is that Hitachi eventually sees systems like this being used in everything from digital signage to medical applications -- and, yes, even TVs and desktop computers (though not before mid-2011 at the earliest). Head on past the break to check it out in action.

  • Fraunhofer FIT touch-free gesture-control for multiple users (video)

    by Joseph L. Flatley
    07.21.2010

    It seems like everyone is cooking up their own touch-free gesture-based control technology, just like every blogger is destined to refer to it as "Minority Report-like" or "Minority Report-esque," or "Tom Cruise-tastic!" The newest such project, from Fraunhofer's FIT, has recently appeared on the YouTubes, where we must say it looks pretty darn good. Not only does it not require special gloves or markers, this thing also works in real time and can support multiple users (and multiple fingers). The researchers hope to use this for working with complex simulation data and in education, although there are some kinks to be worked out: currently, elements like the reflections caused by wristwatches and the orientation of the palm confuse the system. That said, the demo is pretty rad! See for yourself after the break.

  • Microsoft hints at touchless Surface combining camera and transparent OLED (video)

    by Sean Hollister
    06.29.2010

    We've always wondered whether Microsoft's multitouch table would actually ever arrive, dreaming of Minority Report hijinx all the while, but after seeing what the company's Applied Sciences Group is currently cooking up -- a touchless telepresence display -- we'd rather drop that antiquated pinch-to-zoom stuff in favor of what might be Surface's next generation. Starting with one of Samsung's prototype transparent OLED panels, Microsoft dropped a sub-two-inch camera behind the glass, creating a 3D gesture control interface that tracks your every move by literally seeing through the display. Combined with that proprietary wedge-shaped lens we saw earlier this month and some good ol' Johnny Chung Lee headtracking by the man himself, we're looking at one hell of a screen. Don't you dare read another word without seeing the prototype in a trifecta of videos after the break.

  • Kinect tech destined for TV-embedded greatness in 2011, HTPC integration later this year

    by Sean Hollister
    06.23.2010

    From Tel Aviv unknown to Xbox gaming wunderkind, PrimeSense has already had quite a run, but the camera-control tech that powers Kinect is destined for new applications before long. VP Adi Berenson tells us the company's already signed deals to put PrimeSense inside HTPCs by the end of the year, and has at least one cable company ready to launch a gesture-controlled set-top box by summer 2011. The end goal is to provide natural human control over TV-based media consumption, said Berenson, who's working to get cameras in TVs themselves sometime late next year. Like Kinect, these solutions will have a pair of 640 x 480 camera sensors to measure user position in 3D space, but don't expect them to have motorized tilt functionality or voice recognition -- PrimeSense said it won't be able to make those available for manufacturers, as they're all Microsoft ideas. The gesture recognition has reportedly evolved, though, and we're eager to check that out soon. See what it used to look like in our GDC 2010 preview. Update: Just to be absolutely clear, this is not Microsoft's Kinect that's slated for an HTPC and set-top box near you, but rather PrimeSense, the 3D camera sensor technology behind it.

  • Fujitsu's motion sensing laptop interface makes no sense (video)

    by Thomas Ricker
    06.09.2010

    We're not sure what Fujitsu is thinking here, but it has to stop. Get a load of its motion control interface running on a 15.6-inch laptop. Yes, a 15-inch laptop. We might be able to understand this if it was plugged into a big flat panel television or projector, but trying to manipulate those itty bitty controls from 10 feet away is, well, silly. The Core i3-350M-powered Fujitsu LifeBook AH700/5A does feature HDMI-out, but you still have to place the laptop in front of you (and the TV) with the display popped open so that the camera can see your movements. On a positive note, it looks like a great way to develop your wax-on / wax-off ninja tuna skills.

  • EyeSight's hand-waving, gesture-based UI now available for Android (video)

    by Tim Stevens
    06.08.2010

    Sure, the Evo's front-facing camera enables you to call your snookums and let them see your mug while you two exchange sweet nothings. But, wouldn't it be much better if you could tell your phone to talk to the hand? Now it can... at least in theory, with the availability of eyeSight's libraries for Android. EyeSight's Natural User Interface relies on a phone's camera to detect hand motions, enabling developers to write apps that change tracks, ignore callers, and display text messages with a wave. The downside is that those apps need to be specifically written to work in this way, and while the libraries have been available for Nokia handsets since last year, right now we're seeing a whopping four programs that use it (including the hugely important "Fart Control," which turns your phone into a "motion detecting fart machine"). So, you should probably not expect a revolution here either. Video demo from the Nokia days is embedded just below.

  • Minority Report UI designer demos his tech at TED (video)

    by Sean Hollister
    06.03.2010

    In February 2010, the man who built the technology of Minority Report twice -- once for the movie, and once in real life -- spoke at TED about the future of user interface design. Yesterday, TED posted John Underkoffler's entire fifteen-minute video presentation -- a copy of which you'll find right after the break. Get a curated glimpse into his company's tech in the following demo, and hear from the man himself when the gloves might come off. And if that doesn't satisfy your appetite, read an in-depth interview with Underkoffler at our more coverage link.

  • Microsoft Research toys with the cosmos... using forefinger and thumb (video)

    by Sean Hollister
    05.31.2010

    We've always been suckers for Minority Report tech, and Microsoft Research's latest attempt is not to be missed. Thought pinch-to-zoom was quaint? Try pinching the sky in this geodesic dome. Though the cardboard-and-paper-clip structure isn't all that (unless you're the arts and crafts type), the inside houses a projectiondesign DLP unit with a custom infrared camera system that can turn simple hand gestures into virtual interstellar travel, 360-degree video teleconferencing and more. You'll find a pair of videos demonstrating the concept after the break, but try not to get too attached -- if you're anything like us, your poor heart can't handle another Courier axing.

  • Toshiba AirSwing UI puts you on the screen with your data

    by Joseph L. Flatley
    05.28.2010

    We've seen a Minority Report-esque interface or two hundred by this point, but Toshiba's AirSwing really caught our attention. Using little more than a webcam and some software, this bad boy places a semi-transparent image of the operator on the display -- all the easier to maneuver through the menus. And according to Toshiba, that software only utilizes about three percent of a 400MHz ARM 11 CPU -- meaning that you have plenty of processor left for running your pre-crime diagnostics. There is no telling when something like this might become commercially available, but the company plans to bundle it in commercial displays for malls and the like. Video after the break.
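The semi-transparent operator image AirSwing describes is classic alpha blending of the webcam frame over the UI. Here is a minimal sketch of that idea -- the blend factor, the grayscale "frames," and every name below are illustrative assumptions, not Toshiba's actual pipeline:

```python
# Toy alpha blend: composite the operator's webcam frame over the menu.
# Frames are lists of rows of grayscale values (0-255), for simplicity.

def blend(ui_pixel, camera_pixel, alpha=0.5):
    """Mix a camera pixel over a UI pixel: (1 - a) * ui + a * cam."""
    return round((1 - alpha) * ui_pixel + alpha * camera_pixel)

def overlay(ui_frame, camera_frame, alpha=0.5):
    """Per-pixel composite of the camera frame over the UI frame."""
    return [[blend(u, c, alpha) for u, c in zip(ui_row, cam_row)]
            for ui_row, cam_row in zip(ui_frame, camera_frame)]

ui = [[200, 200], [200, 200]]   # bright menu background
cam = [[0, 100], [100, 0]]      # operator silhouette from the webcam
print(overlay(ui, cam))  # -> [[100, 150], [150, 100]]
```

A half-and-half blend like this is cheap per pixel, which fits the story's point about the software sipping only a few percent of a modest ARM CPU.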

  • MIT researchers develop the most fabulous gesture control technique yet

    by Joseph L. Flatley
    05.23.2010

    When looking for a cheap, reliable way to track gestures, Robert Wang and Jovan Popovic of MIT's Computer Science and Artificial Intelligence Laboratory came upon this notion: why not paint the operator's hands (or better yet, his Lycra gloves) in a manner that will allow the computer to differentiate between different parts of the hand, and differentiate between the hand and the background? Starting with something that Howie Mandel might have worn in the 80s, the researchers are able to use a simple webcam to track the hands' locations and gestures -- with relatively little lag. The glove itself is split into twenty patches made up of ten different colors, and while there's no telling when this technology will be available for consumers, something tells us that when it does become available it'll be very hard not to notice. Video after the break. Update: Just received a nice letter from Rob Wang, who points out that his website is the place to see more videos, get more info, and -- if you're lucky -- one day download the APIs so you can try it yourself. What are you waiting for?
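The core trick -- color patches the computer can tell apart -- can be shown with a toy nearest-color classifier. The palette, patch names, and pixel values below are made up for illustration; the real glove uses twenty patches over ten colors and a much more sophisticated pose lookup.

```python
# Toy sketch of the colored-glove idea: label each pixel by its nearest
# palette color, then locate each patch by the centroid of its pixels.

PALETTE = {                      # hypothetical patch colors (R, G, B)
    "thumb": (255, 0, 0),
    "index": (0, 255, 0),
    "background": (30, 30, 30),
}

def nearest_patch(pixel):
    """Label a pixel with the palette entry closest in RGB space."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(PALETTE, key=lambda name: dist2(pixel, PALETTE[name]))

def patch_centroids(image):
    """Mean (x, y) position of every labeled patch in the image."""
    sums = {}
    for y, row in enumerate(image):
        for x, px in enumerate(row):
            label = nearest_patch(px)
            sx, sy, n = sums.get(label, (0, 0, 0))
            sums[label] = (sx + x, sy + y, n + 1)
    return {k: (sx / n, sy / n) for k, (sx, sy, n) in sums.items()}

img = [[(250, 10, 10), (20, 20, 20)],
       [(20, 20, 20), (5, 240, 10)]]
print(patch_centroids(img))  # thumb at (0, 0), index at (1, 1)
```

Distinct, saturated colors keep the nearest-color decision robust against camera noise, which is why a webcam suffices.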

  • flOw wireless speaker concept flips jams based on your handling

    by Darren Murph
    05.23.2010

    She's but a concept at the moment, but David Boyce's flOw mockup certainly has legs. The five-speaker set can be arranged in a fanciful iPod docking station for in-home use, and on a whim, any of the speaker balls can be grabbed and taken elsewhere, all while the music follows via a touch of wireless magic. But that's hardly the kicker -- each ball has integrated gesture sensors and accelerometers, enabling the user to mute the volume by turning it over or switch from "Smooth Operator" to "Master of Puppets" by simply giving the speaker a vigorous shake. Talk about revolutionizing a played market sector. [Thanks, Paul]

  • Evoluce 47-inch HD multitouch display gets off-screen gesture control

    by Joseph L. Flatley
    05.13.2010

    Evoluce, the manufacturers of that mammoth 47-inch full HD touchscreen, are out of control! Apparently, they've decided that unlimited simultaneous touch inputs (and thus unlimited simultaneous phalanges) was not enough, so they've gone an' added gesture support -- up to half a meter from the device. Apparently this bad boy supports Windows 7, although if you want your interface du jour to put the "unlimited" in "multitouch" you'll most likely have to roll your own. Interested? Wealthy? Check out some righteous video and PR after the break.

  • Kogan's latest fantasy product? An IPTV with Chrome browser, WiFi & gaming

    by Richard Lawler
    04.10.2010

    The always-optimistic Kogan isn't just aiming at tablets; it's posted a video showing off a new IPTV that could do what the big boys won't when (if) it ships later this year. These Chinese-built HDTVs feature WiFi and video-on-demand widgets similar to what everyone else is doing, but things start to get interesting in this YouTube video (embedded after the break) around the 1:05 mark, when he pulls up a Chrome browser window and navigates the internet easily using the remote, as well as testing out some gesture-controlled games and a Bluetooth keyboard. More than a few no-name builders had displays with similar capabilities up and running at CES, so it's easy to see someone shipping one or two of these, but with Kogan's track record (what happened to those OLED TVs?) it's difficult to disguise our doubt that it will be among them. True believers are invited to post suggestions for other features on the company blog -- we're thinking a dedicated Engadget Show channel is just what the doctor ordered.

  • iPad apps: defining experiences from the first wave

    by Sean Hollister
    04.02.2010

    There are now over 1,348 approved apps for the iPad. That's on top of the 150,000 iPad-compatible iPhone programs already available in the App Store. When Apple's tablet launches, just hours from now, it will have a software library greater than that of any handheld in history -- not counting the occasional UMPC. That said, the vast majority of even those 1,348 iPad apps are not original. They were designed for the iPhone, a device with a comparatively pokey processor and a tiny screen, and most have just been tweaked slightly, upped in price and given an "HD" suffix -- as if that somehow justified the increased cost. Besides, we've seen the amazing potential programs have on iPhone, Android, BlackBerry, Windows Mobile and webOS when given access to a touchscreen, always-on data connection, GPS, cloud storage and WiFi -- but where are the apps that truly define iPad? What will take advantage of its extra headroom, new UI paradigms and multitouch real estate? Caught between netbook and smartphone, what does the iPad do that the iPhone cannot? After spending hours digging through the web and new iPad section of the App Store, we believe we have a number of reasonably compelling answers. Update: Now includes Wormhole Remote, TweetDeck, SkyGrid, Touchgrind HD, GoToMeeting, SplitBrowser, iDisplay, Geometry Wars and Drawing Pad.

  • AiLive shows off its LiveMove 2 software for building MotionPlus and PlayStation Move gestures

    by Paul Miller
    03.19.2010

    If you've been following closely, there are really two sorts of input available to the PlayStation Move. The one that gets the most love and screen time is the camera-based, 3D meatspace tracking that the PlayStation Eye performs in conjunction with the fancy colored ball at the end of the PlayStation Move wand, but most of the actual gameplay we've seen is in truth much more similar to the Wii's MotionPlus than Sony might want to let on. The MotionPlus and PS Move have very similar configurations of gyroscopes and accelerometers, and actually use the same software from AiLive (co-creators of MotionPlus) for developing the gesture recognition that goes into games. We actually got to see the LiveMove 2 development environment in action, and it's pretty impressive: basically, you tell a computer what gesture you want to perform (like "fist pump," for instance) and then perform a bunch of examples of that movement. LiveMove then figures out the range of allowable movement, and in playback mode shows you whether you're hitting the mark. AiLive showed us gestures as complicated as Graffiti (of Palm OS yore) handwriting recognition in the air, built with just a few example movements from people back at their offices. So, this is great news for developers dealing with the significant complication of all these sensors, but at the same time we can't help but be a little disappointed. LiveMove 2 doesn't even use the PlayStation Eye, and as we mentioned in our hands-on impressions of PlayStation Move, we could really sense that a lot of our in-game actions were built from predefined gestures, not us interacting with the 3D environment in any "real" or physics-based way. It's great tech either way, but hopefully that's something that can be improved upon by launch or soon after. Check out a demo of LiveMove in action after the break.
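The train-by-example workflow described above -- perform a few examples, learn the range of allowable movement, then check playback against it -- can be approximated with per-sample min/max envelopes. This is a crude stand-in for whatever AiLive actually does; the traces, function names, and padding value are all made up:

```python
# Crude stand-in for LiveMove-style training: learn a per-sample
# envelope (min..max, padded) from example traces of one gesture,
# then check whether a playback trace stays inside that envelope.

def train(examples, pad=0.1):
    """examples: equal-length lists of sensor readings for one gesture."""
    envelope = []
    for samples in zip(*examples):
        lo, hi = min(samples), max(samples)
        envelope.append((lo - pad, hi + pad))
    return envelope

def hits_the_mark(envelope, trace):
    """True if every sample of the playback falls inside the envelope."""
    return all(lo <= s <= hi for (lo, hi), s in zip(envelope, trace))

# Three example "fist pump" accelerometer traces (made-up numbers).
examples = [[0.0, 0.9, 1.8, 0.9, 0.0],
            [0.1, 1.0, 2.0, 1.0, 0.1],
            [0.0, 1.1, 1.9, 0.8, 0.0]]
env = train(examples)
print(hits_the_mark(env, [0.05, 1.0, 1.95, 0.9, 0.05]))  # -> True
print(hits_the_mark(env, [0.0, 0.0, 0.0, 0.0, 0.0]))     # -> False
```

Building recognizers from a handful of demonstrations like this also explains the article's disappointment: a gesture either matches a trained template or it doesn't, with no true physics interaction in between.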

  • Fujitsu's Air Command Plus guides PowerPoint, not B-52s (video)

    by Tim Stevens
    02.13.2010

    Multitouch is great and all, but what if you can't reach the screen? What if you could touch without touching? That's the idea behind Fujitsu's Air Command Plus, a device that promises a Minority Report-like experience but, after watching the video below, it sure seems like pretty standard gesture control. You can browse through pictures by flicking left or right, adjust a volume dial by rotating, and navigate PowerPoint slides as if you were a master of the black arts. But there's nothing metaphysical about it, and it's destined to get a lot more real in March when Fujitsu is actually pledging to ship the thing. Eat your heart out, Tom Cruise. [Thanks, Hanco]

  • Project Natal makes a Smallville cameo, does not guarantee ability to fly*

    by Richard Lawler
    02.12.2010

    We didn't get any more details about Microsoft's Project Natal add-on for Xbox 360 from the X10 event, but tonight Smallville viewers got to watch someone else whipping their arms and legs around trying to catch imaginary balls flying out of their TV screen. Short of some actual time gesticulating wildly in front of that motion sensing cam, this is the closest thing we've got to actually using it, but as the clip (embedded after the break) shows, girls are likely to be more impressed by someone who can fly.