gesture

Latest

  • iPhone rumor two-pack: multitasking gestures and MobileMe Photo Streams? (Update: those pics are real)

    by Chris Ziegler
    01.19.2011

    Looks like the iPad might not have an exclusive on those new "multitasking" gestures unearthed in the latest iOS 4.3 beta, because BGR's got some shots up of a purported internal build that seem to indicate Cupertino intends to push them down to the iPhone line as well. Now, there are a couple of obvious red flags here -- using "four or five fingers" sounds pretty unreasonable on a 3.5-inch display, though it'd certainly tie in with recent rumors that Apple's turning sour on the physical Home button, and it might indicate that the complete redesign we're hearing about could include a slightly larger display. Of course, it could also mean this is really early software with the wrong (read: iPad) wording -- but at any rate, we could definitely see the benefit of, say, a two-finger gesture to swipe between apps.

    Now, on to part two: 9to5 Mac seems to have unearthed traces in the iOS 4.3 beta of a new feature called "Photo Streams," which appears to be -- you guessed it -- a way to share continuous streams of photos with friends you approve, presumably through MobileMe (which would fit in nicely with the Find My Friends stuff). Alone, that's not worth $99 a year -- plenty of other services offer similar functionality -- but we wouldn't be surprised if it were bundled in with a batch of MobileMe refreshes this year.

    Update: We just received an interesting tidbit of information on the above pic (and others that BGR has its hands on). According to sources, after the iPhone 4 barroom debacle, Apple made significant changes to how it keeps track of -- and identifies to third parties -- its devices. Namely, the company began adding notices to prototype phones' screens reading "Confidential and Proprietary, if found, please contact..." followed by a 408 number (that's Cupertino, of course), which prevents any misunderstanding from parties who may come across these devices. So what does it all mean? Well, not much, save that these photos appear to be the real deal... which means gestures may be headed to your iPhone. Intrigue!

  • Philips uWand hands-on

    by Vlad Savov
    01.09.2011

    Philips has been touting its uWand "direct pointer" remote for a good while now, but we've never had the chance to actually use one ourselves. Today that omission was rectified, as we got our mitts around this motion / gesture-based controller and gave it a quick test drive. It works via an infrared camera embedded in the front, which detects an IR beacon in your TV and thereby judges its own distance, tilt and orientation relative to the set. That allows for things like motion-based zooming, pointing the remote at the particular thing on the television you want to "click" on, and navigational shortcuts attached to gestures performed with the handheld. There's also a nice, spacious keyboard on the back, assisting in the use of all these Smart TVs CES has been littered with. Philips' intention is to license the technology out to other manufacturers, which could result in consumer products by 2012 -- so yeah, it's not terribly close to your living room yet, but our gallery of images is.
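
    For the curious, the pointing math is simple enough to sketch. Here's a minimal, hypothetical take in Python -- assuming a remote-mounted IR camera that reports the beacon's pixel position, with made-up resolution, screen and focal-length numbers, not anything Philips has published:

        # Hypothetical "direct pointer" math: the remote's IR camera reports
        # where the TV beacon sits in its frame, and we mirror that offset
        # into a screen cursor. All constants are assumptions.
        CAM_W, CAM_H = 640, 480          # IR camera resolution (assumed)
        SCREEN_W, SCREEN_H = 1920, 1080  # target TV resolution (assumed)

        def beacon_to_cursor(beacon_x, beacon_y):
            """Map the beacon's pixel position to a screen cursor.

            If the beacon appears left of frame center, the remote points
            right of the TV's center, so the cursor moves right."""
            u = 0.5 - (beacon_x - CAM_W / 2) / CAM_W   # mirror horizontally
            v = 0.5 - (beacon_y - CAM_H / 2) / CAM_H   # mirror vertically
            u, v = min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0)
            return int(u * (SCREEN_W - 1)), int(v * (SCREEN_H - 1))

        def beacon_distance(beacon_px, beacon_width_m=0.05, focal_px=800):
            """Remote-to-TV distance from the beacon's apparent width
            (pinhole model) -- one way a zoom gesture could be derived."""
            return focal_px * beacon_width_m / max(beacon_px, 1)

        if __name__ == "__main__":
            print(beacon_to_cursor(200, 240))  # beacon left -> cursor right
            print(beacon_distance(40))         # smaller beacon -> farther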

  • Elliptic Labs demonstrates its touchless user interface for iPad (with video)

    by Mike Schramm
    01.09.2011

    As promised a while back, we got to chat with Elliptic Labs here at CES, and CEO Stian Aldrin walked us through the touchless gesture technology his 15-person, Norway-based company is developing as a prototype. The whole thing is based on ultrasound, it turns out -- a small speaker kicks out frequencies higher than the ear can hear, and a set of microphones listens in on the reflections, using an algorithm to calculate where your hand is as you wave it through the air. The result is a gesture-based control system for touchscreen devices, but without the actual touch. Aldrin told us that the system is already in use in a Norwegian hospital, where surgeons control touchscreen tablets without having to take their sanitized gloves off during surgery.

    Currently, the system only allows for a few simple gestures (swiping up and down, or left and right), but that's just a limitation of the demo units Elliptic Labs has created. Potentially, such a system could not only recognize the placement and speed of your hand passing by (indeed, one of the demos in the CES booth monitored both proximity to the screen and speed, flipping on-screen content faster if you pushed your hand by faster), but also track multiple points of movement, enabling things like multi-touch gestures in the air. You do have to be pretty close to the screen to operate the device -- rather than a big cone like the Kinect's, the system monitors a sphere around itself, so your hand has to enter that sphere to register.

    But Elliptic Labs (which already plans to be back at CES with an even bigger booth next year) suggests the system could be used for lots of things, from quick music controls to car controls, or anything else where you need to make a touch-style gesture without actually touching the screen. We've got exclusive video after the break of Aldrin demoing a dock version of the system, connected via Wi-Fi to an off-the-shelf iPad running a custom-made app.
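
    To make the ultrasound idea concrete, here's a toy 2D version in Python: a speaker at the origin, a few microphones at known offsets, and a small numerical solver that recovers the hand's position from echo timings. The mic layout and solver are our own illustrative assumptions, not Elliptic Labs' algorithm:

        # Toy 2D echo-location: each echo constrains the hand to an ellipse
        # |p| + |p - mic_i| = C * t_i; a few Gauss-Newton steps solve the
        # system. Mic positions and the solver are assumptions.
        import numpy as np

        C = 343.0  # speed of sound in air, m/s

        MICS = np.array([[-0.10, 0.0], [0.10, 0.0], [0.0, 0.12]])  # meters

        def locate_hand(echo_times):
            """Estimate hand position (x, y) from speaker->hand->mic times."""
            paths = C * np.asarray(echo_times)
            p = np.array([0.0, 0.15])  # guess: 15 cm in front of the device
            for _ in range(25):
                rows, residuals = [], []
                for mic, path in zip(MICS, paths):
                    d_spk = np.linalg.norm(p)
                    d_mic = np.linalg.norm(p - mic)
                    residuals.append(d_spk + d_mic - path)
                    rows.append(p / d_spk + (p - mic) / d_mic)  # gradient
                step, *_ = np.linalg.lstsq(np.array(rows),
                                           np.array(residuals), rcond=None)
                p = p - step
            return p

        if __name__ == "__main__":
            # Simulate echoes from a hand at (0.05, 0.20) and recover it.
            hand = np.array([0.05, 0.20])
            times = [(np.linalg.norm(hand) + np.linalg.norm(hand - m)) / C
                     for m in MICS]
            print(locate_hand(times))  # ~ [0.05 0.20]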

  • Movea SmartMotion Air Mini Keyboard remote and Air Mouse revealed alongside Sunrex partnership

    by Darren Murph
    01.08.2011

    Don't quote us on this, but we've got a feeling that remotes will be more than just remotes by the time 2012 rolls around. HDTV companies have been slyly adding motion support to their remotes here at CES, and with PrimeSense's technology going over so well in the Kinect, there's an obvious next step for TV control. Movea -- the company responsible for the Gyration Air Mouse and Air Mouse iOS app -- has just teamed up with Sunrex here at CES, with the newly formed relationship expected to yield new kit based on the former's MotionIC platform and SmartMotion technology. We're told to expect the first products to be available in Q1 of this year, with the SmartMotion Air Mini Keyboard remote and SmartMotion Air Mouse named in particular. The former includes a full four-row QWERTY keyboard and relies on 2.4GHz wireless technology, but images, pricing and availability details aren't available just yet.

  • Apple patent application suggests yet more possible gestures for iPods

    by Donald Melanson
    01.06.2011

    Apple's already put some basic gesture controls to use on its sixth-generation iPod nano, but a recently published patent application suggests it may have some grander designs for a no-look interface of sorts. As you can see above, Apple's using a nano in its illustrations for the patent, but the actual claims suggest the gestures wouldn't necessarily require a screen at all -- possibly for something similar to that back-side interface that also turned up in an Apple patent application? As for the gestures themselves, they'd apparently involve things like a single tap to pause or play, a double tap to skip forward, a triple tap to skip back, and a circular motion to control the volume. Of course, that aforementioned patent application was published in 2007 and we've yet to see anything come of it, so you may not want to hold your breath for this one either.
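
    That tap grammar is easy to picture in code. Here's a minimal, hypothetical Python sketch of such a no-look controller -- the event model and timing window are our assumptions, not Apple's:

        # Hypothetical no-look controller: count taps in a short burst and
        # dispatch per the patent's grammar (1 = play/pause, 2 = next,
        # 3 = previous, circle = volume). TAP_WINDOW is an assumption.
        import time

        TAP_WINDOW = 0.35  # seconds allowed between taps in one burst

        class NoLookController:
            def __init__(self, player):
                self.player = player
                self._taps = []

            def on_tap(self, now=None):
                now = time.monotonic() if now is None else now
                # Keep only taps that belong to the current burst.
                self._taps = [t for t in self._taps
                              if now - t <= TAP_WINDOW] + [now]

            def on_idle(self, now=None):
                """Call when no touch has been seen lately; fires the action."""
                now = time.monotonic() if now is None else now
                if self._taps and now - self._taps[-1] > TAP_WINDOW:
                    count, self._taps = len(self._taps), []
                    if count == 1:
                        self.player.toggle_play()
                    elif count == 2:
                        self.player.skip_forward()
                    elif count >= 3:
                        self.player.skip_back()

            def on_circle(self, degrees):
                """Circular motion maps to volume; sign gives direction."""
                self.player.change_volume(degrees / 360.0)

        class DemoPlayer:
            def toggle_play(self): print("play/pause")
            def skip_forward(self): print("next track")
            def skip_back(self): print("previous track")
            def change_volume(self, d): print(f"volume {d:+.2f}")

        if __name__ == "__main__":
            c = NoLookController(DemoPlayer())
            c.on_tap(0.00); c.on_tap(0.20)  # two taps within the window
            c.on_idle(0.60)                 # burst over -> "next track"
            c.on_circle(-90)                # quarter-turn -> volume -0.25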

  • Hillcrest Labs intros embedded motion control system for TVs, hopes for the best

    by Darren Murph
    01.05.2011

    PrimeSense had a feeling this kind of revolution would take over the television world when we spoke with the company last March at GDC, and sure enough, said revolution is happening. LG's already integrated motion controls into its sets with its Magic Remote, and now Hillcrest is providing a solution just in case the rest of the world wants to buy in. Hailed as a "turnkey solution," the Freespace MotionEngine is now integrated with Broadcom's BCM35230 digital TV SoC and BCM20730 single-chip Bluetooth solution. That's a lot of technobabble, for sure, but the long and short of it is this: by integrating this into televisions, Freespace / Bluetooth-enabled remotes equipped with the right sensors can let TV viewers control channel switching, volume levels and who knows what else with just their hands. The outfit will be busting out a demo here in Vegas, so we'll be doing our best to catch a glimpse.

  • Elliptic Labs to show off gesture-sensing iPad dock at CES 2011

    by Mike Schramm
    12.22.2010

    Elliptic Labs has been working on gesture-sensing technology for a while now (where you can just swipe your hand in the air instead of actually touching a screen), and rumor has it that the company will be showing off an iPad dock at next month's CES -- something that enables you to control Apple's magical and revolutionary device without actually touching it. The main use is apparently in the kitchen (where your hands might be messy from cooking, keeping you from wanting to grease up that screen like a pie plate), but I can see this functionality in a kiosk somewhere, or any system where you wouldn't want people actually laying hands on a device. There's a quick video after the break featuring one of the company's other devices, but presumably the same gestures would be used to control the iPad. It'll be interesting to see, too, just what kind of functionality the controller can offer. Swiping between screens wouldn't be hard, but I'd like to know if it offers any more granular control as well. Fortunately, TUAW will be live at CES, so we'll make sure to stop by Elliptic's booth and give it a try to let you know what it's like.

  • Elliptic Labs set to save your iPad from smudges with 3D gesture-sensing dock (video)

    by Tim Stevens
    12.22.2010

    The dream of kitchen computing still isn't here, with many chefs forced to read from archaic paper-based recipe lists or, worse yet, memorize the things. Maybe all we need is a way to interact with our gadgets without getting them all messy, and maybe Elliptic Labs can get us there. Finally. The company has been teasing us with its 3D gesture interface for years now and it looks set to finally show off an actual product, a motion-sensing iPad dock prototype making its debut at CES in a few weeks. The idea is you perch this sucker in your kitchen and it gives you full control whether you're kneading sourdough or mixing meatballs, keeping your tablet streak-free -- and hygienic. That seems like somewhat limited usefulness to us, but check out the video of an earlier prototype below and see if it doesn't make you want to bake some cookies. And, if it does, feel free to bring us some.

  • Kinect admits itself to hospital, treated for gesture control of medical images

    by Darren Murph
    12.21.2010

    At this point, we all have a serious question to ask: is there anything the Kinect can't do? While Microsoft has managed to move quite a few of the camera-laden tubes, a good number of 'em have been put to use in applications not named gaming. Take the Virtopsy Project, for instance. This particular setup uses the Kinect camera bar to control a PACS system (OsiriX, in this case), and it relies on software based on ofxKinect, libfreenect and openFrameworks. That's a lot of technobabble for sure, but as they say, the proof is in the YouTube video. Ever dreamed of swirling medical images around with hand gestures? Head on down and mash play -- fantasyland awaits.
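
    For tinkerers, the core loop is surprisingly small. Here's a rough Python sketch of depth-based swipe detection using libfreenect's Python bindings (part of the stack the project builds on); the thresholds and the image-flipping hookup are our own guesses, not the Virtopsy code:

        # Sketch: track the nearest blob in Kinect depth frames and treat
        # large horizontal travel as a swipe to flip between images.
        # Thresholds are assumptions, not the Virtopsy project's values.
        import numpy as np
        import freenect  # libfreenect's Python wrapper

        NEAR_RAW = 600   # raw 11-bit depth threshold (~0.7 m; assumed)
        SWIPE_PX = 150   # horizontal travel that counts as a swipe (assumed)

        def hand_x(depth):
            """Mean column of the nearest pixels, or None if nothing near."""
            near = depth < NEAR_RAW
            if near.sum() < 200:       # too few pixels: no hand in range
                return None
            return np.nonzero(near)[1].mean()

        def watch_for_swipes(on_prev, on_next, frames=300):
            start_x = None
            for _ in range(frames):
                depth, _ = freenect.sync_get_depth()
                x = hand_x(depth)
                if x is None:
                    start_x = None
                    continue
                if start_x is None:
                    start_x = x
                elif x - start_x > SWIPE_PX:
                    on_next(); start_x = None
                elif start_x - x > SWIPE_PX:
                    on_prev(); start_x = None

        if __name__ == "__main__":
            watch_for_swipes(lambda: print("previous image"),
                             lambda: print("next image"))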

  • Kinect finally fulfills its Minority Report destiny (video)

    by Vlad Savov
    12.09.2010

    Not to denigrate the numerous fine hacks that Kinect's undergone since its launch, but it's always nice to see the professionals come in and shake things up a little. A crew from MIT's brain labs has put together a hand detection system on Microsoft's ultra-versatile cam, which is sophisticated enough to recognize the position of both your palms and fingers. Just as a demonstration, they've tied that good stuff up to a little picture-scrolling UI, and you won't be surprised to hear that it's the closest thing to Minority Report's interactive gesture-based interface that we've seen yet. And it's all achieved with a freaking console peripheral. Video after the break.
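
    One classic way to get from a hand blob to finger positions -- and we're guessing here, since this isn't MIT's published code -- is to take the hand contour's convex hull and count the deep "valleys" between hull points. A minimal OpenCV sketch, with made-up thresholds:

        # Sketch: segment the hand, take its convex hull, and count deep
        # convexity defects (valleys between fingers). A standard CV recipe,
        # not MIT's implementation; thresholds are assumptions. OpenCV 4.x.
        import cv2
        import numpy as np

        def count_fingers(mask):
            """Estimate extended fingers from a binary hand mask (0/255)."""
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                return 0
            hand = max(contours, key=cv2.contourArea)  # largest blob = hand
            hull = cv2.convexHull(hand, returnPoints=False)
            if hull is None or len(hull) < 4:
                return 0
            defects = cv2.convexityDefects(hand, hull)
            if defects is None:
                return 0
            # Each deep valley between hull points separates two fingers;
            # defect depth is stored in fixed point (1/256 px).
            deep = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 30)
            return deep + 1 if deep else 0

        if __name__ == "__main__":
            # Synthetic "hand": a palm disc with two finger-like bars.
            img = np.zeros((240, 320), np.uint8)
            cv2.circle(img, (160, 170), 50, 255, -1)
            cv2.rectangle(img, (130, 60), (150, 170), 255, -1)
            cv2.rectangle(img, (175, 60), (195, 170), 255, -1)
            print(count_fingers(img))  # -> 2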

  • Sony fires barrage of touchscreen patent applications, only one points at new PSP

    by Sean Hollister
    11.28.2010

    Lawyers for Sony Computer Entertainment America must have been mighty busy last October, hatching the wild scheme that came to light this week -- a series of eight intertwining patent applications all describing a single device with an intriguing touchscreen interface. Though it's hard to tell what form the final device might take -- the applications suggest sliders, clamshells and slates -- a few distinct ideas bubble to the surface, and we'll knock them out one by one. First, the inventors seem to be rather particular about having a touchpad that's separate from the main screen -- perhaps even on its back, like the rumored PSP2 -- and Sony's trying to patent a way to manipulate objects through the screen as well. Second, there's a lot of mumbo-jumbo about being able to "enhance" or "transform" the user interface in response to different forms of input, which seems to boil down to this: Sony's trying to get some multitouch up in there, especially pinch-to-zoom. Last but not least, the company's looking to cordon off a section of touchscreen buttons, including a 'paste' command, and patent a "prediction engine" that would dynamically change the onscreen layout based on your past behavior. If most of these ideas sound more at home in a new tablet computer than a gaming handheld, then great minds think alike. Still, SCEA is Sony's gaming division -- forlorn Linux computing aside -- so consider us stumped for now.
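
    That "prediction engine" claim is the most concrete of the bunch, so here's a back-of-the-napkin Python sketch of the idea -- a recency-weighted tally that reorders a strip of touchscreen buttons. The scoring is entirely our assumption; the filings don't spell out an algorithm:

        # Sketch: promote the commands you use most to the visible slots.
        # Recency-weighted frequency is our assumption, not Sony's method.
        from collections import defaultdict

        class PredictiveLayout:
            def __init__(self, commands, slots=4, decay=0.9):
                self.commands = list(commands)
                self.slots = slots
                self.decay = decay            # older presses count for less
                self.score = defaultdict(float)

            def record(self, command):
                """Call on every press; decay old scores, boost this one."""
                for c in self.score:
                    self.score[c] *= self.decay
                self.score[command] += 1.0

            def layout(self):
                """Commands for the visible slots, best-scoring first."""
                ranked = sorted(self.commands, key=lambda c: -self.score[c])
                return ranked[: self.slots]

        if __name__ == "__main__":
            ui = PredictiveLayout(["copy", "paste", "cut",
                                   "undo", "select", "find"])
            for cmd in ["paste", "paste", "undo", "paste", "copy"]:
                ui.record(cmd)
            print(ui.layout())  # ['paste', 'copy', 'undo', ...]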

  • Kinect hacks let you control a web browser and Windows 7 using only The Force (updated)

    by Thomas Ricker
    11.25.2010

    Hacking the Xbox 360 Kinect is all about baby steps on the way to what could ultimately amount to some pretty useful homebrew. Here's a good example cooked up by some kids at the MIT Media Lab Fluid Interfaces Group attempting to redefine the human-machine interactive experience. DepthJS is a system that makes JavaScript talk to Microsoft's Kinect in order to navigate web pages, among other things. Remember, it's not that making wild, arm-waving gestures is the best way to navigate a web site -- it's just a demonstration that you can. Let's hope the hacking community picks up the work and evolves it into a multitouch remote control plugin for our home theater PCs. Boxee, maybe you can lend a hand?

    Update: If you're willing to step outside the developer-friendly borders of open-source software, you'll want to check out Evoluce's gesture solution based on the company's Multitouch Input Management (MIM) driver for Kinect. The most impressive part is its support for simultaneous multitouch and multiuser control of applications (including those using Flash and Java) running on a Windows 7 PC. Evoluce promises to release software "soon" to bridge Kinect and Windows 7. Until then, be sure to check both of the impressive videos after the break. [Thanks, Leakcim13]

  • Harmonix details design process behind Dance Central's Kinect UI

    by Mike Schramm
    11.08.2010

    Creating a functional user interface through Kinect can't be an easy task -- how do you make waving your arms around mean something, after all? Harmonix was confronted with exactly this problem in developing Dance Central, and judging by reviews, the developer figured it out just fine. So what's the secret? The team had to actually teach players how to gesture, using visual and even aural feedback in the game when players got it right. Developer Ryan Challinor told the Montreal International Games Summit this week that in testing, player swipes varied widely in both speed and position, so the final solution was to make the players react to the game, rather than programming the game to react to the players. A few different solutions were prototyped along the way, including a cursor that was dragged around the screen and planted in "notches" to choose the game's songs or options. The team also tried to get players to interact in 3D space, either "pushing" on virtual buttons or "grabbing" and pulling a scroll wheel around. In the end, Challinor said, simply iterating on these ideas was the key to solving the Kinect conundrum: the team implemented idea after idea, then polished the one that worked best. Of course, not every development team responsible for the first batch of Kinect games took this approach. At least one studio seemingly went with its first idea, and then made it as raw as possible.
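
    Challinor's point about reacting to players translates neatly into code: judge a swipe by relative travel within a short window, not by absolute position or speed. A minimal Python sketch of that idea -- the window length and travel threshold are our assumptions, not Harmonix's numbers:

        # Sketch: a swipe fires when the hand travels far enough in one
        # direction within a short window, wherever and however fast the
        # player moves. Parameters are assumptions.
        class SwipeDetector:
            def __init__(self, min_travel=0.25, window=0.8):
                self.min_travel = min_travel  # fraction of body width
                self.window = window          # seconds of history kept
                self.samples = []

            def update(self, t, x):
                """Feed (time, hand x); returns 'left', 'right' or None."""
                self.samples = [(ts, xs) for ts, xs in self.samples
                                if t - ts <= self.window] + [(t, x)]
                xs = [s[1] for s in self.samples]
                travel = xs[-1] - xs[0]
                monotonic = (all(b >= a for a, b in zip(xs, xs[1:]))
                             or all(b <= a for a, b in zip(xs, xs[1:])))
                if monotonic and abs(travel) >= self.min_travel:
                    self.samples = []     # consume the gesture
                    return "right" if travel > 0 else "left"
                return None

        if __name__ == "__main__":
            det = SwipeDetector()
            # A slow, drifting swipe and a fast flick both register.
            for t, x in [(0.0, 0.10), (0.2, 0.18), (0.4, 0.27), (0.6, 0.38)]:
                hit = det.update(t, x)
            print(hit)  # -> right
            for t, x in [(1.0, 0.90), (1.1, 0.55)]:
                hit = det.update(t, x)
            print(hit)  # -> left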

  • Microsoft buys Canesta, continues camera-based domination of our interfaces

    by Sean Hollister
    10.30.2010

    It seems that Microsoft's taken the camera to heart following its dismissal of the pen -- the company bought 3DV, collaborated with PrimeSense on Kinect, and today it's apparently finalized a deal to acquire 3D CMOS camera chipmaker Canesta as well. In case you've already forgotten, the latter company is the one that made a paid actor look particularly smug last year by allowing the gent to control his television with a flick of the wrist. Things have progressed a good bit further than that, however, as you'll see in a demo video after the break, and Canesta president and CEO Jim Spare says he expects the company's products to "see wide adoption across many applications that embody the full potential of the technology" under Microsoft's reign. Press release after the break.

  • Ubuntu prototype uses face recognition to intelligently move UI elements (video)

    by Darren Murph
    09.20.2010

    Not that we haven't seen mock-ups before for systems using webcams to intelligently move user interface elements, but it's another thing entirely for a company to publicly proclaim that it's tinkering with something of the sort for a future build of its OS. Over at the Canonical design blog, one Christian Giordano has revealed that the company is in the early stages of creating new ways to interact with Ubuntu, primarily by using proximity and orientation sensors to have one's PC react based on how they're sitting, where they're sitting and where their eyes and head are. For instance, once a user fires up a video and leans back, said video would automatically go into fullscreen mode. Similarly, if a user walked away to grab some coffee and a notification appeared, that notification would be displayed at fullscreen so that he / she could read it from afar. There's no mention just yet of when the company plans to actually bring these ideas to end-users, but the video embedded after the break makes us long for "sooner" rather than "later."
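
    The proximity trick doesn't need exotic sensors, either -- the apparent size of a face in a webcam frame is a serviceable distance cue. Here's a minimal OpenCV sketch of the idea; the thresholds and mode names are ours, not Canonical's:

        # Sketch: infer near/far from face width in the webcam frame and
        # pick a UI mode. Thresholds and mode names are assumptions.
        import cv2

        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        def ui_mode(frame):
            """'close' / 'leaning-back' / 'across-the-room' from face size."""
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = cascade.detectMultiScale(gray, 1.2, 5)
            if len(faces) == 0:
                return "across-the-room"  # no face: show big notifications
            w = max(fw for (_, _, fw, _) in faces)
            frac = w / frame.shape[1]     # face width as fraction of frame
            if frac > 0.25:
                return "close"            # normal, dense UI
            elif frac > 0.12:
                return "leaning-back"     # e.g. promote video to fullscreen
            return "across-the-room"

        if __name__ == "__main__":
            cap = cv2.VideoCapture(0)
            ok, frame = cap.read()
            if ok:
                print(ui_mode(frame))
            cap.release()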

  • Nokia's Plug and Touch turns your HDTV into a giant N8 (video)

    by Vlad Savov
    09.17.2010

    What do you get when you combine the N8's HDMI output, its 12 megapixel camera, and your trusty old TV set? As Anssi Vanjoki might say, you get a big new smartphone. Nokia's research labs have thrown up a neat little "prototype" app called Plug and Touch, which enhances the N8's already famed HDTV friendliness with the ability to recognize touch input. This is done by positioning your aluminum-clad Nokia about five feet away from the display and letting its camera pick up your hand's gestures and touches, essentially resulting in a massively enlarged Symbian^3 handset. Naturally, it's not terribly precise at this stage and there are no plans for an actual release, but it sure is a tantalizing glimpse of what may be coming down the pipe. Video after the break.

  • TI and XTR team up on touchless gesturing system for mobile devices

    by Donald Melanson
    09.15.2010

    We've seen a few examples of touchless, gesture-based interfaces for mobile devices, but it looks like Texas Instruments might be closer than most to making them a reality -- it's just announced a partnership with Extreme Reality (also known as XTR) on a new gesture engine and framework specifically designed for its OMAP 4 platform. The two companies actually showed off such a system back at MWC earlier this year (check out a demo of it after the break), but they've only just now made the partnership official, and they're promising plenty more advancements to come -- including the ability to recognize not only simple gestures, but even things like whole-body movements and two-handed gestures. Head on past the break for the complete press release.

  • Multitouch DJ table lets you swipe to rock

    by Donald Melanson
    08.09.2010

    We just recently got a glimpse of one possible future of DJing, but our world has already been turned upside down once again by this multitouch-enabled rig built by Gregory Kaufman. The big difference with this one, as you can probably guess, is that it employs a gesture-based interface that lets you spin the virtual turntables and use a variety of taps and finger swipes to replicate the main functions of a regular DJ deck. What's more, Kaufman says the only gear a DJ would have to carry is a USB drive with their own music and settings, which they'd simply plug into the multitouch table at a club -- assuming the idea catches on, that is. To top things off, the system would also be able to accommodate regular DJ gear for some added flexibility, and even provide enough room for two DJs if you're looking to battle or share the stage. Head on past the break to check it out in action.
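
    The virtual-turntable part boils down to tracking the touch point's angle around the platter center and turning angular velocity into playback rate. A minimal Python sketch, with the geometry entirely our own assumption rather than anything from Kaufman's rig:

        # Sketch: dragging a finger in circles "spins" the record; angular
        # velocity relative to 33 1/3 RPM gives the playback rate.
        import math

        class VirtualPlatter:
            def __init__(self, center, rpm=33.3):
                self.cx, self.cy = center
                self.base_rate = rpm / 60.0 * 360.0  # platter deg/sec
                self.prev = None                     # (t, angle) last touch

            def _angle(self, x, y):
                return math.degrees(math.atan2(y - self.cy, x - self.cx))

            def on_touch(self, t, x, y):
                """Return playback rate (1.0 = normal) while dragging."""
                a = self._angle(x, y)
                if self.prev is None:
                    self.prev = (t, a)
                    return 0.0               # holding the record stops it
                t0, a0 = self.prev
                delta = (a - a0 + 180.0) % 360.0 - 180.0  # shortest arc
                self.prev = (t, a)
                dps = delta / max(t - t0, 1e-6)           # deg/sec
                return dps / self.base_rate               # scratch rate

            def on_release(self):
                self.prev = None             # record spins normally again

        if __name__ == "__main__":
            p = VirtualPlatter(center=(0, 0))
            p.on_touch(0.00, 1.0, 0.0)
            print(round(p.on_touch(0.10, 0.8, 0.6), 2))  # fast forward spin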

  • Microsoft's Kinect patent application goes public, reveals gobs of fine print

    by Darren Murph
    08.05.2010

    Not like it matters much now -- after all, Project Natal has had time to incubate and come out as Kinect -- but those wondering about the inner workings of the motion sensing system now have one more outlet to investigate. A patent application filed on February 23, 2009 was just made public today, describing a "gesture keyboarding" scenario where users make gestures that are caught by a "depth camera" and then converted into in-game controls. Of course, those paying close attention could've read between the lines when we toyed with a camouflaged PrimeSense demo at GDC, but here's the full skinny in black and white. And a bit of blue, if you count the buttons. Give that source link a tap once you're settled in.

  • Hitachi shows off new gesture-based interface, touts grand plans

    by Donald Melanson
    07.29.2010

    Hitachi's already dipped its toes (or hands, as it were) into the gesture-based waters before, but it looks to have refined things quite a bit for its latest Minority Report-esque demo, which the company is showing off as part of its 100th anniversary celebration. While complete details are a bit light, the system does seem to be reasonably responsive, and appears to rely on a projection-based system and a single camera to track movements. Perhaps what's most interesting, however, is that Hitachi eventually sees systems like this being used in everything from digital signage to medical applications -- and, yes, even TVs and desktop computers (though not before mid-2011 at the earliest). Head on past the break to check it out in action.