gestures

Latest

  • Microsoft Research toys with the cosmos... using forefinger and thumb (video)

    by Sean Hollister
    05.31.2010

    We've always been suckers for Minority Report tech, and Microsoft Research's latest attempt is not to be missed. Thought pinch-to-zoom was quaint? Try pinching the sky in this geodesic dome. Though the cardboard-and-paper-clip structure isn't all that (unless you're the arts and crafts type), the inside houses a projectiondesign DLP unit with a custom infrared camera system that can turn simple hand gestures into virtual interstellar travel, 360-degree video teleconferencing and more. You'll find a pair of videos demonstrating the concept after the break, but try not to get too attached -- if you're anything like us, your poor heart can't handle another Courier axing.

  • Apple patents multitouch gestures

    by Mike Schramm
    04.30.2010

    Patently Apple reports that Apple has picked up a major patent from the USPTO for a long list of multitouch gestures. The gestures all involve moving two or more fingers on a multitouch surface, and cover everything from cut and copy shortcuts to global search and find-and-replace motions. I'm sure creative types could probably come up with at least a few other ways to move your fingers on a surface, but this one's pretty far-reaching. PA also notes that almost all of these gestures came to Apple as part of its FingerWorks purchase -- while (as far as I know) not all of them eventually made it to the iPad, we can probably expect to see them show up in the future. Apple was also granted a patent for adjusting the tempo of music played by an iPod, possibly even according to some body metric (like putting a strap on your arm to measure your pulse using Nike+), one for antenna structure, and a few different patents dealing with digital files and folders. As always, the fact that Apple is patenting these ideas doesn't mean we'll see them for sure in future software or hardware. But watching these patents is a good way to keep an eye on what's coming out of Apple's R&D departments.

  • Inertial scrolling should be possible on all multi-touch trackpads

    by Chris Rawson
    04.24.2010

    A new feature called "inertial scrolling" has been introduced in the latest MacBook Pros. It changes the way scrolling works in OS X, making it behave more like the iPhone. Traditionally, when you use two-finger scrolling in OS X, scrolling stops dead as soon as your fingers stop moving. On the iPhone, however, there's a certain "momentum" to scrolling that depends entirely on how quickly you flick your finger; slow scrolling motions have almost no momentum to them at all, while fast flicks keep the screen scrolling long after your finger has left the tracking surface, possibly all the way to the top or bottom of whatever you're scrolling through in a matter of seconds. Many people prefer the way scrolling behaves on the iPhone compared to the Mac, so Apple has added it as an optional behavior on the newest MacBook Pros. Since the multi-touch trackpads on the MacBook, MacBook Air, and MacBook Pro use essentially the same multi-touch hardware as the iPhone, it's been possible to bring this same scrolling behavior into OS X -- yet for now it only works on the newest MacBook Pros. I suspected there wasn't any reason this behavior couldn't be implemented on the older multi-touch trackpads, so I spent most of the morning investigating how to get it working on my Early 2008 MacBook Pro (the first model with a multi-touch trackpad). Read on to find out what I discovered.
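
    For the curious, here's a rough Python sketch of how momentum scrolling like this could work under the hood -- the friction constant, frame rate and cutoff speed below are our own illustrative assumptions, not anything from Apple's implementation.

    ```python
    # Minimal sketch of inertial ("momentum") scrolling: after the finger lifts,
    # scrolling continues at the lift-off velocity and decays with friction.
    # All names and constants here are illustrative assumptions, not Apple's code.

    def inertial_scroll(position, flick_velocity, friction=0.95, dt=1 / 60, min_speed=5.0):
        """Yield successive scroll positions after the finger leaves the trackpad.

        position        -- scroll offset (pixels) at lift-off
        flick_velocity  -- velocity (pixels/second) at lift-off
        friction        -- per-frame decay factor (closer to 1.0 = longer glide)
        dt              -- simulation timestep (seconds), e.g. one 60 Hz frame
        min_speed       -- stop once the speed falls below this threshold
        """
        velocity = flick_velocity
        while abs(velocity) > min_speed:
            position += velocity * dt
            velocity *= friction          # exponential decay stands in for friction
            yield position

    # A fast flick coasts much farther than a slow one:
    fast = list(inertial_scroll(0.0, 3000.0))
    slow = list(inertial_scroll(0.0, 200.0))
    print(round(fast[-1]), "px of coasting after a fast flick")
    print(round(slow[-1]), "px of coasting after a slow flick")
    ```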

  • Synaptics extends multitouch Gesture Suite to Linux, Chrome OS included

    by Vlad Savov
    04.20.2010

    Well, it had to happen at some point. After eons of watching Mac OS and Windows users swiping away nonchalantly on their touchpads, Linux laptop buyers can now also join the multitouch fray. Synaptics has announced official Gesture Suite support for a wide range of Linux-based OS flavors -- Fedora, Ubuntu, RedFlag, SuSE, and Xandros get name-dropped, while future support for Chrome OS is promised -- which will all benefit from its set of multi-fingered touch and swipe responses. The infamous pinch-to-zoom is quite naturally included in the Suite, which will come bundled with new installations of those operating systems. We're not seeing any mention of a downloadable update as yet, but we imagine that'll be corrected in due course, whether by the company itself or the resourceful Linux community. Full PR after the break.

  • iPad apps: defining experiences from the first wave

    by Sean Hollister
    04.02.2010

    There are now over 1,348 approved apps for the iPad. That's on top of the 150,000 iPad-compatible iPhone programs already available in the App Store. When Apple's tablet PC launches, just hours from now, it will have a software library greater than that of any handheld in history -- not counting the occasional UMPC. That said, the vast majority of even those 1,348 iPad apps are not original. They were designed for the iPhone, a device with a comparatively pokey processor and a tiny screen, and most have just been tweaked slightly, upped in price and given an "HD" suffix -- as if that somehow justified the increased cost. Besides, we've seen the amazing potential programs have on iPhone, Android, BlackBerry, Windows Mobile and webOS when given access to a touchscreen, always-on data connection, GPS, cloud storage and WiFi -- but where are the apps that truly define iPad? What will take advantage of its extra headroom, new UI paradigms and multitouch real estate? Caught between netbook and smartphone, what does the iPad do that the iPhone cannot? After spending hours digging through the web and the new iPad section of the App Store, we believe we have a number of reasonably compelling answers. Update: Now includes Wormhole Remote, TweetDeck, SkyGrid, Touchgrind HD, GoToMeeting, SplitBrowser, iDisplay, Geometry Wars and Drawing Pad.

  • Tromso students put together the best interactive display wall we've seen yet (video)

    by Vlad Savov
    03.24.2010

    Take everything you thought you knew about multitouch and throw it out. Okay, keep the Minority Report stuff, but throw everything else out. What we're looking at here is a 22 megapixel display, stitched together from the output of no fewer than 28 projectors (7,168 x 3,072 total resolution), which just happens to respond to touch-like input in a fashion even Tom Cruise would find fascinating. You don't even have to touch the wall: floor-mounted cameras pick up your gestures in 2D space, and a 30-node computer setup crunches all the computational and visual data to deliver some buttery-smooth user interaction. For demo purposes, the makers of this system grabbed a 13.3 gigapixel image of Tromso and took it for a hand-controlled spin. See the mesmerizing show on video after the break.
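
    To give a feel for how a wall like this might be organized, here's a toy Python sketch of splitting one shared pan/zoom state across a 7 x 4 grid of 1,024 x 768 projector tiles (which matches the 7,168 x 3,072 total). The function and parameter names are our own illustration, not the Tromso team's code.

    ```python
    # Sketch of how a 28-projector wall (7 x 4 grid of 1024 x 768 tiles, 7168 x 3072
    # total) might split one shared pan/zoom state into per-node viewports.
    # This illustrates the general architecture only, not the actual system.

    TILE_W, TILE_H = 1024, 768
    COLS, ROWS = 7, 4  # 28 projectors, one render node each

    def tile_viewport(col, row, pan_x, pan_y, zoom):
        """Return the (x, y, width, height) region of the source image that the
        projector at (col, row) should render, given a global pan offset in image
        pixels and a zoom factor (image pixels per screen pixel)."""
        x = pan_x + col * TILE_W * zoom
        y = pan_y + row * TILE_H * zoom
        return (x, y, TILE_W * zoom, TILE_H * zoom)

    # Each node only needs the shared (pan_x, pan_y, zoom) triple -- broadcast
    # whenever a gesture updates it -- to compute its own slice of the image.
    # The node driving the projector at column 3, row 1, for example:
    print(tile_viewport(3, 1, pan_x=5000, pan_y=2000, zoom=0.5))
    ```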

  • AiLive shows off its LiveMove 2 software for building MotionPlus and PlayStation Move gestures

    by Paul Miller
    03.19.2010

    If you've been following closely, there are really two sorts of input available to the PlayStation Move. The one that gets the most love and screen time is the camera-based, 3D meatspace tracking that the PlayStation Eye performs in conjunction with the fancy colored ball at the end of the PlayStation Move wand, but most of the actual gameplay we've seen is in truth much more similar to the Wii's MotionPlus than Sony might want to let on. The MotionPlus and PS Move have very similar configurations of gyroscopes and accelerometers, and actually use the same software from AiLive (co-creators of MotionPlus) for developing the gesture recognition that goes into games. We actually got to see the LiveMove 2 development environment in action, and it's pretty impressive: basically, you tell a computer what gesture you want to perform (like "fist pump," for instance) and then perform a bunch of examples of that movement. LiveMove then figures out the range of allowable movement, and in playback mode shows you whether you're hitting the mark. AiLive showed us gestures as complicated as Graffiti-style (of Palm OS yore) handwriting recognition in the air, built with just a few example movements from people back at their offices. So, this is great news for developers dealing with the significant complication of all these sensors, but at the same time we can't help but be a little disappointed. LiveMove 2 doesn't even use the PlayStation Eye, and as we mentioned in our hands-on impressions of PlayStation Move, we could really sense that a lot of our in-game actions were built from predefined gestures, not us interacting with the 3D environment in any "real" or physics-based way. It's great tech either way, but hopefully that's something that can be improved upon by launch or soon after. Check out a demo of LiveMove in action after the break.
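
    To make the "perform a bunch of examples" workflow concrete, here's a toy Python sketch of example-based gesture training: record several traces, learn a mean trace plus a tolerance band, then check a new attempt against it. The resampling and slack factor are our own simplifications, not AiLive's LiveMove algorithm.

    ```python
    # Toy example-based gesture training in the spirit of what the article
    # describes: several example traces define an "allowable range", and a new
    # attempt either stays inside that band or doesn't. Not AiLive's method.
    import numpy as np

    def resample(trace, n=32):
        """Linearly resample a (samples x axes) motion trace to n samples."""
        trace = np.asarray(trace, dtype=float)
        old = np.linspace(0.0, 1.0, len(trace))
        new = np.linspace(0.0, 1.0, n)
        return np.column_stack([np.interp(new, old, trace[:, a]) for a in range(trace.shape[1])])

    def train(examples, slack=3.0):
        """Build a template (mean trace) plus a tolerance band from example traces."""
        stack = np.stack([resample(e) for e in examples])
        mean = stack.mean(axis=0)
        tol = stack.std(axis=0) * slack + 1e-6   # widen the band by a slack factor
        return mean, tol

    def matches(template, attempt):
        """True if the attempt stays inside the learned tolerance band."""
        mean, tol = template
        return bool(np.all(np.abs(resample(attempt) - mean) <= tol))

    # Train a "fist pump" from noisy copies of an up-down accelerometer trace:
    rng = np.random.default_rng(0)
    base = np.column_stack([np.zeros(50), np.sin(np.linspace(0, np.pi, 50))])
    examples = [base + rng.normal(0, 0.05, base.shape) for _ in range(10)]
    template = train(examples)
    print(matches(template, base))            # a clean attempt passes
    print(matches(template, base[:, ::-1]))   # a sideways motion fails
    ```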

  • German student shows off camera-based input on an iPhone

    by Donald Melanson
    03.12.2010

    Using a camera as an input device is hardly a new idea -- even on a mobile device -- but most examples so far have been to enable functionality not possible on a touchscreen. As Master's student Daniel Bierwirth has shown in the video after the break, however, a camera on a phone can also be used as an alternative input method for features like scrolling or zooming, potentially allowing for easier interaction on devices with smaller screens. Bierwirth also takes the idea a step further, envisioning a second, wearable camera that could detect when your hands are near the phone and allow for a range of other gestures. Check out his full report at the link below.
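
    As a rough illustration of camera-as-controller scrolling, here's a Python sketch that estimates how far the scene shifted vertically between two frames and turns that into a scroll delta. The row-profile matching and the gain factor are our own assumptions, not Bierwirth's method.

    ```python
    # Sketch of using a phone's camera as a scroll controller: estimate how much
    # the scene shifted vertically between two frames and map that to a scroll
    # delta. A simplified illustration, not Daniel Bierwirth's system.
    import numpy as np

    def vertical_shift(prev_frame, next_frame, max_shift=20):
        """Estimate vertical motion (in pixels) between two grayscale frames by
        comparing their row-brightness profiles at a range of candidate offsets."""
        prev_profile = prev_frame.mean(axis=1)
        next_profile = next_frame.mean(axis=1)
        best_shift, best_err = 0, float("inf")
        for s in range(-max_shift, max_shift + 1):
            a = prev_profile[max(0, s):len(prev_profile) + min(0, s)]
            b = next_profile[max(0, -s):len(next_profile) + min(0, -s)]
            err = np.mean((a - b) ** 2)
            if err < best_err:
                best_shift, best_err = s, err
        return best_shift

    # Fake a camera frame and a copy shifted down by 7 rows:
    rng = np.random.default_rng(0)
    frame = rng.random((120, 160))
    shifted = np.roll(frame, 7, axis=0)
    # Recovers the 7-row shift (negative = scene moved down in this convention);
    # a scroll handler would multiply this by a gain to get a scroll delta.
    print(vertical_shift(frame, shifted))
    ```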

  • Google Gesture Search puts your contacts just a stroke away

    by Chris Ziegler
    03.04.2010

    If you look under the hood, Google's been beefing up Android with support for gestures that developers can take advantage of, and the power and flexibility of that capability is now being demoed by none other than... well, Google, of course. Gesture Search -- currently billed as a Google Labs project -- lets you draw letters on the screen to reach contacts and other content on your phone, an especially nice shortcut for those who like to avoid the on-screen keyboard as much as possible. As it learns what you tend to search for, the quality of the searches improves, meaning you need to swipe less to get to frequently-accessed items. It needs Android 2.0 to do its thing, but if you've got a so-equipped phone, it's available now from the US-localized Market.
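
    For a sense of how drawing a letter can be turned into a lookup, here's a toy Python sketch that resamples a drawn stroke, normalizes away position and size, and picks the closest letter template. The templates and matching rule are our own stand-ins, not Google's implementation.

    ```python
    # Conceptual sketch of matching a drawn stroke against letter templates, in
    # the spirit of "draw a letter to filter results". The templates and the
    # nearest-template rule here are toy assumptions, not Google's code.
    import numpy as np

    def normalize(stroke, n=32):
        """Resample a stroke to n points by arc length, then translate/scale it
        into a unit box so position and size don't matter."""
        pts = np.asarray(stroke, dtype=float)
        seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        dist = np.concatenate([[0.0], np.cumsum(seg)])
        even = np.linspace(0.0, dist[-1], n)
        resampled = np.column_stack([np.interp(even, dist, pts[:, 0]),
                                     np.interp(even, dist, pts[:, 1])])
        resampled -= resampled.mean(axis=0)
        scale = np.ptp(resampled, axis=0).max()
        return resampled / (scale if scale else 1.0)

    def best_letter(stroke, templates):
        """Return the template letter whose normalized shape is closest on average."""
        probe = normalize(stroke)
        return min(templates,
                   key=lambda k: np.mean(np.linalg.norm(probe - normalize(templates[k]), axis=1)))

    # Toy templates: 'L' is down-then-right, 'C' is a left-opening arc.
    templates = {
        "L": [(0, 0), (0, 2), (1, 2)],
        "C": [(1, 0), (0, 0.3), (0, 1.7), (1, 2)],
    }
    drawn = [(5, 5), (5, 9), (7, 9)]      # a larger, offset 'L'
    print(best_letter(drawn, templates))  # -> 'L'
    ```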

  • Battle of Puppets brings the fight to the marionette stage

    by Mike Schramm
    02.24.2010

    Battle of Puppets fell through the cracks here when it was first released, and that was unfortunate -- it's a 2D real-time strategy game for the iPhone with a distinctive art style and a plethora of cool features that definitely deserves a look. The team has been updating the app often, and the latest update not only adds in achievements and leaderboards via the OpenFeint platform, but also has dropped the price to just 99 cents for a limited time. The game has a surprising amount of strategic depth -- more than you'd expect for an iPhone title. There are five different armies (each one representing an "opera" theme), each with its own strengths and weaknesses, and the little unit battles play out over 22 different theaters, where various environmental changes can actually affect the strategy of the game. And in addition to commanding your puppet troops, you can "cast spells" on the screen with gesture input, which will also affect the flow of battle. It's almost too much to handle on the iPhone's small screen, but the developers have added in a tutorial that should help you figure it out. At 99 cents, Battle of Puppets is a steal. More casual game players might get a little dragged down by the complexity, but those looking for strategy will find it underneath the cutout art designs. And if they continue to update the app at the rate they have so far, there'll be even more coming in the future.

  • Texas Instruments introduces ARM-based OMAP 4 SOC, Blaze development platform

    by Vlad Savov
    02.15.2010

    Texas Instruments has just made its OMAP 4 system-on-chip official, and garnished the announcement with the first development platform for it, aggressively titled Blaze. We already caught a glimpse of it in prototype form earlier this month, and the thing is quite a whopper -- you can see it on video after the break, and we doubt you'll accuse TI of placing form before function with this one. The company's focus will be on promoting innovative new modes of interaction, with touchless gesturing (or "in the air" gesture recognition) figuring strongly in its vision of the future. Looking at the SOC diagram (available after the break), you'll find that its grunt will be provided by the same ARM Cortex-A9 MPCore class of CPU that powers the iPad, though TI claims it will be the only mobile platform capable of outputting stereoscopic 720p video at 30fps per channel. Perhaps its uniqueness will come from the fact that nobody else cares for the overkill that is 3D-HD on a mobile phone, whether it requires glasses or not. It'll still be fascinating to see if anybody picks up the chunky Blaze idea and tries to produce a viable mobile device out of it -- we could be convinced we need multiple displays while on the move; we're just not particularly hot on the '90s-style bezel overflow.

  • Apple granted patent for touch-sensitive bezel

    by Mel Martin
    02.03.2010

    The Patently Apple website is reporting that Apple has been granted patents dealing with tablets and advanced touch technology. The first patent concerns an 'intelligent bezel' where a user could control volume, brightness, zoom, or even game controls by sliding a finger along the edge of the device. A second patent covers recognizing and tracking multiple fingers and palms as hands approach, touch, and slide across a multi-touch surface. Taken together, the patents hint that Apple is working on some very futuristic hardware and software platforms that go beyond the simple touch screens Apple offers now. The current iPad has a very wide bezel with no touch functions, but it's easy to imagine how a future tablet could incorporate the new features. Now, if they can just get a camera in there... [Via MacRumors]
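
    As a quick illustration of the "slide a finger along the edge" idea, here's a toy Python sketch that maps travel along a bezel strip to a volume change. The sensitivity value and clamping are our own assumptions; the patent doesn't spell out an implementation like this.

    ```python
    # Toy sketch of the "intelligent bezel" idea: map a finger's travel along a
    # touch-sensitive bezel strip to a volume level, relative to where the slide
    # began. Purely illustrative; not from the patent text.

    def bezel_volume(start_mm, current_mm, start_volume, sensitivity=0.5):
        """Return a new volume (0-100) after sliding a finger along the bezel.

        start_mm / current_mm -- finger position along the strip when the slide
                                 began and now, in millimetres
        start_volume          -- volume when the slide began
        sensitivity           -- volume steps per millimetre of travel
        """
        delta = (current_mm - start_mm) * sensitivity
        return max(0, min(100, start_volume + delta))

    print(bezel_volume(60.0, 100.0, 35))   # slide up 40 mm -> volume 55
    print(bezel_volume(60.0, 10.0, 35))    # slide down 50 mm -> volume 10
    ```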

  • Apple patent filings outline input device gestures, solar iPods and iPhones

    by Steve Sande
    01.23.2010

    Apple's reputation as an innovator doesn't seem to be waning at all. Two recent patent applications published Thursday on the U.S. Patent and Trademark Office's Web site show that Apple is dreaming of new gestures using input devices and solar-powered iPods and iPhones. These applications are just a few of a recent parade of patents we've been following. While these are just filings for patent protection and not actual products, Thursday's "Methods and Apparatus for Processing Combinations of Kinematical Inputs" filing is very intriguing in light of the upcoming announcements. As described in the filing, "Some embodiments of the present invention therefore enable a user to provide a series of gestures as input to the receiving device. Such gestures may include, for example, brushing motions, scooping motions, nudges, tilt and slides, and tilt and taps. The application can then respond to each gesture (or gesture combination) in any number of ways." Hmmm... using a mouse on a "receiving device"? That could be an interesting way to perform tasks on a tablet device. We've heard some rumors about the tablet that describe new gestures that may take a bit of getting used to, and perhaps some of them are meant to be used with a mouse in the manner described in the filing. It's also apparent that Apple is quite interested in making devices with virtually unlimited battery life. Another filing unveiled on Thursday, titled "Power Management Circuitry and Solar Cells," describes power management circuitry allowing portable devices like the iPod and iPhone to operate on solar power. The application details how both solar and battery power sources can be used to power the devices, using switches to reconfigure solar cells on the fly so that the device receives a constant voltage even when some cells are "shadowed" by a hand. How about it, TUAW readers? Are you ready for a solar-powered tablet you can tap, brush, and nudge with a solar mouse? It could happen some day, although these are patentable ideas and not actual products. [via MacRumors]
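
    To picture what a "tilt and tap" might look like in code, here's a toy Python sketch that waits for a sustained tilt and then a sharp acceleration spike. The thresholds and the two-stage rule are our own illustrative assumptions, not anything from the filing.

    ```python
    # Toy sketch of spotting a "tilt and tap" from accelerometer samples, one of
    # the gesture families named in the filing. The thresholds and two-stage rule
    # are illustrative assumptions, not Apple's patent text.

    def detect_tilt_and_tap(samples, tilt_threshold=0.4, tap_threshold=2.5):
        """Return True if a sustained tilt is followed by a sharp tap spike.

        samples -- list of (tilt_x, tilt_y, accel_magnitude) tuples over time,
                   tilt in g along each axis, accel_magnitude in g
        """
        tilted = False
        for tilt_x, tilt_y, accel in samples:
            if abs(tilt_x) > tilt_threshold or abs(tilt_y) > tilt_threshold:
                tilted = True                      # stage 1: the device is held tilted
            elif not tilted:
                continue
            if tilted and accel > tap_threshold:   # stage 2: a tap spike while tilted
                return True
        return False

    steady = [(0.05, 0.0, 1.0)] * 20
    tilt_then_tap = steady + [(0.6, 0.0, 1.0)] * 10 + [(0.6, 0.0, 3.1)]
    print(detect_tilt_and_tap(steady))         # False
    print(detect_tilt_and_tap(tilt_then_tap))  # True
    ```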

  • Apple patent application two-fer: new gesture inputs, solar-powered iPods?

    by Donald Melanson
    01.23.2010

    We know it's been tough with the dearth of Apple-related speculation as of late, but it looks like we now finally have a few more clues about what the company might be up to courtesy of a pair of recently published patent applications. The first of those is a new type of gesture-based input device, which would not only be able to detect swipes and other simple gestures, but things like brushing or scooping motions that take into account force and velocity (check out an example after the break). The other patent treads a bit of familiar territory for Apple, and describes a built-in solar power system for electronic devices -- such as an iPod, as illustrated above. That could apparently include solar cells covering the entire device, which could be configured to function even if they're partly obstructed by your hand. The system would also supposedly be able to detect if the battery is completely drained and rely solely on the solar cells to power up the device or, alternatively, switch the solar cells to a "second operational state" if it detects that the battery is charged -- if it ever actually exists, that is.

  • Editorial: Google's multitouch dilemma

    by Nilay Patel
    01.19.2010

    As anyone who's seen the last Engadget Show knows, we were incredibly lucky to have Google's Erick Tseng as our guest. Erick is the product manager for Android at Google, and he's one of the sharpest, brightest, funniest guys around -- it was great having him on the show, and I sincerely hope we see a lot more of him as time goes by. It's obvious that Android is in capable hands. Of course, the problem with having someone as funny, sharp, and bright as Erick on the show is that they tend to come in extremely well-prepared, and Erick was no exception -- he'd read the many comments where you all asked for solid answers regarding the state of multitouch gestures on Android, and he had his answers ready and polished to a high shine. Like we've been hearing for months now, Erick told us that Android now supports the recognition of multiple touch inputs -- the basic definition of "multitouch" -- and that the real issue is actually how multitouch is implemented. It was a fascinating exchange that I encourage you to watch, but here's the main quote: "When people say 'why doesn't Android have multitouch?' it's not a question of 'multitouch'... I want to reframe the question. We have multitouch -- what people are asking for is specific implementations in the UI that use multitouch, like pinch-to-zoom, or chording on the keyboard." That's a solid, respectable answer, and it was delivered with confidence, poise, and charm. There's just one problem: it's not actually an answer, because the semantics don't matter. No matter how you look at it, the lack of "specific multitouch implementations" is still a huge issue with Android -- one that's become a growing distraction.
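
    To show why the distinction is mostly semantic, here's a toy Python sketch of pinch-to-zoom built on nothing more than the raw touch points a multitouch-capable platform already reports -- the gesture is just the ratio of finger spreads. The function below is our own illustration, not Android code.

    ```python
    # Pinch-to-zoom as a thin layer over raw multitouch: once the platform
    # reports two tracked touch points, the "gesture" is just the ratio of the
    # distance between them now versus when the gesture began. Illustrative only.
    import math

    def pinch_scale(start_points, current_points):
        """Return a zoom factor from two fingers' start and current positions.

        Each argument is a pair of (x, y) touch points, as raw multitouch input
        would deliver them; values > 1 mean the fingers spread apart.
        """
        def distance(points):
            (x1, y1), (x2, y2) = points
            return math.hypot(x2 - x1, y2 - y1)
        return distance(current_points) / distance(start_points)

    # Fingers start 100 px apart and spread to 250 px apart -> zoom in 2.5x
    print(pinch_scale([(100, 200), (200, 200)], [(50, 200), (300, 200)]))
    ```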

  • Former Apple engineer: FingerWorks may be a part of the tablet

    by Mike Schramm
    01.11.2010

    The New York Times is the latest big source of tablet rumors today, and they went way back for the latest stab in the dark. All the way back to 2005, when Apple purchased a little company called FingerWorks, known for its work with gesture recognition on multitouch interfaces like the TouchStream keyboard. The Gray Lady says they spoke with "former Apple engineers" who have worked on the tablet itself, and those guys claim that it makes use of gesture recognition to operate: "for example, three fingers down and rotate could mean 'open an application.'" Another former employee told them that Apple's been putting together a multitouch version of iWork for years, and that the tablet is actually a full-featured Mac, not just an e-reader or larger iPod touch. Of course, we don't know how long ago these Apple engineers worked for the company -- given that the App Store has vastly changed things over there in just the last year and a half, the tablet itself could have changed its focus in that same period of time. At this point, given all of the things you can do with a multitouch screen, finger gestures are probably the least impressive. But then again, Apple's been obsessed with multitouch for a while, so it's not a stretch to think they might include some of this FingerWorks technology in a larger multitouch screen. Wait and see, wait and see. Update: MacRumors now notes that some content on FingerWorks' website has been pulled despite being online since Apple made the purchase in 2005. Very interesting. [via MacRumors]

  • Samsung's WiFi-enabled CL80 adds touch of AMOLED to 14MP sensor

    by Vlad Savov
    01.06.2010

    Samsung's AMOLED obsession continues unabated as it has just taken the official wraps off the 3.7-inch touchscreen-equipped CL80. Already well detailed in an earlier leak, this shooter crams a 7x optical zoom lens plus WiFi and Bluetooth antennae inside one of those unreasonably thin enclosures that are all the rage these days. It's the twenty-teens now, so naturally you get a jumbo 14.2 megapixel sensor with optical image stabilization as well as a 720p movie mode. Coming out this spring, the CL80 has already garnered a 2010 CES Innovation Award, but if you can settle for a 3.5-inch conventional LCD and do without the wireless options, Samsung will sell you the otherwise identical TL240 at a presumably more affordable price point this February. Dive past the break for the full PR and specs.

  • 3D UI patent snapped up by Apple in 2008: could be bases-covering, could be life-changing

    by Paul Miller
    01.05.2010

    We've seen this done badly so many times that it's hard to imagine anyone as self-serious as Apple taking a crack at it (even if they've already done so in the desktop space), but for whatever reason the company picked up this 3D UI patent back in 2008 under the names of a few French employees. The patent was just published in December, and describes in some detail a method of zooming around in 3D using multitouch. Of course, the accompanying illustration seems to imply that it's for jumping through some representative icons on a 3D plane, but the patent seems more concerned with the core mechanics of using multiple fingers at once to get around in 3D space and manipulate 3D objects -- and then going to great lengths to cover Apple's back in regards to multitouch, capacitive touch, and "multifunction" devices. So, this could be something we see in "the tablet," the next iPhone or even never, but at least we can rest assured that pinch-to-zoom won't be the only multitouch game in town forever.
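
    For a rough sense of what multi-finger 3D navigation could look like, here's a toy Python sketch that maps a pinch to camera distance and a two-finger drag to orbiting. The gains and the mapping are our own guesses, not the mechanics described in the patent.

    ```python
    # Sketch of steering a 3D view with two fingers: pinching dollies the camera
    # in and out, while dragging both fingers orbits it. A conceptual illustration
    # only, not what Apple's patent actually specifies.
    import math

    def update_camera(distance, yaw_deg, pitch_deg, start_touches, current_touches,
                      orbit_gain=0.3):
        """Return new (distance, yaw, pitch) from two-finger start/current positions."""
        def spread(t):
            (x1, y1), (x2, y2) = t
            return math.hypot(x2 - x1, y2 - y1)
        def centroid(t):
            (x1, y1), (x2, y2) = t
            return ((x1 + x2) / 2, (y1 + y2) / 2)

        distance *= spread(start_touches) / spread(current_touches)   # pinch out -> move closer
        (sx, sy), (cx, cy) = centroid(start_touches), centroid(current_touches)
        yaw_deg += (cx - sx) * orbit_gain          # horizontal drag orbits around the object
        pitch_deg += (cy - sy) * orbit_gain        # vertical drag tilts the view
        return distance, yaw_deg, pitch_deg

    # Pinch out (fingers spread to twice the distance) while dragging right by 30 px:
    print(update_camera(10.0, 0.0, 20.0,
                        [(100, 300), (200, 300)],
                        [(80, 300), (280, 300)]))
    ```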

  • Microsoft Research patents controller-free computer input via EMG muscle sensors

    by Paul Miller
    01.03.2010

    We've seen plenty of far-fetched EMG-based input methods, like the concentration-demanding, head-based NeuroSky controller, but Microsoft Research is asking for a patent that involves much simpler gestures -- and might actually make a bit of sense. As demonstrated in the video after the break, Microsoft's connecting EMG sensors to arm muscles and then detecting finger gestures based on the muscle movement picked up by those sensors. It does away with the need for a pesky camera (or Power Glove) to read complicated hand gestures, and can even recognize modified versions of the gestures performed while your hands are full. Microsoft's developing wireless EMG sensor modules that could be placed all over the body, and while, like all Microsoft Research projects, this seems pretty far from market, there's a small, optimistic part of us that could see some of the benefits here for controlling mobile devices. And boy do we love controlling mobile devices.
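
    To sketch the kind of pipeline the patent implies, here's a toy Python example that reduces a window of EMG samples to one RMS "effort" value per sensor and picks the nearest trained gesture. The features and the nearest-centroid classifier are our own simplifications, not Microsoft's method.

    ```python
    # Toy EMG gesture classification: a short window of samples from a few forearm
    # sensors is reduced to per-channel RMS features and matched to the closest
    # trained gesture. Illustrative assumptions throughout, not Microsoft's code.
    import numpy as np

    def rms_features(window):
        """window: (samples x channels) EMG readings -> one RMS value per channel."""
        return np.sqrt(np.mean(np.square(np.asarray(window, dtype=float)), axis=0))

    def train_centroids(labelled_windows):
        """Average the feature vectors of each gesture's example windows."""
        centroids = {}
        for label, windows in labelled_windows.items():
            centroids[label] = np.mean([rms_features(w) for w in windows], axis=0)
        return centroids

    def classify(window, centroids):
        """Return the gesture label whose centroid is nearest in feature space."""
        feats = rms_features(window)
        return min(centroids, key=lambda label: np.linalg.norm(feats - centroids[label]))

    # Fake two-channel data: "pinch" works channel 0 hard, "spread" works channel 1.
    rng = np.random.default_rng(1)
    pinch = [rng.normal(0, [1.0, 0.2], size=(50, 2)) for _ in range(5)]
    spread = [rng.normal(0, [0.2, 1.0], size=(50, 2)) for _ in range(5)]
    centroids = train_centroids({"pinch": pinch, "spread": spread})
    print(classify(rng.normal(0, [1.0, 0.2], size=(50, 2)), centroids))  # -> pinch
    ```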

  • MIT gestural computing makes multitouch look old hat

    by Vlad Savov
    12.11.2009

    Ah, the MIT Media Lab, home to Big Bird's illegitimate progeny, augmented reality projects aplenty, and now three-dimensional gestural computing. The new bi-directional display being demoed by the Cambridge-based boffins performs both the multitouch functions we're familiar with and hand-movement recognition in the space in front of the screen -- which we're also familiar with, but mostly from the movies. The gestural motion tracking is done via embedded optical sensors behind the display, which get to see what you're doing because the LCD alternates rapidly (invisibly to the human eye, but probably not to human pedantry) between what it's displaying to the viewer and a pattern for the camera array. This differs from projects like Natal, which have the camera offset from the display and therefore cannot work at short distances, but if you want even more detail, you'll find it in the informative video after the break. [Thanks, Rohit]
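
    Here's a tiny Python sketch of the time-multiplexing idea: the panel alternates, faster than the eye can follow, between frames shown to the viewer and frames used for sensing. The refresh rate and the 50/50 split are our own illustrative numbers, not the MIT lab's.

    ```python
    # Sketch of time-multiplexing a display between "show the user the image" and
    # "show the sensing pattern for the cameras behind the LCD". Frame rates and
    # the frame split are illustrative assumptions, not the actual system's numbers.

    def frame_schedule(duration_s, refresh_hz=120, capture_every=2):
        """Yield (time_s, mode) pairs, interleaving 'display' and 'capture' frames.

        With refresh_hz=120 and capture_every=2, the viewer still effectively sees
        a 60 Hz image while half the frames are spent sensing hand positions.
        """
        frame_time = 1.0 / refresh_hz
        total_frames = int(duration_s * refresh_hz)
        for i in range(total_frames):
            mode = "capture" if i % capture_every == capture_every - 1 else "display"
            yield (i * frame_time, mode)

    for t, mode in frame_schedule(0.05):   # first 50 ms of the schedule
        print(f"{t * 1000:5.1f} ms  {mode}")
    ```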