gesture

Latest

  • At last, phones will get ultrasound gesture control in first half of 2015

    by Richard Lai
    10.06.2014

    We've been following Elliptic Labs' development of ultrasound gesture control for quite a while, but no time frame was ever given until now. Ahead of CEATEC in Tokyo, the company finally announced that its input technology -- developed in partnership with Murata -- will be arriving on phones in the first half of 2015. But that's not the only good news: On top of the usual swiping gestures for images, games and navigation (we saw some of this last year), there's now a new capability called "multi layer interaction," which uses your hand's proximity to toggle different actions or layers. It's potentially useful for glancing at different types of messages on the lock screen, as demoed in the video after the break.
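
    For a sense of how "multi layer interaction" might behave in practice, here's a minimal sketch that maps a hand-proximity reading to lock-screen layers. The distance thresholds and layer names are our own illustrative assumptions, not Elliptic Labs' actual API.

    ```java
    // A minimal sketch of proximity-driven "layers": the closer the hand,
    // the more detail the lock screen reveals. All thresholds are invented.
    public class MultiLayerDemo {
        enum Layer { NOTIFICATION_ICONS, MESSAGE_PREVIEW, FULL_MESSAGE }

        static Layer layerFor(double distanceCm) {
            if (distanceCm > 20.0) return Layer.NOTIFICATION_ICONS; // hand far away: icons only
            if (distanceCm > 8.0)  return Layer.MESSAGE_PREVIEW;    // closer: sender and subject
            return Layer.FULL_MESSAGE;                              // nearly touching: full text
        }

        public static void main(String[] args) {
            double[] samples = {35.0, 15.0, 5.0}; // simulated ultrasound proximity readings
            for (double d : samples) {
                System.out.printf("hand at %.0f cm -> show %s%n", d, layerFor(d));
            }
        }
    }
    ```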

  • Touch+ turns any surface into a gesture controller for your PC

    by Jon Fingas
    08.05.2014

    Haptix (now Ractiv) promised a cheap sensor that would turn any surface into a multi-touch PC controller, and it's making good on its word today by shipping the device worldwide. The $75 add-on has received a new Touch+ name and a sleek redesign, but it otherwise uses the same basic concepts we saw when the project got its crowdfunding a year ago. Put the device on a desktop stand or a laptop and its dual cameras turn your finger movements into gestures; you can play a first-person shooter by swiping across your keyboard, or sketch in Photoshop using your desk as a drawing tablet. It's definitely not for everyone, but it might be up your alley if you'd rather not reach out to a touchscreen (or use Leap Motion's in-air tracking) just to get more advanced input than a run-of-the-mill mouse or trackpad.

  • Samsung shows a smartwatch concept you control by waving your hands

    by Jon Fingas
    05.26.2014

    Touchscreens on smartwatches are limited by their very nature; there's only so much you can fit on a tiny piece of glass. However, Samsung might overcome that surface area limit if it ever implements a recently published patent application. Its smartwatch concept would let you perform relatively complex tasks just by waving your hands in front of a built-in camera. You could send content to a TV just by flicking your hand, for example, or select something by pointing your finger.

  • This bracelet lets you flick your wrist to pay with Bitcoin

    by Jon Fingas
    04.22.2014

    Sometimes, there's such a thing as being too forward-thinking. Take MEVU and its prototype payment bracelet as an example: the wrist-worn Bluetooth wallet lets you pay with Bitcoin using only air gestures. Sounds cool, right? In many ways, it is. As the company shows in its demo video, you can flick your wrist to cover parking or donate to charity without ever reaching for your wallet (or your phone, for that matter).

  • The story of iOS's text selection tool and the importance of those "handles"

    by John-Michael Bond
    02.17.2014

    On February 11, 2014, the United States Patent and Trademark Office published Apple's patent for "selecting of text using gestures." Filed in 2008, the patent covers a feature that arrived on iOS devices with iOS 3 in June 2009; you probably know it as "the thing that lets you highlight text on your iPhone." This tiny addition to iOS had a powerful impact on bridging the gap between smartphones and personal computers, and opened up the possibilities of working on your phone. To celebrate the release of the patent details, the folks at Cult of Mac reached out to Apple's former user interface designer, Bas Ording, for an interview about the development of this important feature. The company knew text selection needed to find its way to iOS. Apple's old MessagePad personal planner used a stylus to cut and paste text, but according to Ording, Steve Jobs refused to consider requiring a stylus on the iPhone. That left developers to figure out how to let users select text on a tiny screen using nothing but grubby fingers. The breakthrough came with the concept of adding movable "handles" at both ends of the text, which let users make precise selections without having to cover up the words they're looking for. Ording points to this design as a major factor in the success of the text selection tool: "Some people called them 'lollypop sticks' and we played around with a bunch of different ideas [regarding] how best to do them. It started out with much larger visible handles. We ended up making them smaller and smaller, until they were just dots. You see, it turns out your fingers are actually pretty precise. If there's a grain of sand on your desk, you can easily target it despite it being a tiny dot. On-screen the dots are small, but in software the invisible active area is much larger, so they're easy to grab." Cult of Mac's interview goes on to discuss what working with Jobs on early iPhone development was like, and includes a few interesting tidbits about early iPad prototypes used for iOS development. You can read the complete story here. If you're interested in taking a look at Apple's patent for "Selecting of text using gestures," head over to the USPTO and gaze upon the blueprint for a tool we all probably take for granted.
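
    Ording's "small dot, large invisible hit area" trick is easy to illustrate. Here's a minimal sketch of a selection handle whose grabbable area is much bigger than the dot it draws; the pixel sizes are illustrative assumptions, not Apple's actual values.

    ```java
    // A minimal sketch of the idea Ording describes: draw a tiny dot,
    // but accept touches in a much larger invisible region around it.
    public class SelectionHandle {
        static final int VISIBLE_RADIUS = 5;  // the dot the user actually sees
        static final int ACTIVE_RADIUS = 22;  // the invisible, grabbable area

        final int x, y; // handle position at one end of the selected text
        SelectionHandle(int x, int y) { this.x = x; this.y = y; }

        // A touch grabs the handle anywhere inside the active area, even
        // though that area is several times larger than the drawn dot.
        boolean hitTest(int touchX, int touchY) {
            int dx = touchX - x, dy = touchY - y;
            return dx * dx + dy * dy <= ACTIVE_RADIUS * ACTIVE_RADIUS;
        }

        public static void main(String[] args) {
            SelectionHandle start = new SelectionHandle(100, 200);
            System.out.println(start.hitTest(103, 202)); // on the visible dot: true
            System.out.println(start.hitTest(118, 210)); // off the dot, inside the active area: true
            System.out.println(start.hitTest(150, 250)); // well away: false
        }
    }
    ```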

  • Thalmic Labs' Myo armband does gesture control with muscles (video)

    by Richard Lai
    01.08.2014

    2013 saw the rise of gesture cameras for TVs and various smart devices, but Canadian startup Thalmic Labs thinks its Myo armband is the way forward. During our meeting at CES earlier, co-founder and CEO Stephen Lake explained that his Bluetooth 4.0 device features a new type of biosensor, which can pick up minute electrical impulses in our arm muscles. This allows any wrist movement, finger twitch or fist clench to be interpreted as a gesture, so long as the inner side of the Myo has skin contact. There's also an accelerometer, a gyroscope and a magnetometer, so arm and body movements are accounted for as well. The idea of Myo traces back to the co-founders' university days, when they explored various wearable technologies while working on a navigation aid for the blind. Lake said that since brain control isn't quite there yet, his team found muscle sensing to be the next best thing. From what we saw and tried today, Thalmic Labs seems to be on the right track: We watched co-founder Aaron Grant play Call of Duty: Ghosts using just a pair of Myos, and he was able to make his avatar run, crouch, jump, fire his weapon and reload. Lake also demoed music playback control and slideshow presentation on an iPad, both of which worked just fine. But it doesn't stop there; the CEO also sees opportunities in industrial robotics, space applications and even gesture-based authentication. The retail version of the Myo will arrive within the first half of 2014, and not only will it be half as thick as the Myo Alphas shown today, but it'll also feature at least two core applications that make full use of the armband. Lake said he'll be showing the final design in the next couple of months, but if you're game, you can now head over to Thalmic Labs' website to pre-order a black or white one for $149. Need more convincing? Then check out our in-depth demo video after the break. Update: We also got to see how you can fly a Parrot AR.Drone 2.0 with a Myo! Check it out.
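
    Thalmic hasn't detailed how its classifier actually works, but as a rough illustration of the principle, here's a sketch that buckets smoothed per-channel muscle (EMG) activation into gestures. The channel layout, thresholds and gesture set are all invented for the example.

    ```java
    // A rough sketch of threshold-based EMG gesture classification; not
    // Thalmic Labs' method, just the general muscle-sensing idea.
    public class EmgGestureSketch {
        enum Gesture { REST, FIST, WAVE_IN, WAVE_OUT }

        // 'emg' holds smoothed activation per forearm sensor channel, 0..1.
        static Gesture classify(double[] emg) {
            double total = 0;
            for (double v : emg) total += v;
            if (total < 0.5) return Gesture.REST;                   // no significant activity
            if (emg[0] > 0.6 && emg[4] > 0.6) return Gesture.FIST;  // broad co-activation
            return emg[0] > emg[4] ? Gesture.WAVE_IN : Gesture.WAVE_OUT; // flexor vs. extensor side
        }

        public static void main(String[] args) {
            System.out.println(classify(new double[] {0.0, 0.1, 0.0, 0.0, 0.1, 0.0})); // REST
            System.out.println(classify(new double[] {0.8, 0.7, 0.6, 0.5, 0.7, 0.6})); // FIST
            System.out.println(classify(new double[] {0.7, 0.4, 0.1, 0.1, 0.2, 0.1})); // WAVE_IN
        }
    }
    ```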

  • OnTheGo Platforms is bringing gesture recognition to Google Glass apps (video)

    by Alexis Santos
    01.08.2014

    Google Glass can hold its own when it comes to voice recognition and touch, but its current software doesn't account for gesture controls. OnTheGo Platforms, however, is looking to fix that. The folks at the Portland, Ore.-based company are baking up an SDK for developers to integrate gesture recognition in apps made for Glass and other Android-based smart glasses, such as the Vuzix M100. We went hands-on with a demo photo-snapping and gallery app to put the software through its paces. In its current form, the solution recognizes swipes from the left and right, a closed fist and an open hand. A fist aimed at Glass' camera will fire off a countdown for a snapshot or take you to the app's home, depending on the current screen. Waving a hand in either direction cycles through pictures in the gallery. This editor was tempted to swipe his hand across the camera's view quickly, but the software is tuned to pick up slower, more deliberate motions about a foot or so away. The detection was often hit or miss, but the developers say they're in the process of refining the recognition and that they've recently eliminated many false positives.
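
    The demo app's behavior boils down to mapping a recognized gesture plus the current screen to an action. Here's a minimal sketch of that dispatch logic, with names of our own invention rather than OnTheGo's SDK.

    ```java
    // A minimal sketch of gesture dispatch as the demo app is described:
    // the same gesture maps to different actions depending on the screen.
    public class GestureDispatchSketch {
        enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, FIST, OPEN_HAND }
        enum Screen { CAMERA, GALLERY }

        static String actionFor(Screen screen, Gesture gesture) {
            if (gesture == Gesture.FIST) {
                // A fist starts the snapshot countdown on the camera screen,
                // or returns to the app's home from anywhere else.
                return screen == Screen.CAMERA ? "start snapshot countdown" : "go to app home";
            }
            if (screen == Screen.GALLERY && gesture == Gesture.SWIPE_RIGHT) return "next picture";
            if (screen == Screen.GALLERY && gesture == Gesture.SWIPE_LEFT)  return "previous picture";
            return "ignored";
        }

        public static void main(String[] args) {
            System.out.println(actionFor(Screen.CAMERA, Gesture.FIST));
            System.out.println(actionFor(Screen.GALLERY, Gesture.FIST));
            System.out.println(actionFor(Screen.GALLERY, Gesture.SWIPE_LEFT));
        }
    }
    ```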

  • Tobii and SteelSeries team up to launch eye-tracking game controller

    by Matt Brian
    01.03.2014

    After spending the better part of a year fine-tuning its technology for Windows 8 machines, eye-tracking specialist Tobii is looking to conquer a new market: gaming. With CES just days away, the Swedish company announced today that it has partnered with gaming accessory maker SteelSeries to launch what both companies call "the world's first mass-market consumer eye-tracking device for gamers." SteelSeries doesn't have anything to show us just yet, but tells us that its new gaming gear will let players signal their intent, aim and express emotions inside supported games. In the meantime, we hope to catch up with Tobii when it takes to the CES floor with its EyeX Controller, giving us insight into what its partner has in store when it launches its first eye-tracking gaming products in mid-2014.

  • Samsung's 2014 smart TVs will let you control videos by pointing your finger

    by Jon Fingas
    12.22.2013

    Samsung's 2014 smart TV lineup may revolve around impressive-looking hardware, but the Korean tech giant has revealed that interface improvements will also play an important role. Its new TVs will support finger gestures that should be more intuitive than the whole-hand commands of this year's models; you can stop a movie with a spinning motion, for instance. Voice control will also be more powerful. It's at last possible to change channels or launch apps with a single step, and search results appear in one place. While the gesture and voice upgrades may not be revolutions, they'll likely be welcome to viewers frustrated with unwieldy TV software.

  • Oculus Rift-based virtual reality game could help restore 3D vision (video)

    by Jon Fingas
    11.25.2013

    Many will tell you that video games are bad for your eyes, but James Blaha doesn't buy that theory. He's developing a crowdfunded virtual reality title, Diplopia, that could help restore 3D vision. The Breakout variant trains those with crossed-eye problems to coordinate their eyes by manipulating contrast; players score well when their brain merges two images into a complete scene. Regular gameplay could noticeably improve eyesight for adults who previously had little hope of recovering their depth perception, Blaha says. The potential solution is relatively cheap, too -- gamers use an Oculus Rift as their display, and they can add a Leap Motion controller for a hands-free experience. If you're eager to help out, you can pledge $20 to get Diplopia, and $400 will bundle the app with an Oculus Rift headset. Check out a video demo of the therapeutic game after the break.
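
    The training approach, as we understand it, is dichoptic: the game shows higher contrast to the weak eye and lower contrast to the strong one, so the brain has to use both eyes to make sense of the scene. A minimal sketch of that contrast split follows, with entirely illustrative numbers rather than Diplopia's actual tuning.

    ```java
    // A minimal sketch of the dichoptic contrast idea: favor the weak eye,
    // and raise the strong eye's contrast as the player's fusion improves.
    public class DichopticSketch {
        // Returns {weakEyeContrast, strongEyeContrast}, each in 0..1.
        // mergeScore in 0..1: how reliably the player fuses the two images.
        static double[] contrastFor(double mergeScore) {
            double strong = 0.2 + 0.8 * mergeScore; // rebalance toward equal contrast over time
            return new double[] {1.0, Math.min(1.0, strong)};
        }

        public static void main(String[] args) {
            for (double score : new double[] {0.0, 0.5, 1.0}) {
                double[] c = contrastFor(score);
                System.out.printf("fusion %.1f -> weak eye %.2f, strong eye %.2f%n", score, c[0], c[1]);
            }
        }
    }
    ```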

  • YouTube app arrives in time for Xbox One launch

    by David Hinkle
    11.20.2013

    YouTube will be available on Xbox One at the console's launch this Friday, November 22. You'll be able to download the app for free just as soon as you get that day one update out of the way. The YouTube app on Xbox One will fully incorporate gesture and voice commands. You can use motion to scroll through pages and select videos, while the green phrases in the image above are voice shortcuts -- simply say "YouTube" followed by the appropriate phrase for the preferred action. The Xbox One will be available in 13 different markets on Friday. Feel free to check out the Xbox One launch event page for our latest coverage and be sure to read our in-progress Xbox One review for our thoughts on Microsoft's next-generation console.

  • You can't wake the PlayStation 4 with voice commands, but you can shut it off

    by Ben Gilbert
    11.11.2013

    Unlike the Xbox One, whose Kinect voice commands control everything from turning on the console to searching the online store, the PlayStation 4's camera / mic array can only interpret a handful of relatively basic voice commands. The console can't be "woken" from standby mode by voice, for instance, nor can you command a video playing on Netflix to pause. In fact, at launch, none of the third-party apps on the PlayStation 4 will allow for voice commands -- something Sony reps tell us they "hope" more apps will integrate in the future. The console can be turned off using voice, and you can command it to open games. At Sony's big PS4 review event in New York City this week, few of the commands were demoed, and zero gesture commands were shown. Outside of facial recognition for logging in, it looks like Sony's next game console isn't focusing too much on competing with Microsoft point by point in terms of camera / mic-based input. Of course, reps also told us that more functionality will be added over time. And hey, considering that the camera may actually be included in the retail box down the line, it sounds to us like you're probably safe holding off on grabbing one this Friday -- unless you really, really want to play The Playroom. Frankly, we respect that.

  • The future of motion interfaces: Wave goodbye to the mouse

    by Sean Buckley
    11.10.2013

    We're still big fans of Douglas Engelbart's original pointing device, but human/computer input is moving past traditional peripherals. We're rapidly approaching a future of touchscreens, motion sensors and visual imaging control solutions. "Gone are the days, probably, of the keyboard, mouse and maybe even touch input," Samsung's Shoneel Kolhatkar told us. During a panel on the future of gesture and motion controls at Expand NY, Kolhatkar suggested that these technologies could fade away within the next 20 years. His fellow panelists, Pelican Imaging's Paul Gallagher and Leap Motion's Avinash Dabir, agreed that there's more to the future of computing than the traditional point and click.

  • Google gesture patent would let Glass wearers 'heart' real-world objects

    by Jon Fingas
    10.15.2013

    As it stands, Google Glass doesn't have a simple way of cataloging real-world items -- you have to snap a picture and make a note afterward. It may get much easier if Google implements a newly granted US patent, however. The technique uses a wearable display's camera to detect hand gestures made in front of objects. Make a heart shape and you'll "like" what's in front of you; frame something with your fingers and you'll select it. There's no certainty that Glass will ever support these commands, but they're certainly intuitive. If nothing else, they could lead to a new, very literal take on Google Goggles.

  • ARM and eyeSight optimize gesture control for mobile processors

    by Jon Fingas
    10.14.2013

    Hands-free gesture control is no longer a novelty in the mobile space, but the required processing power limits what's currently possible. More sophisticated input may be close at hand, however, as eyeSight has just teamed up with ARM to optimize its gesture control for chips using Mali-T600 series graphics. By relying on the T600's general-purpose computing engine, eyeSight's software can now track 3D movement, facial expressions and finger gestures without a huge performance hit. While companies will have to build supporting apps and devices before we see eyeSight's technology in use, it could lead to Kinect-like control of phones and smart TVs using relatively ordinary silicon.

  • Surface 2 Touch Cover supports gesture control, comes in more colors

    by Joseph Volpe
    10.08.2013

    Microsoft's Surface 2 is just two weeks away from hitting retail, so in the lead-up to the Windows 8.1 tab's launch, the company's released a 'Making of...' video to whet consumers' appetites. As spotted by The Verge, this latest Surface 2 video focuses mainly on the innovations made to the new Touch Cover. Aside from expanding the range of colors it'll be available in (green and orange among them), Microsoft's revealed that this new Touch Cover will support gestures. That's thanks to a new sensor array, which favors a high-resolution matrix over the pressure-sensitive sensors of the last-gen model. Just what those gestures may be, we can't say for sure, as Microsoft hasn't detailed them. But if current reports are any indication, you should be able to trigger the Charm menu by swiping from the right on your Touch Cover's keyboard. Is any of this enough to sway your (credit card-holding) hand and get you to sign up for a Surface 2? Let us know in the comments below and be sure to check out the promo video after the break.

  • Google applies for patent on gesture-based car controls

    by Jon Fingas
    10.03.2013

    So far, in-car gestures aren't useful for much more than raging at the driver who just cut you off. Google wants these gestures to be more productive, however, and has applied for a patent that uses hand motion to control the car itself. Its proposed system relies on both a ceiling-mounted depth camera and a laser scanner to trigger actions based on an occupant's hand positions and movements. Swipe near the window and you'll roll it down; point to the radio and you'll turn the volume up. While there's no guarantee that we'll see the technology in a car, the USPTO published the patent filing just a day after Google acquired a motion control company. If nothing else, the concept of a Google-powered, gesture-controlled car isn't as far-fetched as it used to be.
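
    The filing's core idea reduces to pairing where the hand is with what it's doing. Here's a minimal sketch of that mapping, using the window and radio examples above; the region, motion and action names are ours, not Google's.

    ```java
    // A minimal sketch of the patent's concept: an action fires only when a
    // hand region (from the depth camera) and a motion combine as expected.
    public class CarGestureSketch {
        enum Region { NEAR_WINDOW, NEAR_RADIO, ELSEWHERE }
        enum Motion { SWIPE_DOWN, POINT, NONE }

        static String actionFor(Region region, Motion motion) {
            if (region == Region.NEAR_WINDOW && motion == Motion.SWIPE_DOWN) return "roll window down";
            if (region == Region.NEAR_RADIO && motion == Motion.POINT)       return "volume up";
            return "no action";
        }

        public static void main(String[] args) {
            System.out.println(actionFor(Region.NEAR_WINDOW, Motion.SWIPE_DOWN));
            System.out.println(actionFor(Region.NEAR_RADIO, Motion.POINT));
            System.out.println(actionFor(Region.ELSEWHERE, Motion.POINT)); // pointing at nothing
        }
    }
    ```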

  • Elliptic Labs releases ultrasound gesturing SDK for Android, will soon integrate into smartphones

    by Darren Murph
    10.01.2013

    Elliptic Labs has already spruced up a number of tablets by adding the ability to gesture instead of making contact with a touch panel, and starting this week, it'll bring a similar brand of wizardry to Android. The 20-member team is demoing a prototype here at CEATEC in Japan, showcasing the benefits of its ultrasound gesturing technology over the conventional camera-based magic that already ships in smartphones far and wide. In a nutshell, you need one or two inexpensive (under $1 a pop) chips from Murata baked into the phone; from there, Elliptic Labs' software handles the rest. It allows users to gesture in various directions with multiple hands without having to keep their hands in front of the camera... or atop the phone at all, actually. (To be clear, that box around the phone is only there for the demo; consumer-friendly versions will have the hardware bolted right onto the PCB within.) The goal here is to make it easy for consumers to flip through slideshows and craft a new high score in Fruit Ninja without having to grease up their display. Company representatives told us that existing prototypes were already operating at sub-100ms latency; for a bit of perspective, most touchscreens can only claim ~120ms response times. The company is hoping to get its tech integrated into future phones from the major Android players (you can bet that Samsung, LG, HTC and the whole lot have at least heard the pitch), and while it won't ever be added to existing phones, devs with games that could benefit from a newfangled kind of gesturing can look for an Android SDK to land in the very near future. Mat Smith contributed to this report.
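
    Since the SDK hasn't shipped yet, here's a guess at what integration might look like from an app's point of view: register a listener, react to swipe-direction events. Every name in this sketch is hypothetical; Elliptic Labs' actual Android SDK may look nothing like it.

    ```java
    // A hypothetical listener-style API for ultrasound gestures. The class
    // and method names are invented for illustration only.
    public class UltrasoundSdkSketch {
        enum Direction { LEFT, RIGHT, UP, DOWN }

        interface GestureListener { void onGesture(Direction d); }

        // Stand-in for the SDK object that would wrap the Murata transducers.
        static class UltrasoundGestures {
            private GestureListener listener;
            void setListener(GestureListener l) { listener = l; }
            void simulate(Direction d) { if (listener != null) listener.onGesture(d); } // test hook
        }

        public static void main(String[] args) {
            UltrasoundGestures sdk = new UltrasoundGestures();
            sdk.setListener(d -> System.out.println("advance slideshow: " + d));
            sdk.simulate(Direction.LEFT);  // hand swipes left above the phone
            sdk.simulate(Direction.RIGHT);
        }
    }
    ```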

  • Homebrew Kinect app steers Chromecast streams through gestures (update: source code)

    by Jon Fingas
    08.18.2013

    Chromecast may deliver on promises of sending wire-free video to TVs, but it's not hands-free -- or at least, it wasn't. Leon Nicholls has unveiled a homemade Kinect app for the desktop that gives him gesture-based control of videos playing through Google's streaming stick. While there are just two commands at this point, Nicholls hopes to open-source the code in the near future; this isn't the end of the road. If you can't wait that long, though, there's a quick demonstration available after the break. Update: A few days later, Nicholls has posted the source code for his project; you'll need to whitelist your Chromecast for development to use it.

  • Google updates Gesture Search, now recognizes over 40 languages

    by Myriam Joire
    06.13.2013

    Gesture lovers and polyglots rejoice! Yesterday, Google updated Gesture Search for Android phones and tablets, making it compatible with even more languages. The app provides quick access to music, contacts, applications, settings and bookmarks -- to name a few -- by letting users simply draw characters on the screen. It now recognizes over 40 languages and even handles transliteration, which comes in handy in Chinese, for example, where some native characters require more strokes than their Latin equivalents. Gesture Search started life as a Google Labs project back in March 2010 and has received several tweaks over the years, including tablet support last fall. So go ahead: download the latest version from the Play Store and swipe away.