Gesture control

Latest

  • HP's Leap Motion keyboard to be sold separately for $99

    by 
    Richard Lai
    06.06.2014

    The Leap Motion controller is currently present in three forms: a $74.99 standalone dongle, inside the special edition HP Envy 17 laptop and inside an HP keyboard. The dongle -- with almost half a million units sold since launch -- and the keyboard are obviously the only ways to add this hand motion sensor externally, but the latter option was limited to select HP computers to begin with. Well, not any more. At Computex, Leap Motion told Engadget that as of this month, you'll be able to purchase said keyboard for about $99, and it'll work on any Windows 7 or Windows 8 PC as long as you have the software installed -- be it the current version or the free V2 update with skeletal tracking coming this summer.

  • Connected car model gives us a glimpse of the automobiles of the future

    by 
    Mariella Moon
    06.06.2014

Your dream car is (choose one): A.) One with a bumper-to-bumper candy-color paint job, ostrich seats and 22-inch chrome rims. B.) One with everything connected cars can offer, such as in-car 4G LTE and WiFi. If you chose B, then you have to see the Automotive Parts Manufacturers Association's connected car demo. In addition to having an internet connection, it also features wireless charging capability, 360-degree proximity sensors, as well as (we're saving the best for last here) gesture control and anti-drunk driving technology. We assume the car has an onboard breathalyzer of some sort, as you wouldn't even be able to start it if you've had one too many.

  • Samsung shows a smartwatch concept you control by waving your hands

    by 
    Jon Fingas
    05.26.2014

    Touchscreens on smartwatches are limited by their very nature; there's only so much you can fit on a tiny piece of glass. However, Samsung might overcome that surface area limit if it ever implements a recently published patent application. Its smartwatch concept would let you perform relatively complex tasks just by waving your hands in front of a built-in camera. You could send content to a TV just by flicking your hand, for example, or select something by pointing your finger.

  • Mirama hopes to replace your smartphone with a headset, your camera button with a finger gesture

    by 
    Mat Smith
    03.25.2014

Mirama wants to do away with the camera button on your smartphone. In fact, don't even get the smartphone out to begin with. Its prototype headset lets you frame and take a picture using your hands and its built-in camera. No buttons, no vocal cues, not even a wink. Other gestures, registered by the camera sensor, add the ability to confirm or cancel (thumbs up and down, respectively), while you can even attempt a hand-written message ... if you have the right level of motor control. (Our own gestured penmanship during a quick demo proved we didn't.) The cameras on the left and right, meanwhile, offer a constant stream of what you'd be looking at if you weren't wearing the headset. Gestures (well, your hands) then appear in bright cyan on top of the video feed.

  • Lumus and eyeSight deal brings gesture control to DK-40 smart glasses hands-on

    by 
    James Trew
    02.25.2014

Imagine a pair of smart glasses that you didn't need to fondle just to dismiss notifications, or worse, speak out loud to like a crazy person. It's that exact thought that brought Lumus and eyeSight together. If you know each company's respective products, you don't have to imagine too hard what's going on, but if you don't? Basically, Lumus makes a Google Glass-like product that has a see-through display in one lens that shows notifications, calendar entries and so on. EyeSight makes software that allows gesture control through existing cameras (like the one in your laptop or phone), and has, for example, just penned a deal with Oppo that sees the software baked into its phones' native controls (wave to browse the gallery, etc.). The collaboration between Lumus and eyeSight was announced here at MWC, and we just swung by to take a look. The Lumus glasses we saw back at CES are unchanged, but this time around, you can dismiss email and social media notifications (for example) with a simple flick, or pull out reminders and calendar entries from the side with a swipe. We were mighty impressed with how well it worked, and how fluid and responsive the interaction was. Of course, this is only a simple demonstration of what is possible, but it's not hard to imagine more creative applications for it further down the line (games, media control, etc.). Of course, some might argue that waving around in the empty space in front of you is no less conspicuous than talking to yourself, but once you've tried it for yourself, we think you'll agree it's a much easier way to interact with a HUD than an out-of-sight touchpad or unreliable voice commands. Lumus still hasn't confirmed commercial plans for its glasses, so it could be a while before you decide for yourself. Steve Dent contributed to this report.

  • Exclusive: PMD's CamBoard Pico XS is the tiniest gesture camera we've ever seen (video)

    by 
    Richard Lai
    01.10.2014

Just as we were wrapping up CES today, we caught up with our friends over at PMD Technologies, who surprised us with a little exclusive. What you see above is the new CamBoard Pico XS gesture camera that's dwarfed by the Pico S -- the one we saw at Computex -- next to it. This tiny module is only 4mm thick, 39.5mm long and 15.5mm wide, making it 1.5mm thinner and almost half as long as its predecessor, while still packing the same Infineon IRS1010C 3D image sensor chip. Given the size, plus the fact that it already uses MIPI (Mobile Industry Processor Interface) instead of USB, the Pico XS is truly integration-ready for OEMs. The main changes that enabled this size reduction are the smaller lens -- which is compensated for by a sharper laser illumination (but still 850nm infrared) -- plus the removal of the aluminum heat sink (which is actually the chassis), courtesy of a much lower power consumption. Instead of the typical 1W you get on the Pico S, the Pico XS requires less than 50mW typically (at 25fps) and 350mW max (up to 45fps). Temperature-wise, it goes up by just 10 degrees Celsius at most, apparently. Despite the slightly reduced viewing angles, we've been told that this smaller depth camera offers the same performance as before. That certainly seems to be the case after this author tried it using PMD's Nimble UX middleware (co-developed by 3Gear Systems), which is able to do two-hand skeletal tracking down to finger level, as shown in our video after the break.
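The quoted power figures make the efficiency jump easy to quantify: at the same frame rate, energy per frame is just power divided by frames per second. A quick back-of-the-envelope check using the numbers above:

```python
# Per-frame energy budget of the two PMD modules, from the figures
# quoted above (1 W typical for the Pico S, <50 mW for the Pico XS,
# both at 25 fps).
def energy_per_frame_mj(power_mw, fps):
    """Energy drawn per captured depth frame, in millijoules."""
    return power_mw / fps

pico_s = energy_per_frame_mj(1000, 25)  # Pico S: ~1 W typical
pico_xs = energy_per_frame_mj(50, 25)   # Pico XS: <50 mW typical

print(f"Pico S:  {pico_s:.0f} mJ/frame")      # 40 mJ/frame
print(f"Pico XS: {pico_xs:.0f} mJ/frame")     # 2 mJ/frame
print(f"Reduction: {pico_s / pico_xs:.0f}x")  # 20x
```

A roughly 20-fold drop in energy per frame is what lets PMD ditch the aluminum heat sink entirely.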

  • When Parrot AR.Drone meets Myo armband, magic ensues (video)

    by 
    Richard Lai
    01.10.2014

Ah, Las Vegas, the perfect location for a spontaneous wedding. Earlier today, we witnessed a quick and dirty collaboration between Parrot and Thalmic Labs at CES, where they paired an AR.Drone 2.0 with a Myo gesture-control armband. The demonstrator was able to control the drone's tilt direction using just one arm, as well as toggle the rotors by snapping his fingers. This author gave it a go as well and found this control method to be as effortless as it looked, though due to the WiFi interference on the show floor (ugh, so typical of large events these days), the drone had to stay in close proximity to the iPad that was relaying the Myo's commands. There wasn't a way to adjust the vehicle's height in that particular demo, but there's no stopping Thalmic Labs from assigning additional gestures for that -- maybe a clenched fist to ascend and a spread hand to descend. Understandably, neither company could confirm whether they are bringing this feature to market, but we'd be very surprised if they don't sustain their marriage moving forward.

  • Thalmic Labs' Myo armband does gesture control with muscles (video)

    by 
    Richard Lai
    01.08.2014

2013 saw the rise of gesture cameras for TVs and various smart devices, but Canadian startup Thalmic Labs thinks its Myo armband is the way forward. During our meeting at CES earlier, co-founder and CEO Stephen Lake explained that his Bluetooth 4.0 device features a new type of biosensor, which can pick up minute electrical impulses in our arm muscles. This allows any wrist movement, finger twitch or fist clench to be interpreted as a gesture, so long as the inner side of the Myo has skin contact. There's also an accelerometer, a gyroscope and a magnetometer, so arm and body movements are accounted for as well. The idea of Myo traces back to the co-founders' university days, when they explored various wearable technologies while working on a navigation aid for the blind. Lake said since brain control isn't quite there yet, his team found muscle sensing to be the next best thing. From what we saw and tried today, Thalmic Labs seems to be on the right track: We watched co-founder Aaron Grant play Call Of Duty: Ghosts using just a pair of Myos, and he was able to make his avatar run, crouch, jump, fire his weapon and reload. Lake also gave a demo on music playback control and slideshow presentation on an iPad, both of which worked just fine. But it doesn't stop there; the CEO also sees opportunity in industrial robotics, space applications and even gesture-based authentication. The retail version of the Myo will arrive within the first half of 2014, and not only will it be half as thick as the Myo Alphas shown today, but it'll also feature at least two core applications that will make full use of the armband. Lake said he'll be showing the final design in the next couple of months, but if you're game, you can now head over to Thalmic Labs' website to pre-order a black or white one for $149. Need more convincing? Then check out our in-depth demo video after the break. Update: We also got to see how you can fly a Parrot AR.Drone 2.0 with a Myo! Check it out.
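To give a sense of how muscle-based sensing differs from camera-based sensing, here's a deliberately naive sketch of the idea: electromyography (EMG) signals are noisy voltage traces, so a classic first step is to compute the signal's amplitude over a short window and map activity levels to gestures. This is purely illustrative; Thalmic's actual recognition pipeline is proprietary, and the thresholds and labels below are invented.

```python
import math

def rms(window):
    """Root-mean-square amplitude of a window of EMG samples --
    a standard rough measure of muscle activity level."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify(window, rest_threshold=0.2, clench_threshold=0.8):
    """Map an activity level to a coarse, made-up gesture label."""
    level = rms(window)
    if level < rest_threshold:
        return "rest"
    elif level < clench_threshold:
        return "wrist-flick"
    return "fist-clench"

print(classify([0.05, -0.03, 0.04, -0.06]))  # low activity -> "rest"
print(classify([0.9, -1.1, 1.0, -0.95]))     # strong burst -> "fist-clench"
```

A real system would fuse this with the accelerometer, gyroscope and magnetometer readings the article mentions, since EMG alone can't distinguish arm orientation.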

  • OnTheGo Platforms is bringing gesture recognition to Google Glass apps (video)

    by 
    Alexis Santos
    01.08.2014

    Google Glass can hold its own when it comes to voice recognition and touch, but its current software doesn't account for gesture controls. OnTheGo Platforms, however, is looking to fix that. The folks at the Portland, Ore.-based company are baking up an SDK for developers to integrate gesture recognition in apps made for Glass and other Android-based smart glasses, such as the Vuzix M100. We went hands-on with a demo photo-snapping and gallery app to put the software through its paces. In its current form, the solution recognizes swipes from the left and right, a closed fist and an open hand. A fist aimed at Glass' camera will fire off a countdown for a snapshot or take you to the app's home, depending on the current screen. Waving a hand in either direction cycles through pictures in the gallery. This editor was tempted to swipe his hand across the camera's view quickly, but the software is tuned to pick up slower, more deliberate motions about a foot or so away. The detection was often hit or miss, but the developers say they're in the process of refining the recognition and that they've recently eliminated many false positives.
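From an app developer's perspective, an SDK like OnTheGo's typically boils down to registering callbacks for a small vocabulary of gestures (here: swipes, fist, open hand). The sketch below shows that shape in miniature; every class and method name is invented for illustration, not taken from OnTheGo's actual API.

```python
# Hypothetical callback-style gesture SDK, modeled loosely on the
# demo described above (swipes cycle the gallery, a fist snaps a photo).
class GestureRecognizer:
    def __init__(self):
        self._handlers = {}

    def on(self, gesture, handler):
        """Register a handler for a label such as "swipe_right",
        "swipe_left", "fist" or "open_hand"."""
        self._handlers.setdefault(gesture, []).append(handler)

    def emit(self, gesture):
        """Called by the recognition engine when a gesture is detected."""
        for handler in self._handlers.get(gesture, []):
            handler()

recognizer = GestureRecognizer()
gallery = ["photo1.jpg", "photo2.jpg", "photo3.jpg"]
index = [0]

def next_photo():
    # Cycle through the gallery, as the demo app does on a hand wave.
    index[0] = (index[0] + 1) % len(gallery)

recognizer.on("swipe_right", next_photo)
recognizer.emit("swipe_right")  # simulate the engine detecting a swipe
print(gallery[index[0]])        # photo2.jpg
```

Decoupling recognition from app logic this way is what would let the same SDK target Glass, the Vuzix M100 and other Android-based smart glasses.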

  • PMD's Nimble UX platform gives computers super accurate touchless gesture controls

    by 
    Michael Gorman
    12.12.2013

    Most of us are familiar with touchless gesture controls, thanks to the efforts of Leap Motion, SoftKinetic, PrimeSense and others. PMD, however, is a name you may not be familiar with, despite the fact that the German firm has been building some of the most accurate and robust depth sensing technology in the world for around a decade. The reason for its low profile? PMD's technology has been used almost exclusively in industrial and automotive settings... until now. The company began exploring consumer products back in 2009, and we saw its first consumer reference design sensor, the CamBoard Pico, last year at Computex. Now, the company's back with its sensor camera and a new gesture control platform for both Mac and PC, called Nimble UX. Nimble has three parts -- the first, as mentioned, is a depth-sensing camera. Next is the Nimble PMD SDK that gives developers access to the depth information gathered by the sensor and tools to help them build gesture-based applications. Finally, there's the Nimble dashboard, which is a plug-and-play bit of software that implements touchless gesture controls for Windows 8. We got to see Nimble UX for ourselves and chat with the folks from PMD to see what sets their technology apart from the competition, so join us after the break to learn more.
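The middle layer of Nimble -- the SDK exposing raw depth data -- is where developers would spend their time. As a toy illustration of what consuming a depth frame might look like (the real Nimble API and data format aren't detailed in the article, so everything here is assumed), finding the closest valid pixel is a crude first step toward locating a fingertip:

```python
# Toy 4x4 depth map in millimeters; 0 means "no return" (invalid pixel).
def nearest_point(depth_frame, min_valid=1):
    """Return (row, col, depth) of the closest valid pixel --
    a crude stand-in for locating the user's fingertip."""
    best = None
    for r, row in enumerate(depth_frame):
        for c, d in enumerate(row):
            if d >= min_valid and (best is None or d < best[2]):
                best = (r, c, d)
    return best

frame = [
    [0,   900, 850, 900],
    [880, 400, 380, 870],
    [860, 390, 350, 860],  # closest valid pixel: 350 mm at (2, 2)
    [900, 880, 860, 900],
]
print(nearest_point(frame))  # (2, 2, 350)
```

A production SDK would of course do far more (segmentation, hand-model fitting), but the per-pixel depth array is the common starting point for all of it.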

  • MIT's 3D motion-tracking tech can see you through walls, no camera needed

    by 
    Joseph Volpe
    12.11.2013

    Even if you hide behind a wall, MIT's 3D motion-tracking tech can still see you. It can even tell if you've "fallen and can't get up." Sure the tech sounds invasive, but the team's WiTrack (as it's been dubbed) device is actually less intrusive than Microsoft's Kinect -- there's no camera watching your every move. Nor is there any wearable tech involved. Instead, the setup relies solely on a wireless network and your body's ability to bounce back those radio waves.
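The geometric primitive underneath radio-based tracking like WiTrack's is simple: a reflection that arrives t seconds after transmission has traveled to the body and back, so the body lies at distance c·t/2 from the antenna. Combining such distances from several antennas pins down a 3D position. A minimal sketch of that first step:

```python
# Distance to a reflecting body from the round-trip time of a radio
# signal: the signal travels out and back, hence the division by two.
C = 299_792_458  # speed of light, m/s

def reflection_distance(round_trip_s):
    """Distance (meters) to the body, given the echo's round-trip time."""
    return C * round_trip_s / 2

# A reflection delayed by 20 nanoseconds puts the body about 3 m away.
print(round(reflection_distance(20e-9), 2))  # 3.0
```

The hard part, which this sketch ignores, is isolating the faint reflection off a human body from much stronger static reflections off walls and furniture.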

  • HP adds Leap Motion control to select PCs just in time for the holidays

    by 
    Joseph Volpe
    12.05.2013

    HP's adding more Leap Motion to its PC arsenal. After refreshing its Envy 17 with inbuilt Leap Motion control, HP's now expanding that partnership to select products in its desktop and all-in-one lines. The 11 devices, which encompass HP's Envy Recline, Phoenix and TouchSmart series, in addition to its Pavilion TouchSmart line, will come packed with the Leap Motion keyboard. But that keyboard would be useless without the necessary Leap Motion software, which HP's made sure to pre-install on the devices, along with a few free apps to demo the gesture control and Airspace, Leap Motion's own app store. If a motion-controlled PC sounds like something you'd want to gift wrap and nestle under the tree, then be sure to check out HP's online shop for the full list of supported devices. Because nothing says, "Merry Christmas!" (or "Happy Holidays," or whatever) like expensive, gimmicky technology.

  • Leap Motion releases Free Form, an app that lets human hands sculpt digital clay (video)

    by 
    Michael Gorman
    11.20.2013

When we reviewed the Leap Motion controller earlier this year, we found the application selection to be a bit lacking. Since then, the number of apps has doubled from 75 to around 150, and the Airspace store's newest addition is the coolest Leap app we've yet seen. It's called Free Form, and it's a 3D sculpting app (not unlike 3D Systems' Sculpt) built in-house at Leap Motion that lets you manipulate and shape digital objects using your fingertips. David Holz, company co-founder and the man who figured out the math behind Leap Motion's technology, gave us a demo of the app and talked a bit about why Leap built it. Additionally, he showed us new developer beta software that does 360-degree tracking, built to address some of the original Leap's shortcomings.

  • Google gesture patent would let Glass wearers 'heart' real-world objects

    by 
    Jon Fingas
    10.15.2013

As it stands, Google Glass doesn't have a simple way of cataloging real-world items -- you have to snap a picture and make a note afterward. It may get much easier if Google implements a newly granted US patent, however. The technique uses a wearable display's camera to detect hand gestures made in front of objects. Make a heart shape and you'll "like" what's in front of you; frame something with your fingers and you'll select it. There's no certainty that Glass will ever support these commands, but they're certainly intuitive. If nothing else, they could lead to a new, very literal take on Google Goggles.

  • ARM and eyeSight optimize gesture control for mobile processors

    by 
    Jon Fingas
    10.14.2013

    Hands-free gesture control is no longer a novelty in the mobile space, but the required processing power limits what's currently possible. More sophisticated input may be close at hand, however, as eyeSight has just teamed up with ARM to optimize its gesture control for chips using Mali-T600 series graphics. By relying on the T600's general-purpose computing engine, eyeSight's software can now track 3D movement, facial expressions and finger gestures without a huge performance hit. While companies will have to build supporting apps and devices before we see eyeSight's technology in use, it could lead to Kinect-like control of phones and smart TVs using relatively ordinary silicon.

  • Surface 2 Touch Cover supports gesture control, comes in more colors

    by 
    Joseph Volpe
    10.08.2013

Microsoft's Surface 2 is just two weeks away from hitting retail, so in the lead-up to that Windows 8.1 tab's launch, the company's released a 'Making of...' video to whet consumers' appetites. As spotted by The Verge, this latest Surface 2 video focuses mainly on the innovations made to the new Touch Cover. Aside from increasing the number of colorful hues it'll be made available in (i.e., green and orange), Microsoft's revealed that this new Touch Cover will be able to support gestures. That's thanks to the new sensor array, which favors a high-resolution matrix over the pressure-sensitive sensors on the last-gen model. Just what exactly those gestures may be, we can't say for sure, as Microsoft hasn't detailed them. But if current reports are any indication, you should be able to trigger that Charm menu by swiping from the right on your Touch Cover's keyboard. Is any of this enough to sway your (credit card holding) hand and get you to sign up for Surface 2? Let us know in the comments below, and be sure to check out the promo video after the break.

  • Google applies for patent on gesture-based car controls

    by 
    Jon Fingas
    10.03.2013

So far, in-car gestures aren't useful for much more than raging at the driver that just cut you off. Google wants these gestures to be more productive, however, and has applied for a patent that uses hand motion to control the car itself. Its proposed system relies on both a ceiling-mounted depth camera and a laser scanner to trigger actions based on an occupant's hand positions and movements. Swipe near the window and you'll roll it down; point to the radio and you'll turn the volume up. While there's no guarantee that we'll see the technology in a car, the USPTO published the patent filing just a day after Google acquired a motion control company. If nothing else, the concept of a Google-powered, gesture-controlled car isn't as far-fetched as it used to be.

  • Gesture control startup Flutter acquired by Google, could make Gmail Motion a reality

    by 
    Richard Lawler
    10.02.2013

Another day, another tech startup gets acquired. This time around it's Google snatching up Y Combinator-hatched Flutter, the developer of a gesture control app for Windows and Mac PCs. There's no word on what it's planning for the team and its technology -- we'd suggest 2011 April Fool's joke Gmail Motion, but someone beat them to that -- but the company's current product uses existing webcams to enable gesture control of software like Spotify, VLC or iTunes. According to CEO Navneet Dalal, users will continue to be able to use the app and should "stay tuned for future updates." Even after Kinect and all of the other gesture control entries, we're not sure if it's the future, although creating a solution that has decent precision without requiring extra hardware is interesting. The company's founders told TechCrunch last year that they want Flutter to be the eyes of our computers the way apps like Siri or Google Now are the ears of our devices; we'll see if teaming up with Google pushes that movement forward.

  • Toshiba's new dual camera module brings 'deep focus' imaging to smartphones

    by 
    Jon Fingas
    09.26.2013

    Remember when dual camera modules on smartphones were all the rage? Toshiba is bringing them back -- only this time with technology that you're much more likely to use. Its new module uses two 5-megapixel cameras to record depth and images at the same time, producing a "deep focus" picture where everything is sharp. The technique offers a Lytro-like ability to refocus, even after you've taken the shot; it also provides gesture control and very fast digital autofocusing. You'll have to wait a while before you're snapping deep focus vacation photos, though. Toshiba doesn't expect to mass produce the sensors until April, and finished products will likely come later.
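Dual-camera depth like Toshiba's is typically recovered from stereo disparity: a scene point that shifts d pixels between the two views lies at depth Z = f·B/d, where f is the focal length in pixels and B the baseline between the cameras. The numbers below are illustrative assumptions, not Toshiba's actual specs:

```python
# Classic pinhole stereo relation: depth is inversely proportional
# to pixel disparity between the two camera views.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth (meters) of a point from its disparity between two cameras."""
    return focal_px * baseline_m / disparity_px

f = 2000  # focal length in pixels (assumed)
b = 0.01  # 10 mm baseline between the two 5-megapixel sensors (assumed)

print(depth_from_disparity(f, b, 20))  # 1.0 -> point about 1 m away
print(depth_from_disparity(f, b, 40))  # 0.5 -> closer point, larger disparity
```

Having depth per pixel is what enables both the Lytro-like refocusing (blur pixels by their distance from the chosen focal plane) and the gesture control the module promises.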

  • Gecko Bluetooth tags act as motion and location triggers for your mobile (video)

    by 
    Mariella Moon
    09.20.2013

    What you see above isn't a fancy pick -- it's a gesture control peripheral called Gecko designed to do a lot more than strum a guitar. According to its creators, each action the coin-sized gadget makes can correspond to a phone function, so long as the two are connected via Bluetooth. You could, for instance, configure your device loaded with the accompanying iOS or Android app to make an emergency call whenever you shake Gecko once. However, they claim that it also has many potential offbeat uses, such as notifying you when someone moves your bag or helping you find lost pets, kids or, worse, keys. Of course, that'll only work if you tag your items with it, but anyone with a hyperactive five-year-old wouldn't mind improvising a necklace out of it. Don't expect to find one at a local mall, though -- Gecko's merely an Indiegogo project at the moment, hoping to raise $50,000 to start mass production.