Gesture Recognition

Latest

  • Muscle-sensing Myo gesture armband will be on Amazon this quarter

    by Richard Lai, 01.19.2015

    About a year after getting our first taste of the Myo, Thalmic Labs has announced that it's reaching out to the masses by way of Amazon this quarter. As with its pre-order on the company's website, this muscle-sensing gesture control armband will be available for $199. But even before that, the company has already sold 50,000 pre-orders (with about half of them shipped to buyers so far), which is a nice nod of approval to some of the use-case examples shown off by Thalmic Labs and several of its partners.

  • OnTheGo Platforms is bringing gesture recognition to Google Glass apps (video)

    by Alexis Santos, 01.08.2014

    Google Glass can hold its own when it comes to voice recognition and touch, but its current software doesn't account for gesture controls. OnTheGo Platforms, however, is looking to fix that. The folks at the Portland, Ore.-based company are baking up an SDK for developers to integrate gesture recognition in apps made for Glass and other Android-based smart glasses, such as the Vuzix M100. We went hands-on with a demo photo-snapping and gallery app to put the software through its paces. In its current form, the solution recognizes swipes from the left and right, a closed fist and an open hand. A fist aimed at Glass' camera will fire off a countdown for a snapshot or take you to the app's home, depending on the current screen. Waving a hand in either direction cycles through pictures in the gallery. This editor was tempted to swipe his hand across the camera's view quickly, but the software is tuned to pick up slower, more deliberate motions about a foot or so away. The detection was often hit or miss, but the developers say they're in the process of refining the recognition and that they've recently eliminated many false positives.
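
    For the curious, here's a rough idea of the gesture-to-action mapping the demo app appears to implement. This is only an illustrative Python sketch -- the Gesture and Screen names and the handle_gesture function are our own inventions for the sake of the example, not OnTheGo's actual SDK, which we haven't seen.

        from enum import Enum

        class Gesture(Enum):
            SWIPE_LEFT = "swipe_left"
            SWIPE_RIGHT = "swipe_right"
            FIST = "fist"
            OPEN_HAND = "open_hand"

        class Screen(Enum):
            CAMERA = "camera"
            GALLERY = "gallery"
            HOME = "home"

        def handle_gesture(gesture, screen):
            """Map a recognized gesture to an app action, mirroring the demo."""
            if gesture is Gesture.FIST:
                # A fist starts the snapshot countdown on the camera screen and
                # returns to the app's home screen everywhere else.
                return "start_countdown" if screen is Screen.CAMERA else "go_home"
            if screen is Screen.GALLERY and gesture in (Gesture.SWIPE_LEFT, Gesture.SWIPE_RIGHT):
                # Waving a hand in either direction cycles through the gallery.
                return "previous_photo" if gesture is Gesture.SWIPE_LEFT else "next_photo"
            # The open-hand gesture's exact mapping wasn't spelled out in the demo.
            return "ignore"

        print(handle_gesture(Gesture.FIST, Screen.CAMERA))  # -> start_countdown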

  • Intel announces Creative Senz3D Peripheral Camera at Computex 2013

    by Myriam Joire, 06.04.2013

    Intel's just announced the Creative Senz3D Peripheral Camera at the company's Computex keynote in Taipei. The camera lets users manipulate objects on the screen using gestures and is able to completely eliminate the background. It appears to be an evolution of the Creative Interactive Gesture Camera we recently played with at IDF in Beijing. This new 3D depth camera is expected to become available next quarter and Intel plans to incorporate the technology into devices during the second half of 2014. "It's like adding two eyes to my system," said Tom Kilroy, VP of marketing. The company's been talking about "perceptual computing" for some time and this certainly brings the idea one step closer to fruition.

  • BlackBerry granted gesture recognition patent for touch-free image manipulation

    by Joseph Volpe, 02.19.2013

    If BlackBerry lives to see 2014 (and beyond), it could end up delighting smartphone users with some neat gesture recognition tech. In a recently surfaced patent filing, the company formerly known as RIM outlines a method for selecting onscreen images using hand or finger movements above a display. By synthesizing a combo of images -- one taken with IR, the other without -- the software would be able to determine the intended area of selection. And just in case there was any doubt this feature would be headed to smartphones and tablets, the docs go on to specify its use within "a mobile communications device, comprising: a digital camera... [and] a cellular subsystem." So there you have it -- either this hands-off editing tool pops up in future BB devices, or BB simply stands to make some nice coin in licensing fees.
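
    To make the IR-differencing idea a little more concrete, here's a minimal sketch (our own, not BlackBerry's code) of how two back-to-back 8-bit grayscale frames -- one taken with the IR emitter on, one with it off -- might be compared to locate the hovering hand. The threshold value and the bounding-box heuristic are illustrative assumptions.

        import numpy as np

        def selection_region(ir_frame, ambient_frame, threshold=40):
            """Return a bounding box (x, y, w, h) around a nearby hand, or None.

            Assumes two 8-bit grayscale frames captured back to back: one with
            the IR source on, one with it off. A hand hovering above the display
            reflects far more IR than the background, so the difference image is
            dominated by the hand.
            """
            diff = ir_frame.astype(np.int16) - ambient_frame.astype(np.int16)
            mask = diff > threshold          # pixels lit mainly by the IR source
            ys, xs = np.nonzero(mask)
            if xs.size == 0:
                return None                  # nothing hovering over the screen
            x, y = int(xs.min()), int(ys.min())
            return (x, y, int(xs.max()) - x + 1, int(ys.max()) - y + 1)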

  • SoftKinetic's motion sensor tracks your hands and fingers, fits in them too (video)

    by Steve Dent, 06.06.2012

    Coming out of its shell as a possible Kinect foe, SoftKinetic has launched a new range sensor at Computex right on the heels of its last model. Upping the accuracy while shrinking the size, the DepthSense 325 now sees your fingers and hand gestures in crisp HD and as close as 10cm (4 inches), down from the 15cm (6 inches) of its DS311 predecessor. Two microphones are also tucked in, making the device suitable for video conferencing, gaming and whatever else OEMs and developers might have in mind. We haven't tried it yet, but judging from the video, it seems to track finger and hand movements quite competently. Hit the break to see for yourself.

    Full PR text:

    SoftKinetic Announces World's Smallest HD Gesture Recognition Camera and Releases Far and Close Interaction Middleware

    Professional Kit Available For Developers To Start Building a New Generation of Gesture-Based Experiences

    TAIPEI & BRUSSELS – June 5, 2012 – SoftKinetic, the pioneering provider of 3D vision and gesture recognition technology, today announced a device that will revolutionize the way people interact with their PCs. The DepthSense 325 (DS325), a pocket-sized camera that sees both in 3D (depth) and high-definition 2D (color), delivered as a professional kit, will enable developers to incorporate high-quality finger and hand tracking for PC video games, introduce new video conferencing experiences and many other immersive PC applications. The DS325 can operate from as close as 10cm and includes a high-resolution depth sensor with a wide field of view, combined with HD video and dual microphones.

    In addition, the company announced the general availability of iisu™ 3.5, its acclaimed gesture-recognition middleware compatible with most 3D sensors available on the market. In addition to its robust full-body tracking features, iisu 3.5 now offers the capacity to accurately track users' individual fingers at 60 frames per second, opening up a new world of close-range applications.

    "SoftKinetic is proud to release these revolutionary products to developers and OEMs," said Michel Tombroff, CEO of SoftKinetic. "The availability of iisu 3.5 and the DS325 clearly marks a milestone for the 3D vision and gesture recognition markets. These technologies will enable new generations of video games, edutainment applications, video conference, virtual shopping, media browsing, social media connectivity and more."

    SoftKinetic will demonstrate the new PC and SmartTV experiences at its booth at Computex, June 5-9, 2012, in the NanGang Expo Hall, Upper Level, booth N1214. For business appointments, send a meeting request to events@softkinetic.com.

    The DS325 Professional Kit is available for pre-order now at SoftKinetic's online store (http://www.softkinetic.com/Store.aspx) and is expected to begin shipping in the coming weeks. The iisu 3.5 Software Development Kit is available free for non-commercial use at SoftKinetic's online store (http://www.softkinetic.com/Store.aspx) and at iisu.com.

    About SoftKinetic S.A.

    SoftKinetic's vision is to transform the way people interact with the digital world. SoftKinetic is the leading provider of gesture-based platforms for the consumer electronics and professional markets. The company offers a complete family of 3D imaging and gesture recognition solutions, including patented 3D CMOS time-of-flight sensors and cameras (the DepthSense™ family of products, formerly known as the Optrima product family), multi-platform and multi-camera 3D gesture recognition middleware and tools (the iisu™ family of products) as well as games and applications from SoftKinetic Studios. With over 8 years of R&D on both hardware and software, SoftKinetic solutions have already been successfully used in the fields of interactive digital entertainment, consumer electronics, health care and other professional markets (such as digital signage and medical systems). SoftKinetic, iisu, DepthSense and The Interface Is You are trade names or registered trademarks of SoftKinetic. For more information on SoftKinetic please visit www.softkinetic.com. For videos of SoftKinetic-related products visit SoftKinetic's YouTube channel: www.youtube.com/SoftKinetic.

  • Hillcrest Labs takes its TV motion control system to China, becomes TCL's new best friend

    by Richard Lai, 05.23.2012

    It's only been a few days since Hillcrest Labs open sourced its Kylo web browser for TVs, and now the company's back with yet another announcement. Well, this time it's more about TCL, which has just claimed the top TV market share in China. Much like the Roku 2 and LG TVs with Magic Motion remote, Hillcrest's Freespace engine has been outed as the enabling technology behind TCL's recently announced V7500, a 3D smart TV series featuring a heavily customized Android 4.0.3 and a 7.9mm-thick bezel. This means users can interact with and play games on this slim TV via motion and cursor control on the remote (there's also voice control here, but it doesn't look like Hillcrest has anything to do with it). There are no dates or prices just yet, but TCL had better be quick, as Lenovo's got something very similar ready to ship soon.

  • Huawei throws R&D dollars at gesture control, cloud storage, being more 'disruptive'

    by Sharif Sakr, 04.30.2012

    Undeterred by the fact that even humans struggle to interpret certain gestures, Huawei says it's allocating a chunk of its growing R&D budget to new motion-sensing technology for smartphones and tablets. The company's North American research chief, John Roese, told Computerworld that he wants to allow "three-dimensional interaction" with devices using stereo front-facing cameras and a powerful GPU to make sense of the dual video feed. Separately, the Chinese telecoms company is also putting development cash into a cloud computing project that promises to "change the economics of storage by an order of magnitude." Roese provided scant details on this particular ambition, but did mention that Huawei has teamed up with CERN to conduct research and has somehow accumulated over 15 petabytes of experimental physics data in the process. Whatever it's up to, Huawei had better get a move on -- others are snapping up gesture recognition and cloud patents faster than you can say fa te ne una bicicletta with your hands.

  • Qualcomm's Snapdragon S4 flexes its imaging muscle (video)

    by Myriam Joire, 02.18.2012

    While we already know that Qualcomm's Snapdragon S4 will offer a quad-core variant, incorporate LTE and use a 28nm manufacturing process, the company posted an article to its media blog ahead of Mobile World Congress showcasing the new features provided by the chipset's Image Signal Processor. You're likely familiar with some of the imaging functionality available in Qualcomm's existing Snapdragon processors -- technology like Scalado's Rewind (pictured above) which we've covered before. The new SoC cranks things up a notch with support for up to three cameras (two in the back for 3D plus one front-facing), 20-megapixel sensors and 1080p HD video recording at 30fps. In addition to zero shutter lag, the Snapdragon S4 includes proprietary 3A processing (autofocus, auto exposure and auto white balance) along with improved blink / smile detection, gaze estimation, range finding and image stabilization. Rounding things off are gesture detection / control, augmented reality and computer vision (via Qualcomm's FastCV). Want to know more? Check out the source link below, then hit the break for video demos of the S4's image stabilization and gesture-based imaging chops.

  • Hisense Series XT710 TV helps you exercise your La-Z-Boy sans remote

    by Lydia Leavitt, 10.26.2011

    You might feel like the king of the couch, but let's face it -- picking up the remote can be exhausting. Hisense is hoping to lighten your load with the launch of its new Android-based Smart TV with hands-free eyeSight gesture recognition technology -- the Series XT710. Slated to launch in China, the TV features a 2D sensor, designed to understand your hand movements and interpret your every channel changing whim. Besides flipping between reruns of Law and Order and Jersey Shore, couch potatoes will also be able to play games and access Android applications through the intelligent tube. Now, if it could only help us pop our popcorn. Jump past the break to check out the full PR.

  • Apple '3D imaging and display' patent was cutting edge in 2005

    by Sharif Sakr, 09.15.2011

    An Apple patent for a "3D imaging and display system" staggers out into daylight after seven years buried in the USPTO. Its eyes steadily adjust to the brightness of a Kinect-dominated world and its heart sinks. But then a random guy approaches and says, "Hey little patent, what's wrong?" "I'm obsolete," comes the sullen reply. "I'm all about detecting user movements in three dimensions, but the competition has that covered. Sure, people might *think* I've patented some kind of wild holographic virtual reality stuff too, but my paperwork only mentions that in the vaguest possible terms. There's no way I can threaten Microsoft." "Nonsense!" cries the guy. "Follow me. I know a judge in Düsseldorf."

  • Kinect Hacks Daily, Episode 47: Kinect taught to control XBMC through hand gestures

    by Paul Miller, 12.15.2010

    One of our favorite parts of Kinect, at least theoretically, was the idea of controller-free and remote-free control of the dashboard and media playback. Sure, it's never going to be as optimized and snappy as those tried-and-true digital buttons, but it's a great party trick, and we're all about the party tricks. Well, now you can get some of that gesture mojo going on with your XBMC setup -- and we're guessing eventually you'll be able to control just about anything else that uses basic "left, right, click" actions for navigation. Our only suggestion? Get some of that Dance Central-style menu navigation going on here. That goes for you, too, Microsoft. [Thanks, Joshua]

  • Glowing Pathfinder Bugs installation puts the 'Minority Report' interface to good use - in a sand box (video)

    by Joseph L. Flatley, 07.30.2010

    Nestled among the various booths at SIGGRAPH 2010, visitors got to check out a unique installation called Glowing Pathfinder Bugs. Created by Squidsoup and Anthony Rowe, this interactive art piece uses projectors to place "bugs" made out of light in a sandbox, coupled with a 3D gesture-based interface that allows people to pick up, move, and even breed the creatures. The system even takes the topography of the sand itself into consideration: altering the sand will alter the bugs' paths. It's nice to see someone put an interface technology to good use for a change! Video after the break.

  • Fraunhofer FIT touch-free gesture-control for multiple users (video)

    by Joseph L. Flatley, 07.21.2010

    It seems like everyone is cooking up their own touch-free gesture-based control technology, just like every blogger is destined to refer to it as "Minority Report-like" or "Minority Report-esque," or "Tom Cruise-tastic!" Fraunhofer FIT's system, the newest such project, has recently appeared on the YouTubes, where we must say it looks pretty darn good. Not only does it not require special gloves or markers, but it also works in real time and can support multiple users (and multiple fingers). The researchers hope to use this for working with complex simulation data and in education, although there are some kinks to be worked out: currently, elements like the reflections caused by wristwatches and the orientation of the palm confuse the system. That said, the demo is pretty rad! See for yourself after the break.

  • Microsoft hints at touchless Surface combining camera and transparent OLED (video)

    by Sean Hollister, 06.29.2010

    We've always wondered whether Microsoft's multitouch table would actually ever arrive, dreaming of Minority Report hijinx all the while, but after seeing what the company's Applied Sciences Group is currently cooking up -- a touchless telepresence display -- we'd rather drop that antiquated pinch-to-zoom stuff in favor of what might be Surface's next generation. Starting with one of Samsung's prototype transparent OLED panels, Microsoft dropped a sub-two-inch camera behind the glass, creating a 3D gesture control interface that tracks your every move by literally seeing through the display. Combined with that proprietary wedge-shaped lens we saw earlier this month and some good ol' Johnny Chung Lee headtracking by the man himself, we're looking at one hell of a screen. Don't you dare read another word without seeing the prototype in a trifecta of videos after the break.

  • Kinect tech destined for TV-embedded greatness in 2011, HTPC integration later this year

    by Sean Hollister, 06.23.2010

    From Tel Aviv unknown to Xbox gaming wunderkind, PrimeSense has already had quite a run, but the camera-control tech that powers Kinect is destined for new applications before long. VP Adi Berenson tells us the company's already signed deals to put PrimeSense inside HTPCs by the end of the year, and has at least one cable company ready to launch a gesture-controlled set top box by summer 2011. The end goal is to provide natural human control over TV-based media consumption, said Berenson, who's working to get cameras in TVs themselves sometime late next year. Like Kinect, these solutions will have a pair of 640 x 480 camera sensors to measure user position in 3D space, but don't expect them to have motorized tilt functionality or voice recognition -- PrimeSense said it won't be able to make those available for manufacturers, as they're all Microsoft ideas. The gesture recognition has reportedly evolved, though, and we're eager to check that out soon. See what it used to look like in our GDC 2010 preview. Update: Just to be absolutely clear, this is not Microsoft's Kinect that's slated for an HTPC and set-top-box near you, but rather PrimeSense, the 3D camera sensor technology behind it.

  • Fujitsu's motion sensing laptop interface makes no sense (video)

    by Thomas Ricker, 06.09.2010

    We're not sure what Fujitsu is thinking here, but it has to stop. Get a load of its motion control interface running on a 15.6-inch laptop. Yes, a 15-inch laptop. We might be able to understand this if it was plugged into a big flat panel television or projector, but trying to manipulate those itty bitty controls from 10-feet away is, well, silly. The Core i3-350M-powered Fujitsu LifeBook AH700/5A does feature HDMI-out but you still have to place the laptop in front of you (and the TV) with the display popped open so that the camera can see your movements. On a positive note, it looks like a great way to develop your wax-on / wax-off ninja tuna skills.

  • Microsoft Research toys with the cosmos... using forefinger and thumb (video)

    by Sean Hollister, 05.31.2010

    We've always been suckers for Minority Report tech, and Microsoft Research's latest attempt is not to be missed. Thought pinch-to-zoom was quaint? Try pinching the sky in this geodesic dome. Though the cardboard-and-paper-clip structure isn't all that (unless you're the arts and crafts type), the inside houses a projectiondesign DLP unit with a custom infrared camera system that can turn simple hand gestures into virtual interstellar travel, 360-degree video teleconferencing and more. You'll find a pair of videos demonstrating the concept after the break, but try not to get too attached -- if you're anything like us, your poor heart can't handle another Courier axing.

  • Texas Instruments introduces ARM-based OMAP 4 SOC, Blaze development platform

    by Vlad Savov, 02.15.2010

    Texas Instruments has just made its OMAP 4 system-on-chip official, and garnished the announcement with the first development platform for it, aggressively titled Blaze. We already caught a glimpse of it in prototype form earlier this month, and the thing is quite a whopper -- you can see it on video after the break and we doubt you'll accuse TI of placing form before function with this one. The company's focus will be on promoting innovative new modes of interaction, with touchless gesturing (or "in the air" gesture recognition) figuring strongly in its vision of the future. Looking at the SOC diagram (available after the break), you'll find that its grunt will be provided by the same ARM Cortex-A9 MPCore class of CPU that powers the iPad, though TI claims it will be the only mobile platform capable of outputting stereoscopic 720p video at 30fps per channel. Perhaps its uniqueness will come from the fact that nobody else cares for the overkill that is 3D-HD on a mobile phone, whether it requires glasses or not. It'll still be fascinating to see if anybody picks up the chunky Blaze idea and tries to produce a viable mobile device out of it -- we could be convinced we need multiple displays while on the move, we're just not particularly hot on the 90s style bezel overflow.

  • Harvard and MIT researchers working to simulate the visual cortex to give computers true sight

    by Paul Miller, 12.04.2009

    It sounds like a daunting task, but some researchers at Harvard and MIT have banded together to basically "reverse engineer" the human brain's ability to process visual data into usable information. However, instead of testing one processing model at a time, they're using a screening technique borrowed from molecular biology to test thousands of models against particular object recognition tasks. To get the computational juice to accomplish this feat, they've been relying heavily on GPUs, saying the off-the-shelf parallel computing setup they've got gives them hundred-fold speed improvements over conventional methods. So far they claim their results are besting "state-of-the-art computer vision systems" (which, if iPhoto's skills are any indication, wouldn't take much), and they hope to not only improve tasks such as face recognition, object recognition and gesture tracking, but also to apply their knowledge back into a better understanding of the brain's mysterious machinations. A delicious cycle! There's a video overview of their approach after the break. [Thanks, David]
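
    As a toy illustration of that screening approach (our own sketch, not the researchers' code), the snippet below generates hundreds of randomly parameterized feature extractors, scores each one on the same tiny labeled recognition task and keeps the best performers. The data, model family and scoring rule are all stand-ins; the real work evaluated thousands of biologically inspired models on GPUs.

        import numpy as np

        rng = np.random.default_rng(0)

        # Tiny stand-in dataset: 200 "images" of 64 pixels each, two classes.
        X = rng.normal(size=(200, 64))
        y = rng.integers(0, 2, size=200)

        def random_model(dim=64, features=16):
            """One candidate model: a random linear feature bank."""
            return rng.normal(size=(dim, features))

        def score(model, X, y):
            """Crude nearest-class-mean accuracy on the extracted features."""
            feats = np.maximum(X @ model, 0)     # random projection + rectification
            mean0, mean1 = feats[y == 0].mean(0), feats[y == 1].mean(0)
            pred = (np.linalg.norm(feats - mean1, axis=1)
                    < np.linalg.norm(feats - mean0, axis=1)).astype(int)
            return (pred == y).mean()

        # Screen many candidates against the same task and keep the top five.
        candidates = [random_model() for _ in range(500)]
        scores = [score(m, X, y) for m in candidates]
        best = sorted(range(len(scores)), key=scores.__getitem__)[-5:]
        print([round(scores[i], 3) for i in best])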

  • Hands-on / video with the LG.Philips massive 52-inch multi-touch display

    by Joshua Topolsky, 01.09.2008

    We just got back from the super sneaky secret LG.Philips room at CES where the totally Surface-esque 52-inch multitouch display was being shown off. The 1920 x 1080 screen rocks an interesting infrared image sensor to get data about hand placement and movement, and is capable of doing all kinds of gesture and area recognition from two separate touch points. Check the gallery to get a better view, and watch the video if you're excited about the prospect of a flipping, zooming Google Earth on a screen with multitouch.