Gesture recognition
Latest
Djay Pro AI for iPad now has touchless gesture controls
An update to Algoriddim’s djay Pro AI is now available and its most notable feature is a touchless Gesture Control interface for iPad Pro and iOS 14.
Jon Turi, 12.09.2020
Intel announces Creative Senz3D Peripheral Camera at Computex 2013
Intel's just announced the Creative Senz3D Peripheral Camera at the company's Computex keynote in Taipei. The camera lets users manipulate objects on the screen using gestures and is able to completely eliminate the background. It appears to be an evolution of the Creative Interactive Gesture Camera we recently played with at IDF in Beijing. This new 3D depth camera is expected to become available next quarter and Intel plans to incorporate the technology into devices during the second half of 2014. "It's like adding two eyes to my system," said Tom Kilroy, VP of marketing. The company's been talking about "perceptual computing" for some time and this certainly brings the idea one step closer to fruition.
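As a rough illustration of how a depth camera can "completely eliminate the background": once you have a depth map registered to the color frame, background removal is essentially a distance threshold. A minimal numpy sketch, not Intel's pipeline; the 800mm cutoff, array names and synthetic frames below are placeholders:

```python
import numpy as np

def remove_background(color, depth, max_distance_mm=800):
    """Keep only pixels closer than max_distance_mm; zero out the rest.

    color: HxWx3 uint8 image, depth: HxW depth map in millimetres,
    both assumed to be registered (aligned pixel-for-pixel).
    """
    # Valid, near-range pixels form the foreground mask.
    mask = (depth > 0) & (depth < max_distance_mm)
    segmented = np.zeros_like(color)
    segmented[mask] = color[mask]  # copy foreground pixels only
    return segmented

# Example with synthetic data: a "person" 600 mm away against a 2 m wall.
depth = np.full((480, 640), 2000, dtype=np.uint16)
depth[100:400, 200:440] = 600
color = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
foreground = remove_background(color, depth)
```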
Myriam Joire, 06.04.2013
SoftKinetic's motion sensor tracks your hands and fingers, fits in them too (video)
Coming out of its shell as a possible Kinect foe, SoftKinetic has launched a new range sensor at Computex right on the heels of its last model. Upping the accuracy while shrinking the size, the DepthSense 325 now sees your fingers and hand gestures in crisp HD and from as close as 10cm (4 inches), an improvement on the 15cm (6 inches) of its DS311 predecessor. Two microphones are also tucked in, making the device suitable for video conferencing, gaming and whatever else OEMs and developers might have in mind. We haven't tried it yet, but judging from the video, it seems to track finger and hand movements quite competently. Hit the break to see for yourself.

SoftKinetic Announces World's Smallest HD Gesture Recognition Camera and Releases Far and Close Interaction Middleware
Professional Kit Available For Developers To Start Building a New Generation of Gesture-Based Experiences

TAIPEI & BRUSSELS – June 5, 2012 – SoftKinetic, the pioneering provider of 3D vision and gesture recognition technology, today announced a device that will revolutionize the way people interact with their PCs. The DepthSense 325 (DS325), a pocket-sized camera that sees both in 3D (depth) and high-definition 2D (color), delivered as a professional kit, will enable developers to incorporate high-quality finger and hand tracking into PC video games, new video conferencing experiences and many other immersive PC applications. The DS325 can operate from as close as 10cm and includes a high-resolution depth sensor with a wide field of view, combined with HD video and dual microphones.

In addition, the company announced the general availability of iisu™ 3.5, its acclaimed gesture-recognition middleware compatible with most 3D sensors available on the market. In addition to its robust full-body tracking features, iisu 3.5 now offers the capacity to accurately track users' individual fingers at 60 frames per second, opening up a new world of close-range applications.

"SoftKinetic is proud to release these revolutionary products to developers and OEMs," said Michel Tombroff, CEO of SoftKinetic. "The availability of iisu 3.5 and the DS325 clearly marks a milestone for the 3D vision and gesture recognition markets. These technologies will enable new generations of video games, edutainment applications, video conference, virtual shopping, media browsing, social media connectivity and more."

SoftKinetic will demonstrate the new PC and SmartTV experiences at its booth at Computex, June 5-9, 2012, in the NanGang Expo Hall, Upper Level, booth N1214. For business appointments, send a meeting request to events@softkinetic.com.

The DS325 Professional Kit is available for pre-order now at SoftKinetic's online store (http://www.softkinetic.com/Store.aspx) and is expected to begin shipping in the coming weeks. The iisu 3.5 Software Development Kit is available free for non-commercial use at SoftKinetic's online store (http://www.softkinetic.com/Store.aspx) and at iisu.com.

About SoftKinetic S.A.
SoftKinetic's vision is to transform the way people interact with the digital world. SoftKinetic is the leading provider of gesture-based platforms for the consumer electronics and professional markets.
The company offers a complete family of 3D imaging and gesture recognition solutions, including patented 3D CMOS time-of-flight sensors and cameras (DepthSense™ family of products, formerly known as Optrima product family), multi-platform and multi-camera 3D gesture recognition middleware and tools (iisu™ family of products) as well as games and applications from SoftKinetic Studios. With over 8 years of R&D on both hardware and software, SoftKinetic solutions have already been successfully used in the field of interactive digital entertainment, consumer electronics, health care and other professional markets (such as digital signage and medical systems). SoftKinetic, iisu, DepthSense and The Interface Is You are trade names or registered trademarks of SoftKinetic. For more information on SoftKinetic please visit www.softkinetic.com. For videos of SoftKinetic-related products visit SoftKinetic's YouTube channel: www.youtube.com/SoftKinetic.
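iisu's finger tracking is proprietary middleware, but the basic close-range trick, segmenting the hand out of a near depth band and counting the gaps between fingers, can be sketched with stock OpenCV. A rough illustration only, assuming an OpenCV 4 install and millimetre depth frames; the depth band and the 20-pixel defect threshold are arbitrary placeholders, not SoftKinetic's method:

```python
import cv2
import numpy as np

def count_fingers(depth_mm, near=100, far=400):
    """Rough finger count from a single depth frame (values in millimetres).

    Segments everything between `near` and `far`, takes the largest blob as
    the hand, and counts convexity defects deep enough to be finger gaps.
    """
    # Binary hand mask from the close-range depth band.
    mask = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)

    hull = cv2.convexHull(hand, returnPoints=False)
    if hull is None or len(hull) < 4:
        return 0
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0

    # Each defect deeper than ~20 pixels is treated as a gap between two fingers.
    gaps = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] / 256.0 > 20)
    return min(gaps + 1, 5)
```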
Steve Dent, 06.06.2012
Hillcrest Labs takes its TV motion control system to China, becomes TCL's new best friend
It's only been a few days since Hillcrest Labs open-sourced its Kylo web browser for TVs, and now the company's back with yet another announcement. Well, this time it's more about TCL, which has just declared its top TV market share in China. Much like the Roku 2 and LG TVs with the Magic Motion remote, Hillcrest's Freespace engine has been outed as the enabling technology behind TCL's recently announced V7500, a 3D smart TV series featuring a heavily customized Android 4.0.3 and a 7.9mm-thick bezel. This means users can interact with and play games on this slim TV via motion and cursor control on the remote (there's also voice control here, but it doesn't look like Hillcrest has anything to do with it). There are no dates or prices just yet, but TCL had better be quick, as Lenovo's got something very similar ready to ship soon.
Richard Lai, 05.23.2012
Huawei throws R&D dollars at gesture control, cloud storage, being more 'disruptive'
Undeterred by the fact that even humans struggle to interpret certain gestures, Huawei says it's allocating a chunk of its growing R&D budget to new motion-sensing technology for smartphones and tablets. The company's North American research chief, John Roese, told Computerworld that he wants to allow "three-dimensional interaction" with devices using stereo front-facing cameras and a powerful GPU to make sense of the dual video feed. Separately, the Chinese telecoms company is also putting development cash into a cloud computing project that promises to "change the economics of storage by an order of magnitude." Roese provided scant details on this particular ambition, but did mention that Huawei has teamed up with CERN to conduct research and has somehow accumulated over 15 petabytes of experimental physics data in the process. Whatever it's up to, Huawei had better get a move on -- others are snapping up gesture recognition and cloud patents faster than you can say fa te ne una bicicletta with your hands.
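Huawei hasn't said how its stereo approach would work, but the classic recipe for getting depth from two front-facing cameras is block-matching disparity. A minimal OpenCV sketch under that assumption; the file names, focal length and 4cm baseline below are illustrative placeholders, not Huawei's numbers:

```python
import cv2
import numpy as np

# Rectified left/right frames from a hypothetical stereo front-camera pair.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Classic block matching: larger disparity = closer object.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# With known camera geometry, disparity converts to metric depth:
#   depth = focal_length_px * baseline_m / disparity
focal_length_px = 700.0   # assumed intrinsics
baseline_m = 0.04         # assumed 4 cm between the two front cameras
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_length_px * baseline_m / disparity[valid]
```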
Sharif Sakr, 04.30.2012
Qualcomm's Snapdragon S4 flexes its imaging muscle (video)
While we already know that Qualcomm's Snapdragon S4 will offer a quad-core variant, incorporate LTE and use a 28nm manufacturing process, the company posted an article to its media blog ahead of Mobile World Congress showcasing the new features provided by the chipset's Image Signal Processor. You're likely familiar with some of the imaging functionality available in Qualcomm's existing Snapdragon processors -- technology like Scalado's Rewind (pictured above), which we've covered before. The new SoC cranks things up a notch with support for up to three cameras (two in the back for 3D plus one front-facing), 20-megapixel sensors and 1080p HD video recording at 30fps. In addition to zero shutter lag, the Snapdragon S4 includes proprietary 3A processing (autofocus, auto exposure and auto white balance) along with improved blink / smile detection, gaze estimation, range finding and image stabilization. Rounding things off are gesture detection / control, augmented reality and computer vision (via Qualcomm's FastCV). Want to know more? Check out the source link below, then hit the break for video demos of the S4's image stabilization and gesture-based imaging chops.
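Qualcomm's 3A processing is proprietary, but as a flavor of what the "auto white balance" leg of 3A computes, here's the textbook gray-world algorithm in a few lines of numpy -- an illustration only, not the S4's implementation:

```python
import numpy as np

def gray_world_awb(image):
    """Gray-world auto white balance: scale each channel so the average
    colour of the frame becomes neutral grey. `image` is an HxWx3 uint8 array."""
    img = image.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)        # per-channel average
    gains = channel_means.mean() / (channel_means + 1e-6)  # push averages toward grey
    balanced = np.clip(img * gains, 0, 255)
    return balanced.astype(image.dtype)
```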
Myriam Joire, 02.18.2012
Hisense Series XT710 TV helps you exercise your La-Z-Boy sans remote
You might feel like the king of the couch, but let's face it -- picking up the remote can be exhausting. Hisense is hoping to lighten your load with the launch of its new Android-based Smart TV with hands-free eyeSight gesture recognition technology -- the Series XT710. Slated to launch in China, the TV features a 2D sensor, designed to understand your hand movements and interpret your every channel changing whim. Besides flipping between reruns of Law and Order and Jersey Shore, couch potatoes will also be able to play games and access Android applications through the intelligent tube. Now, if it could only help us pop our popcorn. Jump past the break to check out the full PR.
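eyeSight's engine is a black box, but an ordinary 2D sensor can pick up broad channel-changing swipes with nothing fancier than dense optical flow. A hedged sketch using OpenCV's Farneback flow; the 3-pixel threshold and the frame variables are placeholders, not Hisense's or eyeSight's code:

```python
import cv2
import numpy as np

def detect_swipe(prev_gray, curr_gray, threshold=3.0):
    """Return 'left', 'right' or None based on the average horizontal motion
    between two consecutive grayscale frames from a plain 2D camera."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mean_dx = float(flow[..., 0].mean())  # average horizontal displacement (pixels)
    if mean_dx > threshold:
        return "right"
    if mean_dx < -threshold:
        return "left"
    return None
```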
Lydia Leavitt, 10.26.2011
Apple '3D imaging and display' patent was cutting edge in 2005
An Apple patent for a "3D imaging and display system" staggers out into daylight after seven years buried in the USPTO. Its eyes steadily adjust to the brightness of a Kinect-dominated world and its heart sinks. But then a random guy approaches and says, "Hey little patent, what's wrong?" "I'm obsolete," comes the sullen reply. "I'm all about detecting user movements in three dimensions, but the competition has that covered. Sure, people might *think* I've patented some kind of wild holographic virtual reality stuff too, but my paperwork only mentions that in the vaguest possible terms. There's no way I can threaten Microsoft." "Nonsense!" cries the guy. "Follow me. I know a judge in Düsseldorf."
Sharif Sakr, 09.15.2011
Kinect Hacks Daily, Episode 47: Kinect taught to control XBMC through hand gestures
One of our favorite parts of Kinect, at least theoretically, was the idea of controller-free and remote-free control of the dashboard and media playback. Sure, it's never going to be as optimized and snappy as those tried-and-true digital buttons, but it's a great party trick, and we're all about the party tricks. Well, now you can get some of that gesture mojo going on with your XBMC setup -- and we're guessing eventually you'll be able to control just about anything else you'd use basic "left, right, click" actions for navigation. Our only suggestion? Get some of that Dance Central-style menu navigation going on here. That goes for you, too, Microsoft. [Thanks, Joshua]
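The hack's own plumbing isn't shown here, but the "left, right, click" idea maps naturally onto XBMC's JSON-RPC Input methods (Input.Left, Input.Right, Input.Select -- present in later XBMC/Kodi API versions). A minimal Python sketch under those assumptions, with the built-in web server enabled on localhost:8080 and some external gesture recognizer calling on_gesture; the gesture names and mapping are made up for illustration:

```python
import json
import urllib.request

XBMC_URL = "http://localhost:8080/jsonrpc"  # assumes the XBMC/Kodi web server is enabled

def send_input(method):
    """Fire a single navigation command, e.g. Input.Left or Input.Select."""
    payload = json.dumps({"jsonrpc": "2.0", "method": method, "id": 1}).encode()
    req = urllib.request.Request(XBMC_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Hypothetical mapping from recognised gestures to dashboard navigation.
GESTURE_TO_METHOD = {
    "swipe_left": "Input.Left",
    "swipe_right": "Input.Right",
    "push": "Input.Select",
    "wave": "Input.Back",
}

def on_gesture(name):
    method = GESTURE_TO_METHOD.get(name)
    if method:
        send_input(method)
```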
Paul Miller, 12.15.2010
Glowing Pathfinder Bugs installation puts the 'Minority Report' interface to good use - in a sand box (video)
Nestled among the various booths at SIGGRAPH 2010 was a unique installation called Glowing Pathfinder Bugs. Created by Squidsoup and Anthony Rowe, this interactive art piece uses projectors to place "bugs" made out of light in a sandbox, coupled with a 3D gesture-based interface that allows people to pick up, move, and even breed the creatures. The system even takes the topography of the sand itself into consideration: altering the sand will alter the bugs' paths. It's nice to see someone put an interface technology to good use for a change! Video after the break.
Joseph L. Flatley, 07.30.2010
Fraunhofer FIT touch-free gesture-control for multiple users (video)
It seems like everyone is cooking up their own touch-free gesture-based control technology, just like every blogger is destined to refer to it as "Minority Report-like" or "Minority Report-esque," or "Tom Cruise-tastic!" Fraunhofer's FIT, the newest such project, has recently appeared on the YouTubes, where we must say it looks pretty darn good. Not only does it not require special gloves or markers, this thing also works in real time and can support multiple users (and multiple fingers). The researchers hope to use this for working with complex simulation data and in education, although there are some kinks to be worked out: currently, elements like the reflections caused by wristwatches and the orientation of the palm confuse the system. That said, the demo is pretty rad! See for yourself after the break.
Joseph L. Flatley, 07.21.2010
Microsoft hints at touchless Surface combining camera and transparent OLED (video)
We've always wondered whether Microsoft's multitouch table would actually ever arrive, dreaming of Minority Report hijinx all the while, but after seeing what the company's Applied Sciences Group is currently cooking up -- a touchless telepresence display -- we'd rather drop that antiquated pinch-to-zoom stuff in favor of what might be Surface's next generation. Starting with one of Samsung's prototype transparent OLED panels, Microsoft dropped a sub-two-inch camera behind the glass, creating a 3D gesture control interface that tracks your every move by literally seeing through the display. Combined with that proprietary wedge-shaped lens we saw earlier this month and some good ol' Johnny Chung Lee headtracking by the man himself, we're looking at one hell of a screen. Don't you dare read another word without seeing the prototype in a trifecta of videos after the break.
Sean Hollister, 06.29.2010
Kinect tech destined for TV-embedded greatness in 2011, HTPC integration later this year
From Tel Aviv unknown to Xbox gaming wunderkind, PrimeSense has already had quite a run, but the camera-control tech that powers Kinect is destined for new applications before long. VP Adi Berenson tells us the company's already signed deals to put PrimeSense inside HTPCs by the end of the year, and has at least one cable company ready to launch a gesture-controlled set top box by summer 2011. The end goal is to provide natural human control over TV-based media consumption, said Berenson, who's working to get cameras in TVs themselves sometime late next year. Like Kinect, these solutions will have a pair of 640 x 480 camera sensors to measure user position in 3D space, but don't expect them to have motorized tilt functionality or voice recognition -- PrimeSense said it won't be able to make those available for manufacturers, as they're all Microsoft ideas. The gesture recognition has reportedly evolved, though, and we're eager to check that out soon. See what it used to look like in our GDC 2010 preview. Update: Just to be absolutely clear, this is not Microsoft's Kinect that's slated for an HTPC and set-top-box near you, but rather PrimeSense, the 3D camera sensor technology behind it.
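As a back-of-the-envelope illustration of how a 640 x 480 depth sensor "measures user position in 3D space": each depth pixel is back-projected through a pinhole camera model into a 3D point. The intrinsics below are assumed placeholder values, not PrimeSense's actual calibration:

```python
import numpy as np

# Assumed pinhole intrinsics for a 640 x 480 depth sensor (placeholder values).
FX, FY = 575.0, 575.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point

def depth_to_points(depth_m):
    """Back-project an HxW depth map (metres) to an HxWx3 array of 3D points
    in the camera frame: X right, Y down, Z forward."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.dstack([x, y, z])

# A user's position in 3D is then just the mean of the points on their silhouette.
```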
Sean Hollister, 06.23.2010
Fujitsu's motion sensing laptop interface makes no sense (video)
We're not sure what Fujitsu is thinking here, but it has to stop. Get a load of its motion control interface running on a 15.6-inch laptop. Yes, a 15-inch laptop. We might be able to understand this if it was plugged into a big flat panel television or projector, but trying to manipulate those itty bitty controls from 10-feet away is, well, silly. The Core i3-350M-powered Fujitsu LifeBook AH700/5A does feature HDMI-out but you still have to place the laptop in front of you (and the TV) with the display popped open so that the camera can see your movements. On a positive note, it looks like a great way to develop your wax-on / wax-off ninja tuna skills.
Thomas Ricker, 06.09.2010
Microsoft Research toys with the cosmos... using forefinger and thumb (video)
We've always been suckers for Minority Report tech, and Microsoft Research's latest attempt is not to be missed. Thought pinch-to-zoom was quaint? Try pinching the sky in this geodesic dome. Though the cardboard-and-paper-clip structure isn't all that (unless you're the arts and crafts type), the inside houses a projectiondesign DLP unit with a custom infrared camera system that can turn simple hand gestures into virtual interstellar travel, 360-degree video teleconferencing and more. You'll find a pair of videos demonstrating the concept after the break, but try not to get too attached -- if you're anything like us, your poor heart can't handle another Courier axing.
Sean Hollister, 05.31.2010
Texas Instruments introduces ARM-based OMAP 4 SOC, Blaze development platform
Texas Instruments has just made its OMAP 4 system-on-chip official, and garnished the announcement with the first development platform for it, aggressively titled Blaze. We already caught a glimpse of it in prototype form earlier this month, and the thing is quite a whopper -- you can see it on video after the break and we doubt you'll accuse TI of placing form before function with this one. The company's focus will be on promoting innovative new modes of interaction, with touchless gesturing (or "in the air" gesture recognition) figuring strongly in its vision of the future. Looking at the SOC diagram (available after the break), you'll find that its grunt will be provided by the same ARM Cortex-A9 MPCore class of CPU that powers the iPad, though TI claims it will be the only mobile platform capable of outputting stereoscopic 720p video at 30fps per channel. Perhaps its uniqueness will come from the fact that nobody else cares for the overkill that is 3D-HD on a mobile phone, whether it requires glasses or not. It'll still be fascinating to see if anybody picks up the chunky Blaze idea and tries to produce a viable mobile device out of it -- we could be convinced we need multiple displays while on the move, we're just not particularly hot on the 90s style bezel overflow.
Vlad Savov, 02.15.2010
Harvard and MIT researchers working to simulate the visual cortex to give computers true sight
It sounds like a daunting task, but some researchers at Harvard and MIT have banded together to basically "reverse engineer" the human brain's ability to process visual data into usable information. However, instead of testing one processing model at a time, they're using a screening technique borrowed from molecular biology to test thousands of candidate models against particular object recognition tasks. To get the computational juice to accomplish this feat, they've been relying heavily on GPUs, saying the off-the-shelf parallel computing setup they've got gives them hundred-fold speed improvements over conventional methods. So far they claim their results are besting "state-of-the-art computer vision systems" (which, if iPhoto's skills are any indication, wouldn't take much), and they hope to not only improve tasks such as face recognition, object recognition and gesture tracking, but also to apply their knowledge back into a better understanding of the brain's mysterious machinations. A delicious cycle! There's a video overview of their approach after the break. [Thanks, David]
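The researchers' GPU pipeline isn't spelled out here, but the "screening" idea -- generate lots of randomly parameterized models, test them all against a recognition task, keep the winners -- can be sketched on a toy problem in plain numpy. Everything below (the synthetic data, the random-projection models, the nearest-centroid readout) is illustrative, not their actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an object recognition task: two classes of noisy "images".
n, d = 400, 256
X = rng.normal(size=(n, d))
y = (X[:, :8].sum(axis=1) > 0).astype(int)  # hidden rule the models must discover
X_train, X_test = X[:300], X[300:]
y_train, y_test = y[:300], y[300:]

def evaluate(W):
    """Score one candidate model: random feature projection + nearest-centroid readout."""
    F_train, F_test = np.tanh(X_train @ W), np.tanh(X_test @ W)
    c0, c1 = F_train[y_train == 0].mean(0), F_train[y_train == 1].mean(0)
    pred = (np.linalg.norm(F_test - c1, axis=1) <
            np.linalg.norm(F_test - c0, axis=1)).astype(int)
    return (pred == y_test).mean()

# Screen many randomly parameterised candidates and keep the best performer --
# the same "generate lots, test lots, keep the winners" loop that GPUs make fast.
candidates = [rng.normal(size=(d, 32)) / np.sqrt(d) for _ in range(200)]
scores = [evaluate(W) for W in candidates]
best = candidates[int(np.argmax(scores))]
print(f"best screened model accuracy: {max(scores):.2f}")
```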
Paul Miller, 12.04.2009
Hands-on / video with the LG.Philips massive 52-inch multi-touch display
We just got back from the super sneaky secret LG.Philips room at CES where the totally Surface-esque 52-inch multitouch display was being shown off. The 1920 x 1080 screen rocks an interesting infrared image sensor to get data about hand placement and movement, and is capable of doing all kinds of gesture and area recognition from two separate touch points. Check the gallery to get a better view, and watch the video if you're excited about the prospect of a flipping, zooming Google Earth on a screen with multitouch.
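For the curious, "gesture and area recognition from two separate touch points" largely boils down to tracking how the distance and angle between the two points change from frame to frame. A minimal sketch of that bookkeeping (the coordinates in the example are made up), not LG.Philips' actual software:

```python
import math

def two_touch_transform(prev_pts, curr_pts):
    """Derive zoom and rotation from two touch points tracked across frames.

    prev_pts / curr_pts: ((x1, y1), (x2, y2)) for the previous and current frame.
    Returns (scale, rotation_radians); scale > 1 means pinch-out (zoom in).
    """
    (ax, ay), (bx, by) = prev_pts
    (cx, cy), (dx, dy) = curr_pts
    prev_vec = (bx - ax, by - ay)
    curr_vec = (dx - cx, dy - cy)
    prev_len = math.hypot(*prev_vec)
    curr_len = math.hypot(*curr_vec)
    scale = curr_len / prev_len if prev_len else 1.0
    rotation = math.atan2(curr_vec[1], curr_vec[0]) - math.atan2(prev_vec[1], prev_vec[0])
    return scale, rotation

# Example: the two fingers move apart and twist slightly between frames.
scale, rot = two_touch_transform(((100, 100), (200, 100)), ((90, 95), (215, 120)))
```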
Joshua Topolsky, 01.09.2008
Is this Windows Mobile 7?
We know the coverage from Microsoft's Mobius got many mobile fans a little hot under the collar; well, we're about to blow the doors off that. The guys over at Engadget have gotten their mitts on some screenies and details on device interaction from what may be Windows Mobile 7. Hit the read link to see dozens of hot pics and a pile more info.
Sean Cooper, 01.06.2008
Microsoft to unveil 'PlayTable' gesture-based interface at D?
If ZDNet staple Mary Jo Foley (or more specifically, her source) is right, we may be seeing Microsoft take the next big step in device interaction at tomorrow's D: All Things Digital Conference, with the perennial Redmond watcher predicting an official unveiling of the company's PlayTable / Project Milan multi-touch, gesture-based input technology. PlayTable, which combines elements we've seen in the iPhone, from NYU's Jeff Han, in various prototype devices, and from Microsoft's own, recently demoed DigiDesk, is envisioned as a multi-purpose interface that can be employed in anything from a DAP (Zune) to a cellphone (WinMo handsets) to a gaming console (Xbox) -- so it's no coincidence that the project is being developed by the same Mobile and Entertainment division that's also in charge of these categories. All in all, tomorrow promises to be a big day: not only is Palm making a potentially breakthrough announcement, with the faces of Apple and Microsoft scheduled to appear together on the same stage, but we may also get to witness computing history to boot; man, if we had real jobs, we'd be calling in sick in a heartbeat. [Via Scobleizer]
Evan Blass, 05.29.2007