Gesture control
Exclusive: PMD's CamBoard Pico XS is the tiniest gesture camera we've ever seen (video)
Just as we were wrapping up CES today, we caught up with our friends over at PMD Technologies, who surprised us with a little exclusive. What you see above is the new CamBoard Pico XS gesture camera, dwarfed by the Pico S -- the one we saw at Computex -- next to it. This tiny module is only 4mm thick, 39.5mm long and 15.5mm wide, making it 1.5mm thinner and almost half as long as its predecessor, while still packing the same Infineon IRS1010C 3D image sensor chip. Given the size, plus the fact that it already uses MIPI (Mobile Industry Processor Interface) instead of USB, the Pico XS is truly integration-ready for OEMs. The main changes that enabled this size reduction are the smaller lens -- compensated for by sharper laser illumination (still 850nm infrared) -- plus the removal of the aluminum heat sink (which doubles as the chassis), courtesy of much lower power consumption. Instead of the typical 1W you get on the Pico S, the Pico XS requires less than 50mW typically (at 25fps) and 350mW max (up to 45fps). Temperature-wise, it apparently rises by just 10 degrees Celsius at most. Despite the slightly reduced viewing angles, we've been told that this smaller depth camera offers the same performance as before. That certainly seemed to be the case when this author tried it using PMD's Nimble UX middleware (co-developed by 3Gear Systems), which can do two-hand skeletal tracking down to finger level, as shown in our video after the break.
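For the curious, PMD's sensors are time-of-flight devices: they measure the phase lag between the emitted infrared light and its reflection and convert that into distance. Here's a minimal Python sketch of that math, with an assumed modulation frequency purely for illustration (PMD hasn't quoted the Pico XS's figure here):

import math

C = 299_792_458.0   # speed of light, m/s
F_MOD = 30e6        # assumed modulation frequency in Hz; purely illustrative

def tof_depth(phase_shift_rad):
    # Continuous-wave ToF: the reflected signal lags the emitted one by
    # phi = 2 * pi * f_mod * (2 * d / c), so d = c * phi / (4 * pi * f_mod).
    return C * phase_shift_rad / (4.0 * math.pi * F_MOD)

# A phase lag of pi/4 at 30MHz works out to roughly 0.62 meters.
print(tof_depth(math.pi / 4))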
When Parrot AR.Drone meets Myo armband, magic ensues (video)
Ah, Las Vegas, the perfect location for a spontaneous wedding. Earlier today, we witnessed a quick and dirty collaboration between Parrot and Thalmic Labs at CES, where they paired an AR.Drone 2.0 with a Myo gesture-control armband. The demonstrator was able to control the drone's tilt direction using just one arm, as well as toggle the rotors by snapping his fingers. This author gave it a go as well and found this control method to be as effortless as it looked, though due to the WiFi interference on the show floor (ugh, so typical of large events these days), the drone had to stay in close proximity to the iPad that was relaying the Myo's commands. There wasn't a way to adjust the vehicle's height in that particular demo, but there's nothing stopping Thalmic Labs from assigning additional gestures for that -- maybe clenching a fist to ascend and spreading the hand to descend. Understandably, neither company could confirm whether they'll bring this feature to market, but we'd be very surprised if they didn't sustain their marriage moving forward.
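Neither company detailed how the demo stitches the two systems together, but conceptually it's just a thin mapping layer between gesture events and flight commands. Here's a rough, self-contained Python sketch of that idea -- the class, method and event names are entirely hypothetical and don't reflect the actual Myo or AR.Drone APIs:

class DemoDrone:
    # Stand-in for a drone client; real SDKs expose very different interfaces.
    def __init__(self):
        self.flying = False

    def toggle_rotors(self):
        self.flying = not self.flying
        print("takeoff" if self.flying else "land")

    def tilt(self, pitch, roll):
        print(f"tilt pitch={pitch:+.2f} roll={roll:+.2f}")

def on_gesture(drone, gesture, arm_pitch=0.0, arm_roll=0.0):
    # Map a recognized gesture (plus arm orientation, in radians) to a command.
    if gesture == "finger_snap":            # toggles the rotors, as in the demo
        drone.toggle_rotors()
    elif gesture == "arm_tilt" and drone.flying:
        drone.tilt(arm_pitch, arm_roll)     # steer by tilting the arm

drone = DemoDrone()
on_gesture(drone, "finger_snap")
on_gesture(drone, "arm_tilt", arm_pitch=0.10, arm_roll=-0.05)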
Thalmic Labs' Myo armband does gesture control with muscles (video)
2013 saw the rise of gesture cameras for TVs and various smart devices, but Canadian startup Thalmic Labs thinks its Myo armband is the way forward. During our meeting at CES earlier, co-founder and CEO Stephen Lake explained that his Bluetooth 4.0 device features a new type of biosensor, which can pick up minute electrical impulses in our arm muscles. This allows any wrist movement, finger twitch or fist clench to be interpreted as a gesture, so long as the inner side of the Myo has skin contact. There's also an accelerometer, a gyroscope and a magnetometer, so arm and body movements are accounted for as well. The idea for the Myo traces back to the co-founders' university days, when they explored various wearable technologies while working on a navigation aid for the blind. Lake said that since brain control isn't quite there yet, his team found muscle sensing to be the next best thing. From what we saw and tried today, Thalmic Labs seems to be on the right track: We watched co-founder Aaron Grant play Call Of Duty: Ghosts using just a pair of Myos, and he was able to make his avatar run, crouch, jump, fire his weapon and reload. Lake also gave a demo of music playback control and slideshow presentation on an iPad, both of which worked just fine. But it doesn't stop there; the CEO also sees opportunity in industrial robotics, space applications and even gesture-based authentication. The retail version of the Myo will arrive within the first half of 2014, and not only will it be half as thick as the Myo Alphas shown today, but it'll also feature at least two core applications that make full use of the armband. Lake said he'll be showing the final design in the next couple of months, but if you're game, you can now head over to Thalmic Labs' website to pre-order a black or white one for $149. Need more convincing? Then check out our in-depth demo video after the break. Update: We also got to see how you can fly a Parrot AR.Drone 2.0 with a Myo! Check it out.
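Thalmic hasn't published how the Myo turns those muscle signals into gestures, but a common approach in muscle-sensing research is to extract simple features (such as per-channel RMS energy) from the EMG channels and match them against calibrated templates. The toy Python sketch below illustrates that general idea only -- the channel count, labels and numbers are made up, and this is not Thalmic's algorithm:

import numpy as np

def rms_features(emg_window):
    # Per-channel RMS of a window of raw EMG samples, shape (samples, channels).
    return np.sqrt(np.mean(np.square(emg_window), axis=0))

def classify(emg_window, templates):
    # Nearest-centroid match: pick the label whose template is closest in feature space.
    feats = rms_features(emg_window)
    return min(templates, key=lambda label: np.linalg.norm(feats - templates[label]))

# Hypothetical calibration templates for an 8-channel armband.
templates = {
    "fist":   np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.4, 0.3]),
    "spread": np.array([0.2, 0.3, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]),
    "rest":   np.full(8, 0.05),
}
window = np.random.default_rng(0).normal(0.0, 0.1, size=(200, 8)) + 0.05
print(classify(window, templates))   # near-idle signal, so this prints "rest"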
OnTheGo Platforms is bringing gesture recognition to Google Glass apps (video)
Google Glass can hold its own when it comes to voice recognition and touch, but its current software doesn't account for gesture controls. OnTheGo Platforms, however, is looking to fix that. The folks at the Portland, Ore.-based company are cooking up an SDK that lets developers integrate gesture recognition into apps made for Glass and other Android-based smart glasses, such as the Vuzix M100. We went hands-on with a demo photo-snapping and gallery app to put the software through its paces. In its current form, the solution recognizes swipes from the left and right, a closed fist and an open hand. A fist aimed at Glass' camera will fire off a countdown for a snapshot or take you to the app's home, depending on the current screen. Waving a hand in either direction cycles through pictures in the gallery. This editor was tempted to swipe his hand across the camera's view quickly, but the software is tuned to pick up slower, more deliberate motions about a foot or so away. The detection was often hit-or-miss, but the developers say they're in the process of refining the recognition and have recently eliminated many false positives.
HP adds Leap Motion control to select PCs just in time for the holidays
HP's adding more Leap Motion to its PC arsenal. After refreshing its Envy 17 with inbuilt Leap Motion control, HP's now expanding that partnership to select products in its desktop and all-in-one lines. The 11 devices, which encompass HP's Envy Recline, Phoenix and TouchSmart series, in addition to its Pavilion TouchSmart line, will come packed with the Leap Motion keyboard. But that keyboard would be useless without the necessary Leap Motion software, which HP's made sure to pre-install on the devices, along with a few free apps to demo the gesture control and Airspace, Leap Motion's own app store. If a motion-controlled PC sounds like something you'd want to gift wrap and nestle under the tree, then be sure to check out HP's online shop for the full list of supported devices. Because nothing says, "Merry Christmas!" (or "Happy Holidays," or whatever) like expensive, gimmicky technology.
Leap Motion releases Free Form, an app that lets human hands sculpt digital clay (video)
When we reviewed the Leap Motion controller earlier this year, we found the application selection to be a bit lacking. Since then, the number of apps has doubled from 75 to around 150, and the Airspace store's newest addition is the coolest Leap app we've yet seen. It's called Free Form, and it's a 3D sculpting app (not unlike 3D Systems' Sculpt) built in-house at Leap Motion that lets you manipulate and shape digital objects using your fingertips. David Holz, company co-founder and the man who figured out the math behind Leap Motion's technology, gave us a demo of the app and talked a bit about why Leap built it. Additionally, he showed us new developer beta software that does 360-degree tracking, built to address some of the original Leap's shortcomings.
Google gesture patent would let Glass wearers 'heart' real-world objects
As it stands, Google Glass doesn't have a simple way of cataloging real-world items -- you have to snap a picture and make a note afterward. It may get much easier if Google implements a newly granted US patent, however. The technique uses a wearable display's camera to detect hand gestures made in front of objects. Make a heart shape and you'll "like" what's in front of you; frame something with your fingers and you'll select it. There's no guarantee that Glass will ever support these commands, but they're certainly intuitive. If nothing else, they could lead to a new, very literal take on Google Goggles.
ARM and eyeSight optimize gesture control for mobile processors
Hands-free gesture control is no longer a novelty in the mobile space, but the required processing power limits what's currently possible. More sophisticated input may be close at hand, however, as eyeSight has just teamed up with ARM to optimize its gesture control for chips using Mali-T600 series graphics. By relying on the T600's general-purpose computing engine, eyeSight's software can now track 3D movement, facial expressions and finger gestures without a huge performance hit. While companies will have to build supporting apps and devices before we see eyeSight's technology in use, it could lead to Kinect-like control of phones and smart TVs using relatively ordinary silicon.
Google applies for patent on gesture-based car controls
So far, in-car gestures aren't useful for much more than raging at the driver who just cut you off. Google wants these gestures to be more productive, however, and has applied for a patent that uses hand motion to control the car itself. Its proposed system relies on both a ceiling-mounted depth camera and a laser scanner to trigger actions based on an occupant's hand positions and movements. Swipe near the window and you'll roll it down; point to the radio and you'll turn the volume up. While there's no guarantee that we'll see the technology in a car, the USPTO is publishing the patent filing just a day after Google acquired a motion control company. If nothing else, the concept of a Google-powered, gesture-controlled car isn't as far-fetched as it used to be.
Toshiba's new dual camera module brings 'deep focus' imaging to smartphones
Remember when dual camera modules on smartphones were all the rage? Toshiba is bringing them back -- only this time with technology that you're much more likely to use. Its new module uses two 5-megapixel cameras to record depth and images at the same time, producing a "deep focus" picture where everything is sharp. The technique offers a Lytro-like ability to refocus even after you've taken the shot; it also enables gesture control and very fast digital autofocusing. You'll have to wait a while before you're snapping deep focus vacation photos, though. Toshiba doesn't expect to mass-produce the sensors until April, and finished products will likely come later.
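Toshiba isn't saying exactly how its module computes depth, but with two cameras a fixed distance apart, the standard stereo relation applies: depth = focal length x baseline / disparity. Once you have a depth value per pixel, refocusing after the fact is a matter of choosing which depth plane to keep sharp. A minimal Python sketch, with assumed numbers rather than Toshiba's calibration:

def stereo_depth(focal_px, baseline_m, disparity_px):
    # Pinhole stereo: Z = f * B / d, with the focal length in pixels,
    # the baseline in meters and the disparity in pixels.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. a 1,400px focal length and 10mm baseline: 20px of disparity puts a point 0.7m away.
print(stereo_depth(1400, 0.010, 20))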
Sony confirms PS4 gesture and voice control, HDMI capture for games (update)
Looking for more PlayStation 4 news? Here's a bit now that Sony's Tokyo Game Show keynote address is over. A couple of features we'd hoped to hear more about have been confirmed, the first being that the PS4 camera will support both voice and gesture control. It's no surprise that the system will take full advantage of the add-on's dual cameras and four mics for people who want to wave at their TVs, but it's nice to have that confirmed. Second, in a move that will mostly benefit game reviewers and YouTube video walkthrough experts (thanks for your help on those GTA V missions, all of you), the PlayStation 4 will allow unencrypted HDMI output for games. The PS3 didn't, treating games the same as Blu-ray movies, so anyone capturing video in HD needed to use component cables. That's on top of the console's "Share" button, which sends clips of gameplay straight to Ustream, Facebook or PSN. Check out our liveblog for everything else discussed tonight, including the PS4's mobile apps, indie gaming and Vita TV. Update: Sony Worldwide Studios head Shuhei Yoshida tells us via Twitter that HDMI capture on PS4 won't be available at the console's launch this November, but "in the future."
Hisense picks up Hillcrest Labs' gesture and motion control tech for TVs
Following LG and TCL, Hisense is now the latest TV manufacturer to adopt Hillcrest Labs' Freespace technology. Under the agreement, Hisense, the world's fifth-largest smart TV brand (as of Q1 2013, according to NPD DisplaySearch), will be able to add in-air pointing, gesture control and motion control -- all via a remote control -- to its future smart TVs and set-top boxes. This also means TCL now faces a fellow Chinese competitor with the same set of Freespace features. While there's no exact time frame just yet, we've been told that Hisense will sell these next-gen devices in the US and China later this year, so stay tuned.
SoftKinetic teases embedded 3D depth camera, coming to Intel devices next year (hands-on)
At Intel's Computex keynote earlier today, the chip maker teased that it expects embedded 3D depth cameras to arrive on devices in the second half of 2014. Luckily, we got an exclusive early taste of the technology shortly after the event, courtesy of SoftKinetic. This Belgian company not only licenses its close-range gesture tracking middleware to Intel, but it also manufactures time-of-flight 3D depth cameras -- including Creative's upcoming Senz3D -- in partnership with South Korea-based Namuga. Read on to see how we coped with this futuristic piece of kit, plus we have a video ready for your amusement.
Hands-on redux: Creative's Interactive Gesture Camera at IDF 2013 Beijing (video)
At IDF 2013 in Beijing, Intel is again making a big push for perceptual computing by way of voice recognition, gesture control, face recognition and more, and to complement its free SDK for these functions, Intel's been offering developers a Creative Interactive Gesture Camera for $149 on its website since November. For those who missed it last time, this time-of-flight depth camera is very much a smaller cousin of Microsoft's Kinect sensor, the main difference being that it's designed for closer proximity and can therefore also pick up the movement of each finger. We had a go with Creative's camera and some fun demos -- including a quick level of gesture-based Portal 2 made with Intel's SDK -- and found it to be surprisingly sensitive, though we have a feeling it would've been more fun had the camera been paired with a larger display. Intel said Creative will commercially launch this kit at some point in the second half of this year, and eventually the same technology may even be embedded in monitors or laptops (remember Toshiba's laptops with Cell-based gesture control?). Until then, you can entertain yourselves with our new hands-on video after the break.
Leap Motion Controller starts shipping May 13th, hits Best Buy on the 19th
If you were still thinking that the Leap Motion Controller was going to turn into vaporware, it looks like you were wrong. Just under a year after the company first made waves with its tiny gesture-recognizing box, a finished product is getting ready to ship. Those who pre-ordered should receive their shipping notices starting May 13th. If you weren't willing to commit to the device beforehand, you'll still be able to snatch one up at Best Buy on May 19th for $80. Or, if you're so inclined, you can continue to use your mouse to play Cut the Rope on your desktop... your choice, we suppose.
Leap Motion goes retail: motion controller to be sold exclusively at Best Buy
Ever since we first saw Leap Motion's hyper-accurate gesture control system in person, we've been waiting for the day we could walk into a store and buy one. Sure, devs have been able to buy Leap controllers for some time, and it won't be long before Leap's tech is baked into retail laptops, but now the general public's going to get the chance to grab the standalone controller, too. That's right, folks: this spring, the Leap Motion Controller will be available nationwide at any Best Buy store, with pre-orders starting in February. So it won't be long before you can stroll on down to the nearest big blue box and pick one up -- assuming there's still one within strolling (or driving) distance.
Hyundai unveils HCD-14 Genesis concept: suicide doors, gesture and eye controls
At NAIAS 2013, Hyundai gave an indication of where its "premium vehicles" are headed with its HCD-14 Genesis concept. Sporting sharp-edged styling and suicide doors, the sedan gets even better inside, with a control layout that foregoes traditional knobs and buttons. According to Hyundai (it wasn't demonstrated), the concept includes eye tracking and 3D hand gesture recognition accurate enough to control navigation, infotainment, audio, HVAC and one's phone. The RWD vehicle packs a 5.0-liter Hyundai Tau V8 engine under the hood, along with optical recognition that verifies the driver before starting. Hyundai stated that there are two vehicles on the way following this concept's design, with the second including even more of its advanced tech. Check out the full list in the press release after the break, as well as a good look at the car in our gallery.
A look around Haier's CES 2013 booth: HaiPads, plenty of panels and a wireless blender
Haier had a pretty formidable booth here at CES, so naturally, we had to swing by and cast our eyeballs over anything and everything there. A wall of TVs greeted us, which turned out to be the company's 2013 Roku-ready HDTVs and Android-packing smart models. Screens were everywhere, but there was also a table with some finger-friendly equipment like 9.7-, 7- and 5.3-inch HaiPads, as well as a Windows 8 laptop, a touchscreen all-in-one and a tab / laptop slider. The slider looked pretty nice, but all the aforementioned hardware was set up in Chinese, so we lost interest pretty quickly. A central hall booth wouldn't be the same without a 4K TV, but not to worry, Haier had a couple on display -- unfortunately, glare from all the other screens dotted around kind of dampened their impact. What we were most interested in was all the prototype technology on show, but all the Haier reps were from the US sales department, so not a soul could talk about the demonstrations. The eye-controlled TV we saw at IFA last year was getting quite a lot of attention, while the mind-controlled set we've also seen before was almost certainly playing a looping video to give the illusion that something was happening. There were also several gesture-controlled models, but one wasn't working and the other was hosting a very basic Kinect-type game. A ping-pong game played with a "Sensory Remote" was also up on one TV, but it looked unresponsive and therefore unfun. A multi-view demo using dual 3D specs did what it was supposed to, and a glasses-free 3D TV prototype showed nice depth as long as you were 12+ feet away (the camera can't really replicate the effect, but there's a quick video of it below anyway). The booth also had a household section, which we thought was safe to ignore until a "wireless blender" caught the eye. "It's just a blender with a battery in it, surely?" this editor asked. "No, there's an inductive coil built into the underside of the counter," was the reply. Thus was our Haier experience at CES, and to revisit it through our lens, check out the gallery below. Kevin Wong contributed to this report.
ASUS partners up with Leap Motion, PCs with 3D motion control to debut in 2013
Leap Motion's been working hard to get its 3D gesture control technology into the hands of developers -- 12,000 thus far -- since it was first revealed back in May of 2012. Today marks a big step toward getting it into the hands of consumers, as the company has announced its first OEM partner, ASUS. The Taiwanese firm plans to put the technology into new high-end notebooks and premium All-in-One PCs packing Intel's Haswell silicon. As a quick refresher for those unfamiliar with Leap, its tech has a 150-degree field of view and tracks individual hands and all 10 fingers at 290 frames per second to provide ultra-precise motion controls. Has this news got you itchin' to ditch that old machine with its archaic touchpad or mouse interface in favor of a gesture-controlled ASUS? We can't tell you how much they'll cost, but ASUS promises the PCs will be available around the world later this year.
Hisense jumps into 4K TVs with the XT880, promises Android 4.0 and a sane size
The current crop of 4K TVs from LG and Sony is large enough that some of us would need to knock out a wall to get one inside. Enter a surprise early challenger from Hisense: its upcoming XT880 line's 50-, 58- and 65-inch sizes deliver that 3,840 x 2,160 picture at dimensions built for mere mortal living rooms. We're also promised a full-fledged, 3D-capable smart TV based around Android 4.0 with WiFi internet access, a remote with voice commands and a removable camera for gesture control or Skype chats. Hisense hasn't yet committed to launch details for the XT880 line beyond a presence on the CES show floor; however, it's safe to say that the smaller sizes will bring the price of Ultra HD down from the stratosphere.