gesture-based

Latest

  • Myo replaces controllers with arm-controlled Oculus Rift gaming

    by 
    Thomas Schulenberg
    03.15.2014

    The Oculus Rift headset is doing its best to draw players into the virtual realities it hosts, but traditional or motion-assisted controllers are typically used to interact with these virtual worlds. Thalmic Labs has a different vision for player interaction, however: rather than tracking the position of a controller in 3D space, Thalmic's Myo armbands build virtual representations of a player's hands and forearms. According to Thalmic's FAQ page, Myo works by measuring the "electrical activity from your muscles to detect what gesture your hand is making." The armbands use a Bluetooth 4.0 connection to communicate with Windows, Mac, Android or iOS devices. Thalmic says Myo development kits will begin shipping in the "first half of 2014," with the consumer version following shortly after; both are priced at $149. Potential developers who don't want to wait around for Thalmic's wider distribution can apply to join the Thalmic Alpha Developers, a group that will gain access to pre-production Myo hardware. While a cool concept doesn't promise future developer support for the hardware, it's pretty neat to imagine casting in-game spells with nothing but hand gestures. Besides, whatever amount of gesture functionality the Myo achieves can't possibly dip below the bar that Steel Battalion: Heavy Armor set on the Kinect, right? [Image: Thalmic Labs]
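
    Thalmic's FAQ only describes the pipeline at a high level (muscle activity in, a recognized hand gesture out), but a minimal sketch of how a game client might map a stream of gesture labels to in-game actions could look like the Python below. The gesture names and the fake event source are illustrative assumptions, not Thalmic's actual SDK.

      # Illustrative sketch only: map hypothetical Myo gesture labels to game actions.
      # Gesture names and the stand-in event source are assumptions, not Thalmic's SDK.
      import random
      import time

      GESTURE_TO_ACTION = {
          "fist": "cast_spell",
          "wave_in": "previous_weapon",
          "wave_out": "next_weapon",
          "fingers_spread": "open_inventory",
      }

      def fake_gesture_stream(count=5):
          """Stand-in for the gesture events a Bluetooth-connected armband might emit."""
          for _ in range(count):
              time.sleep(0.1)
              yield random.choice(list(GESTURE_TO_ACTION))

      for gesture in fake_gesture_stream():
          action = GESTURE_TO_ACTION.get(gesture, "noop")
          print(f"gesture={gesture!r} -> action={action!r}")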

  • Daily Update for May 21, 2012

    by 
    Steve Sande
    05.21.2012

    It's the TUAW Daily Update, your source for Apple news in a convenient audio format. You'll get all the top Apple stories of the day in three to five minutes for a quick review of what's happening in the Apple world. You can listen to today's Apple stories by clicking the inline player (requires Flash) or the non-Flash link below. To subscribe to the podcast for daily listening through iTunes, click here. No Flash? Click here to listen. Subscribe via RSS

  • Leap suggests future of gesture-based computing

    by 
    Steve Sande
    05.21.2012

    When techies try to think of the future of gesture-based computing, they often discuss 2002's Minority Report, a sci-fi thriller starring Tom Cruise. In the flick, Cruise controlled a huge transparent display by moving his hands and arms like an orchestra conductor. Now San Francisco-based Leap is taking gesture control seriously, with a US$69.99 product that's expected to ship later this year. The Leap is a small aluminum and black plastic device that looks like it fell off an iMac. Plug the Leap into a USB port on your Mac, load special software (Leap Motion), and then wave your arm to calibrate the device. Now you have control of about 8 cubic feet of space, with each motion of your hands or fingers precisely tracked to within 1/100th of a millimeter. Leap is looking to developers to create software that truly takes advantage of the precise control provided by the device. As such, it has created a developer kit that's available to registered developers for free, including the SDK and a Leap. Examples of apps are shown in the video below, and it's apparent that the company has resolved one of the issues of moving gesture-based computing to the "big screen" -- getting rid of all of those fingerprints by making sure your fingers never touch the display. The company says it is working with "many of the world's largest companies," so there's hope that we'll see this technology built into future Apple products soon. [via CNET]
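
    The post only says the device reports hand and finger positions inside roughly 8 cubic feet of space at sub-millimeter precision; as a rough illustration (not the actual Leap SDK), an app consuming such 3D fingertip coordinates might map them to a 2D cursor along these lines. The interaction-volume bounds, units and FingerTip type are assumptions.

      # Minimal sketch: map a tracked 3D fingertip position to 2D screen coordinates.
      # The interaction-volume bounds and the sample point are illustrative assumptions,
      # not values or types from the actual Leap SDK.
      from dataclasses import dataclass

      SCREEN_W, SCREEN_H = 2560, 1440
      X_RANGE = (-300.0, 300.0)  # assumed left/right extent of the volume, in mm
      Y_RANGE = (50.0, 650.0)    # assumed height above the device, in mm

      @dataclass
      class FingerTip:
          x: float  # mm
          y: float  # mm
          z: float  # mm (depth; unused for a flat cursor)

      def to_screen(tip: FingerTip) -> tuple[int, int]:
          """Linearly map the tip's x/y inside the volume to pixel coordinates."""
          nx = min(max((tip.x - X_RANGE[0]) / (X_RANGE[1] - X_RANGE[0]), 0.0), 1.0)
          ny = min(max((tip.y - Y_RANGE[0]) / (Y_RANGE[1] - Y_RANGE[0]), 0.0), 1.0)
          # Raising your hand moves the cursor up the screen.
          return int(nx * (SCREEN_W - 1)), int((1.0 - ny) * (SCREEN_H - 1))

      print(to_screen(FingerTip(x=0.0, y=350.0, z=-20.0)))  # roughly screen centre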

  • Hands-on with wireless, ultrasonic stylus and touchless gestures at MWC (video)

    by 
    Zachary Lutz
    03.01.2012

    This isn't the first time you've heard of EPOS or XTR, but it's been quite some time since we've checked in with either of the outfits. So, imagine our surprise as we stumbled on new developments from each company as we perused the Texas Instruments booth at MWC. In the case of EPOS, we're shown a stylus that, in addition to offering traditional physical touch input, also allows users to interact with a device via ultrasound. The system is built upon TI's OMAP4 platform and requires that four microphones be placed at the corners of the screen. In this demonstration, we're shown how users can manipulate objects on a 3D plane via the Z-axis by pulling the pen away from the display. Next, we're shown a new application for the touchless gesturing system that XTR first debuted back in 2010. In this scenario, it's demonstrated how tablet owners could use the front-facing camera (at merely QVGA resolution) to flip through pages of a cookbook without worry of getting ingredients on the device. The concept software was developed by a French outfit known as Stonetrip, and also allows users to zoom and pan through the pages. You'll find demonstrations of each technology in a video after the break.
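
    The interesting detail here is the geometry: four microphones at the corners of the screen listen for the pen's ultrasound, and the pen's position falls out of the relative distances. A toy sketch of that idea, recovering a 2D pen position from corner distances by linearised least squares, follows; the screen dimensions, noise model and solver are assumptions for illustration, not EPOS's actual signal-processing pipeline.

      # Toy sketch: recover a 2D pen position from distances to four corner microphones
      # via linearised least squares. Geometry, units and noise are illustrative
      # assumptions, not EPOS's actual algorithm.
      import numpy as np

      # Microphone positions at the corners of an assumed 250 mm x 170 mm screen.
      MICS = np.array([[0.0, 0.0], [250.0, 0.0], [0.0, 170.0], [250.0, 170.0]])

      def trilaterate(distances: np.ndarray) -> np.ndarray:
          """Solve for (x, y) given one distance per microphone."""
          # Subtracting the first range equation from the others removes the
          # quadratic terms: 2 (p_i - p_0) . p = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2
          p0, d0 = MICS[0], distances[0]
          A = 2.0 * (MICS[1:] - p0)
          b = d0**2 - distances[1:]**2 + np.sum(MICS[1:]**2, axis=1) - np.sum(p0**2)
          pos, *_ = np.linalg.lstsq(A, b, rcond=None)
          return pos

      true_pen = np.array([120.0, 80.0])
      measured = np.linalg.norm(MICS - true_pen, axis=1) + np.random.normal(0, 0.1, 4)
      print(trilaterate(measured))  # close to [120, 80]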

  • Microsoft gesture patents reveal possible dual-screen tablet focus

    by 
    Joseph Volpe
    08.26.2011

    Earlier this year, we heard a slew of whispers promising a late winter launch for Microsoft's Windows 8 tablets -- there was even mention of a tab-specific OS. While it's become clear that Redmond intends for its new operating system to run on multiple devices, insight into its tablet plans is still somewhat shrouded in mystery. From the looks of these gesture patents, however, it's clear MS has been eyeing a dual-screen tablet future (hello Courier). The recently published patents, not yet granted to the company, touch upon methods for off-screen input (read: bezel-based), but mainly focus on these multi-screen input options: dual tap, pinch and expand, hold and page-flip, hold and tap, and finally, bookmark hold. As you can glimpse from the image above, the patents handle simultaneous touch screen input, allowing a user to swap images between screens, or even freeze a page on one side while continuing to browse on the other. By the looks of things, Sony might have some fierce competition in the multi-screen tablet market. Head to the source below to get a fuller look at these touch-based solutions.
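
    As a thought experiment (the patents themselves aren't quoted here), one of the listed gestures -- holding content on one screen while tapping the other -- could be dispatched with a tiny state machine like the sketch below. The event names, timing threshold and resulting action are assumptions, not Microsoft's claimed method.

      # Illustrative sketch of a "hold and tap" dual-screen gesture dispatcher.
      # Event names, threshold and actions are assumptions, not the patent's method.
      HOLD_THRESHOLD_S = 0.5

      class DualScreenGestures:
          def __init__(self):
              self.hold_started = {}  # screen id -> timestamp of touch-down

          def touch_down(self, screen: int, t: float) -> None:
              self.hold_started[screen] = t

          def touch_up(self, screen: int) -> None:
              self.hold_started.pop(screen, None)

          def tap(self, screen: int, t: float) -> str:
              """A quick tap on `screen`; act differently if the other screen is held."""
              other = 1 - screen
              started = self.hold_started.get(other)
              if started is not None and t - started >= HOLD_THRESHOLD_S:
                  return f"move held content from screen {other} to screen {screen}"
              return "plain tap"

      g = DualScreenGestures()
      g.touch_down(screen=0, t=0.0)  # finger holds an image on the left screen
      print(g.tap(screen=1, t=0.8))  # tap on the right screen -> content moves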

  • EyeSight brings its gesture controls to Android tablets, Windows-based devices

    by 
    Donald Melanson
    02.03.2011

    EyeSight has been bringing its hand-waving UI to all sorts of mobile devices for some time now, and it's now expanded things yet again. Following up its launch on Android last summer, the company has announced that its gesture recognition software has been tailored specifically for Android tablets and other "computer-based" Android platforms, and that it's now available for Windows-based devices as well. As before, the software is able to work with just about any built-in camera, and the company says that it has been "highly optimized" for mobile platforms, with low CPU and memory requirements. It's not something available directly to users, though -- it's up to developers to license it and include the functionality in their applications. Head on past the break for an idea of how it works -- just try to ignore that conspicuously out of place iPad at the beginning of the video.
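
    EyeSight's engine is licensed, not public, but the general recipe the post hints at -- recognizing hand waves from an ordinary camera with little CPU -- can be sketched with plain frame differencing: track the centroid of the pixels that changed and treat a large horizontal drift as a swipe. Everything below (thresholds, frame sizes, the synthetic frames) is an assumption for illustration.

      # Rough sketch of lightweight, camera-only swipe detection via frame differencing.
      # Thresholds and synthetic frames are invented; this is not EyeSight's engine.
      import numpy as np

      def motion_centroid(prev, curr, thresh=30):
          """Centroid (x, y) of pixels that changed between two grayscale frames."""
          diff = np.abs(curr.astype(int) - prev.astype(int)) > thresh
          if not diff.any():
              return None
          ys, xs = np.nonzero(diff)
          return xs.mean(), ys.mean()

      def detect_swipe(frames, min_dx=40):
          centroids = [c for a, b in zip(frames, frames[1:])
                       if (c := motion_centroid(a, b)) is not None]
          if len(centroids) < 2:
              return "none"
          dx = centroids[-1][0] - centroids[0][0]
          return "swipe right" if dx > min_dx else "swipe left" if dx < -min_dx else "none"

      # Synthetic QVGA frames with a bright blob moving left to right.
      frames = []
      for x in (40, 120, 200):
          f = np.zeros((240, 320), dtype=np.uint8)
          f[100:140, x:x + 40] = 255
          frames.append(f)
      print(detect_swipe(frames))  # "swipe right"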

  • TI and XTR team up on touchless gesturing system for mobile devices

    by 
    Donald Melanson
    09.15.2010

    We've seen a few examples of touchless, gesture-based interfaces for mobile devices, but it looks like Texas Instruments might be closer than most to making it a reality -- it's just announced a partnership with Extreme Reality (also known as XTR) on a new gesture engine and framework specifically designed for its OMAP 4 platform. The two companies actually showed off such a system back at MWC earlier this year (check out a demo of it after the break), but they've only just now made the partnership official, and they're promising plenty more advancements to come -- including the ability to recognize not only simple gestures, but also things like whole body movements and two-handed gestures. Head on past the break for the complete press release.

  • Multitouch DJ table lets you swipe to rock

    by 
    Donald Melanson
    08.09.2010

    We just recently got a glimpse of one possible future of DJing, but our world has now been turned upside down once again with this multitouch-enabled rig built by Gregory Kaufman. The big difference with this one, as you can probably guess, is that it employs a gesture-based interface that lets you spin the virtual turntables and use a variety of taps and finger swipes to replicate the main functions of a regular DJ deck. What's more, Kaufman says that the only gear a DJ would have to carry is a USB drive with their own music and settings, which they'd simply plug into the multitouch table at a club -- assuming the idea catches on, that is. To top things off, the system would also be able to accommodate regular DJ gear for some added flexibility, and even provide enough room for two DJs if you're looking to battle or share the stage. Head on past the break to check it out in action.

  • German student shows off camera-based input on an iPhone

    by 
    Donald Melanson
    03.12.2010

    Using a camera as an input device is hardly a new idea -- even on a mobile device -- but most examples so far have been aimed at enabling functionality not possible on a touchscreen. As Master's student Daniel Bierwirth has shown in the video after the break, however, a camera on a phone can also be used as an alternative input method for features like scrolling or zooming, potentially allowing for easier interaction on devices with smaller screens. Bierwirth also takes the idea one step further, and sees the system eventually including a second camera that's worn by a person, which would be able to detect when your hands are near the phone and allow for a range of other gestures. Check out his full report at the link below.
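
    The video isn't described in technical detail, so as a guess at the general approach (not Bierwirth's code), scrolling from the camera could be as simple as estimating how far the image shifted vertically between frames and converting that shift into a scroll delta:

      # Assumed approach, for illustration only: estimate vertical image shift between
      # consecutive camera frames and turn it into a scroll amount.
      import numpy as np

      def vertical_shift(prev, curr):
          """Estimate vertical shift by correlating the frames' row-intensity profiles."""
          a = prev.mean(axis=1) - prev.mean()
          b = curr.mean(axis=1) - curr.mean()
          corr = np.correlate(b, a, mode="full")
          return int(np.argmax(corr) - (len(a) - 1))

      def scroll_delta(prev, curr, pixels_per_step=4):
          return vertical_shift(prev, curr) // pixels_per_step

      # Synthetic frames: a bright horizontal band that moves down by 12 rows.
      prev = np.zeros((120, 160)); prev[40:50, :] = 1.0
      curr = np.zeros((120, 160)); curr[52:62, :] = 1.0
      print(scroll_delta(prev, curr))  # positive value -> scroll down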

  • Gesture-based television control developed

    by 
    Joshua Topolsky
    07.15.2007

    Seemingly taking a step backward for even the terminally lazy, a team of scientists has unveiled a gesture-based control system for your television that relies on nothing but its own set of hand signals. Similar to a previous concept developed by MIT, the system works by monitoring the "movements" of a slothful couch surfer, and then reacts to a set of seven hand motions such as clenching your fist ("start"), thumbs-up ("up"), and a sideways peace-sign ("channel"). The researchers say the software can also distinguish between actual "TV gestures" and the movement of pets or small children. In related news, a similar device is also in development which allows its user to control almost all features of a television and its associated equipment using a single thumb, although those involved in development are unsure they'll find a market for this "remote," as it were. [Via The Raw Feed]
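
    Only three of the seven gestures are named in the piece, but the control scheme boils down to a lookup from recognized hand pose to TV command, plus some filter for ignoring pets and kids. A minimal sketch of that mapping is below; the size-based filter and all thresholds are assumptions, not the researchers' method.

      # Minimal sketch of the described gesture-to-command mapping, with a crude
      # stand-in for the "ignore pets and small children" step. Thresholds and the
      # detection format are assumptions for illustration.
      TV_COMMANDS = {
          "fist": "start",
          "thumbs_up": "up",
          "sideways_peace": "channel",
      }

      MIN_HAND_AREA_PX = 1500  # assumed: smaller moving blobs are probably not deliberate

      def handle_detection(gesture: str, bbox_area_px: int) -> str:
          if bbox_area_px < MIN_HAND_AREA_PX:
              return "ignored (too small to be a deliberate TV gesture)"
          return TV_COMMANDS.get(gesture, "unrecognized gesture")

      print(handle_detection("fist", 2400))      # -> "start"
      print(handle_detection("thumbs_up", 900))  # -> ignored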

  • First gesture-based XBLA title still in the dark

    by 
    Ken Weeks
    05.10.2006

    I was pretty eager to get my hands, or rather my flapping arms, on Totem Balls, the first Xbox Live Arcade title to make use of the 360 camera for gesture-based gameplay. Totem Balls has a tropical island theme: you control a little totem pole-looking character by moving your arms up and down at your sides (a motion that resembles the funky chicken) as he collects balls that balance on his head. Unfortunately, it turns out the still-unpriced Xbox 360 camera has the same studio-like lighting requirements as the Sony EyeToy, making the game basically unplayable in the dim, romantic glow of the Xbox Live Arcade booth -- much to the chagrin of the Microsoft staff on hand. Despite keeping the gaming press in the dark until the emergency floodlights arrived, they claim it's a superior product.