Elliptic Labs

Latest

  • Elliptic Labs releases ultrasound gesturing SDK for Android, will soon integrate into smartphones

by Darren Murph
    10.01.2013

Elliptic Labs has already spruced up a number of tablets by adding the ability to gesture instead of making contact with a touchpanel, and starting this week, it'll bring a similar source of wizardry to Android. The 20-member team is demoing a prototype here at CEATEC in Japan, showcasing the benefits of its ultrasound gesturing technology over the conventional camera-based magic that already ships in smartphones far and wide. In a nutshell, you need one or two inexpensive (under $1 a pop) chips from Murata baked into the phone; from there, Elliptic Labs' software handles the rest. It allows users to gesture in various directions with multiple hands without having to keep their hands in front of the camera... or atop the phone at all, actually. (To be clear, that box around the phone is only there for the demo; consumer-friendly versions will have the hardware bolted right onto the PCB within.) The goal here is to make it easy for consumers to flip through slideshows and craft a new high score in Fruit Ninja without having to grease up their display. Company representatives told us that existing prototypes were already operating at sub-100ms latency, and for a bit of perspective, most touchscreens can only claim ~120ms response times. The company's hoping to get its tech integrated into future phones from the major Android players (you can bet that Samsung, LG, HTC and the whole lot have at least heard the pitch), and while it won't ever be added to existing phones, devs with games that could benefit from a newfangled kind of gesturing can look for an Android SDK to land in the very near future. Mat Smith contributed to this report.

  • Elliptic Labs demonstrates its touchless user interface for iPad (with video)

by Mike Schramm
    01.09.2011

As promised a while back, we got to chat with Elliptic Labs here at CES, and CEO Stian Aldrin walked us through the touchless gesture technology his 15-person Norway-based company is developing as a prototype. The whole thing is based on ultrasound, it turns out -- a small speaker kicks out frequencies higher than the ear can hear, and a set of microphones listens in on the reflections, using an algorithm to calculate where your hand is as you wave it through the air. The result is a gesture-based control system for touchscreen devices, but without the actual touch. Aldrin told us that the system is already in use in a Norwegian hospital, where surgeons control touchscreen tablets without having to take their sanitized gloves off during surgery. Currently, the system only allows for a few simple gestures (swiping up and down, or left and right), but that's just a limitation of the demo units Elliptic Labs has created. Potentially, such a system could not only recognize the placement and speed of your hand passing by (and indeed, one of the demos in the CES booth could monitor both proximity to the screen and speed, flipping on-screen content faster if you pushed your hand by faster), but it could also calculate multiple points of movement, doing things like multi-touch gestures in the air. You do have to be pretty close to the screen to operate the device -- rather than a big cone like a Kinect, the system monitors a sphere around itself, so you've got to have your hand enter that sphere for it to register. But Elliptic (which already plans to be back at CES with an even bigger booth next year) suggests that the system could be used for lots of things, from quick music controls to car controls, or anything else where you need to make a touch-style gesture without actually touching the screen. We've got exclusive video after the break of Aldrin demoing a dock version of the system, connected via Wi-Fi to an off-the-shelf iPad running a custom-made app.
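For the curious, the speaker-and-microphone trick Aldrin describes boils down to classic echolocation: emit an inaudible pulse, record the reflection, and turn the round-trip delay into a distance. Here's a minimal, self-contained Python sketch of that idea -- all signal shapes, sample rates, and function names are our own illustrative assumptions, not Elliptic Labs' actual algorithm.

```python
# Toy time-of-flight echolocation: a speaker emits a pulse, a microphone
# records the attenuated echo, and the echo delay gives the hand's
# distance. Purely illustrative -- not Elliptic Labs' real pipeline.

SPEED_OF_SOUND = 343.0   # m/s in room-temperature air
SAMPLE_RATE = 192_000    # Hz; assumed rate high enough for ultrasound

def make_pulse(n=64):
    """A short rectangular burst standing in for an ultrasonic chirp."""
    return [1.0] * n

def simulate_echo(pulse, delay_samples, total=4096, attenuation=0.3):
    """Fake microphone capture: silence, then a quieter copy of the pulse."""
    signal = [0.0] * total
    for i, sample in enumerate(pulse):
        signal[delay_samples + i] = sample * attenuation
    return signal

def find_delay(pulse, recording):
    """Cross-correlate pulse against recording; return the best-matching lag."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(recording) - len(pulse)):
        score = sum(p * recording[lag + i] for i, p in enumerate(pulse))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def hand_distance_m(delay_samples):
    """Convert a round-trip delay (in samples) to a one-way distance."""
    round_trip_s = delay_samples / SAMPLE_RATE
    return SPEED_OF_SOUND * round_trip_s / 2

pulse = make_pulse()
recording = simulate_echo(pulse, delay_samples=224)  # hand ~0.2 m away
delay = find_delay(pulse, recording)
print(round(hand_distance_m(delay), 3))  # → 0.2
```

A real implementation would run this against noisy multi-microphone input and triangulate the reflections to get a 3D position rather than a single distance, which is presumably where the company's secret sauce lives.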

  • Elliptic Labs to show off gesture-sensing iPad dock at CES 2011

by Mike Schramm
    12.22.2010

Elliptic Labs has been working on gesture-sensing technology for a while now (where you can just swipe your hand in the air instead of actually touching a screen), and rumor has it that the company will be showing off an iPad dock at next month's CES -- something that enables you to control Apple's magical and revolutionary device without actually touching it. The main use is apparently in the kitchen (where your hands might be messy from cooking, keeping you from wanting to grease up that screen like a pie plate), but I can see this functionality in a kiosk somewhere, or any system where you wouldn't want people actually laying hands on a device. There's a quick video after the break featuring one of the company's other devices, but presumably the same gestures would be used to control the iPad. It'll be interesting to see, too, just what kind of functionality the controller can offer. Swiping between screens wouldn't be hard, but I'd like to know if it offers any more granular control as well. Fortunately, TUAW will be live at CES, so we'll make sure to stop by Elliptic's booth and give it a try to let you know what it's like.

  • Elliptic Labs set to save your iPad from smudges with 3D gesture-sensing dock (video)

by Tim Stevens
    12.22.2010

The dream of kitchen computing still isn't here, with many chefs forced to read from archaic paper-based recipe lists or, worse yet, memorize the things. Maybe all we need is a way to interact with our gadgets without getting them all messy, and maybe Elliptic Labs can get us there. Finally. The company has been teasing us with its 3D gesture interface for years now and it looks set to finally show off an actual product, a motion-sensing iPad dock prototype making its debut at CES in a few weeks. The idea is that you perch this sucker in your kitchen and it gives you full control whether you're kneading sourdough or mixing meatballs, keeping your tablet streak-free -- and hygienic. That seems like a somewhat limited use case to us, but check out the video of an earlier prototype below and see if it doesn't make you want to bake some cookies. And, if it does, feel free to bring us some.

  • Elliptic Labs returns with more red hot touchless UI action

by Joseph L. Flatley
    09.08.2009

Elliptic Labs is back on the scene with another demo of its touchless UI. This time 'round the company's teamed up with Opera and presents us with a much more polished affair, not to mention a couple of technical details. According to CEO Stian Aldrin, the device is based on ultrasound, tracks the hand itself (no reflector or sensor necessary), has a range of one foot, and has been designed to be either embedded in any electronic device (including a cellphone) or to connect to devices via USB. The company's current demo shows the technology being used to flip through photos in an Opera widget. Sure, a couple of simple one-gesture commands aren't exactly "pulling out all the stops," as far as a proof-of-concept goes, but we're looking forward to seeing what this company comes up with in the future. Peep for yourself after the break.

  • Elliptic Labs shows off touchless interface for 3D navigation

by Darren Murph
    02.03.2008

We've seen (though not felt, for obvious reasons) a few touchless interfaces before, but the latest development coming from Elliptic Labs is a real treat. Dubbed a "touchless human / machine user interface for 3D navigation," the firm has somehow figured out how to allow mere mortals to manipulate on-screen images without requiring any sort of funky gloves to be worn or a microchip to be installed in your fingertip. Sadly, little is said about the actual technology behind the magic, but don't worry, the video waiting after the jump should provide plenty of satisfaction.

    [Via Technabob]