We've been following Elliptic Labs' work on ultrasound gesture control for quite a while, but the company had never given a time frame until now. Ahead of CEATEC in Tokyo, Elliptic Labs announced that its input technology -- developed in partnership with Murata -- will arrive on phones in the first half of 2015. But that's not the only good news: On top of the usual swiping gestures for images, games and navigation (we saw some of this last year), there's now a new capability called "multi layer interaction," which uses your hand's proximity to the screen to toggle between different actions or layers. It's potentially useful for glancing at different types of messages on the lock screen, as demoed in the video below.
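Conceptually, the "multi layer interaction" described above boils down to mapping a measured hand distance to one of several UI layers. Here's a minimal sketch of that idea; the distance thresholds, layer names and function are our own illustrative assumptions, not Elliptic Labs' actual API:

```python
# Hypothetical sketch of proximity-based layer switching: map the hand's
# distance from the screen (as an ultrasound sensor might report it, in
# centimeters) to one of three lock-screen layers. Thresholds and layer
# names are made up for illustration.

LAYERS = ["notifications", "media controls", "clock"]

def layer_for_distance(distance_cm: float) -> str:
    """Pick a UI layer based on hand proximity: the closer the hand,
    the more detailed the layer shown."""
    if distance_cm < 5:
        return LAYERS[0]   # hand close to the screen
    elif distance_cm < 15:
        return LAYERS[1]   # mid-range hover
    return LAYERS[2]       # hand far away (or absent)

if __name__ == "__main__":
    for d in (2.0, 10.0, 30.0):
        print(f"{d:4.1f} cm -> {layer_for_distance(d)}")
```

In a real implementation the raw distance readings would presumably be smoothed and hysteresis added so the UI doesn't flicker between layers at a threshold boundary.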
Compared to its optical counterparts, this ultrasound solution is more convenient for everyday use, as it offers a 180-degree active area around the entire face of the device. Optical alternatives need your hand positioned in front of a camera or a dot sensor, which is easy to miss if you're not waving carefully; in their defence, though, laser-based gesture cameras capture more detail, which is useful for other applications like 3D scanning and precise point-and-click. At the end of the day, it's all about who can perfect the basic user experience, so stay tuned as we hit the show floor tomorrow to see whether this is as good as it claims to be.
Update: As promised, we got some hands-on time with Elliptic Labs' latest development kit at CEATEC. While we saw only one demo utilizing multi layer interaction, it worked just as shown in the company's promotional video: We toggled between three layers of the lock screen interface simply by moving our hand towards and away from the screen. Elliptic Labs CEO Laila Danielsen added that the technology could also be integrated into car dashboards, wearables and healthcare equipment, but for now her team is focusing on smartphones, tablets and laptops. As for which manufacturer will use it first, Danielsen said we'll have to wait and see.