Public Access

Community storytelling.

This post was created by a member of the Public Access community. It has not been edited for accuracy or truthfulness and does not reflect the opinions of Engadget or its editors.


Touchless technology that blows your mind: Project Soli

Nasrullah Patel, @patelnasrullah
08.22.16

Smartwatches have been around for a few years now. Some of the biggest names in the business, including Samsung, Fitbit, and Apple, offer their own sets of distinctive features for that tiny device that sits on your wrist. Alongside the hardware, mobile applications control the functions of these wearables, and building apps that work well on such constrained devices remains a real challenge for mobile app developers.

One of the most striking concepts showcased during last month's Google I/O event was Project Soli, which works on the fundamentals of radar-based gesture sensing. Project Soli successfully stood out against all the other innovations proposed at the event.

Google's Project Soli aims to remove the need for physical contact when communicating with devices. To be clear, Soli is not a concept limited to smartwatches (the only form factor tested so far); it can be used in plenty of other devices as well.

One of the biggest shortcomings of the smartwatches on the market is how much they limit the wearer's navigation and control. UX designers have long known that interacting with the watch dial covers a significant portion of the display with your finger. The touchscreens are tiny and the interactive buttons sit too close together, which often results in wrong inputs. The ergonomics are such that the dial can offer only limited control, and these problems surface every time a user tries to interact with it, making the watch an arduous device to use.

But don't write off the smartwatch as an innovation of the past: at the recent Google I/O event, Google showed a more innovative smartwatch than ever before. The next generation of smartwatches is expected to incorporate Project Soli's chip. What makes Soli unique is its use of gesture-recognition technology, similar in spirit to systems like Microsoft Kinect or Leap Motion.

Project Soli is the result of perseverance by Google's ATAP (Advanced Technology and Projects) division, which presented it as one of its smartest technologies this year. Although Soli was initially revealed at Google I/O 2015, this year brought important design and application changes, which are discussed in detail here.

Ivan Poupyrev, the project lead at Google ATAP, points out that Soli's strength lies in letting users do almost anything with gentle finger gestures and movements. The core idea is to treat the hand as the ultimate input device.

Google's ATAP division played a significant role in making Project Soli possible. It was responsible for shipping developer kits to interested developers; the kits were meant for building object-recognition devices, wearables, and plenty of other projects, including gesture-controlled Bluetooth speakers.
Even though Project Soli is expected to revolutionize the smartwatch market, we can certainly expect it to make a significant contribution to wearable IoT as well. These development kits will enable developers to design their own applications for the gesture controller.

As for design changes, the redesigned chip cut power consumption by a significant amount: roughly 22 times less than before, sitting at just 0.054 watts. The initial iteration, shipped in the first development kit, drew 1.2 W.
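A quick back-of-the-envelope check shows the two figures quoted above are consistent with the claimed 22x reduction (numbers taken from this article, not an official spec sheet):

```python
# Power figures quoted for the Soli dev kits (per this article).
old_power_w = 1.2    # first development kit
new_power_w = 0.054  # redesigned chip

reduction = old_power_w / new_power_w
print(f"Power reduced by a factor of {reduction:.1f}")  # roughly 22x
```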

The previously built radar also consumed much of the available computational power. The latest release is 256 times more efficient, running at 18,000 frames per second. The deployed sensor can detect movement within an impressive range of 15 meters, and once you move close enough, you can start performing fine-grained controls.
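The distinction between long-range detection and close-range fine control suggests a simple distance-gated interaction model. Here is a minimal sketch of that idea; the 0.3 m fine-control threshold is an assumption for illustration only, while the 15 m figure is the detection range quoted above:

```python
def interaction_mode(distance_m, fine_range_m=0.3, max_range_m=15.0):
    """Illustrative sketch: coarse presence detection at long range,
    fine-grained gesture control only up close.

    fine_range_m is a hypothetical threshold, not a Soli spec."""
    if distance_m > max_range_m:
        return "out of range"
    if distance_m > fine_range_m:
        return "presence detection"
    return "fine-grained gestures"

print(interaction_mode(5.0))   # presence detection
print(interaction_mode(0.1))   # fine-grained gestures
```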

Radar has a few unique properties that match those of a camera. For example, it has very high positional accuracy and can sense even the tiniest of motions. Radar is commonly deployed to track cars, planes, other large objects, and even satellites; Project Soli's smartwatch uses the same principle to track micro-motions, including the smallest twitches of a human hand. Google brought this technology to a watch by joining hands with the South Korean company LG: ATAP's developers managed to shrink the chip enough to fit it inside a customized LG Watch Urbane.
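To get a feel for why radar can resolve such tiny motions, consider the Doppler shift of the reflected wave. Soli operates in the 60 GHz band, and the shift scales with target velocity; a sketch under that assumption:

```python
C = 3.0e8  # speed of light, m/s

def doppler_shift_hz(velocity_m_s, carrier_hz=60e9):
    """Frequency shift of a radar return from a target moving at
    velocity_m_s toward the sensor. 60 GHz is the band Soli uses."""
    return 2 * velocity_m_s * carrier_hz / C

# A slow 1 cm/s finger twitch shifts the 60 GHz return by ~4 Hz --
# tiny, but detectable with sufficient integration time.
print(doppler_shift_hz(0.01))  # 4.0
```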

Although the smartwatch is just a prototype, the gesture-enabled sensor is capable of working without a computer or any other interactive interface device.


The radar hardware also contains a functional innovation that can be used in devices such as speakers. Because the radar transmits a radio wave toward a target and its receiver intercepts the energy reflected from that target, the mechanism's usability scope is wide.

A full gesture pipeline is deployed inside the device, which is why multiple inputs can be decoded efficiently while interacting with any Soli-enabled device.
The interpretation pipeline has several stages designed to extract detailed information from a single detector at a high frame rate. When the device receives input from a user, vertex data is prepared by the OpenGL-enabled micro-radar processing and rendered through vertex processing, and it is from these range-Doppler signals that the device interprets human intent.
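The staged pipeline described above (raw frames, a range-Doppler transform, feature extraction, then intent classification) can be sketched as follows. All names and thresholds here are illustrative stand-ins, not Soli's actual API; a real pipeline would run FFTs and a trained classifier:

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    name: str
    confidence: float

def range_doppler(raw_frames):
    """Stage 1: reduce raw radar frames to per-frame energy values
    (a placeholder for a real range-Doppler FFT)."""
    return [sum(frame) for frame in raw_frames]

def extract_features(maps):
    """Stage 2: condense the maps into a compact feature vector."""
    return {"energy": sum(maps), "frames": len(maps)}

def classify(features):
    """Stage 3: map features to a gesture. The threshold is purely
    illustrative; Soli uses a trained model."""
    if features["energy"] > 1.0:
        return Gesture("tap", 0.9)
    return Gesture("none", 0.5)

gesture = classify(extract_features(range_doppler([[0.4, 0.5], [0.3, 0.2]])))
print(gesture.name)  # tap
```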

There is also actual haptic feedback when the hand performs particular actions: the hand can embody a virtual tool while acting on the device at the same time. The chip recognizes the feedback as a physical act and then guides the user toward smooth interaction with the device.

The best part about Google's Project Soli is that it pushes the processing power of the electronics outward to handle remote sensing tasks. The radar can work through materials and can also be embedded into objects.


Robustness is another strength: radar is reliable, it does not break, and it has no moving parts and no lenses. In layman's terms, it is nothing but a piece of sand on a board.

The implications of Project Soli include precise control of gadgets that usually have tiny touchscreens. Next year, ATAP plans to release another beta DevKit that lets designers implement Soli in various gadgets.
We can anticipate an official announcement of the public launch of Soli-driven smartwatches any time this year. With these partnerships, Google is putting in every effort to bring the innovation to the commercial market in the form of a usable consumer product.

But we all know the story of Google Glass and how it failed to live up to expectations. Glass did not work out as a consumer product for a variety of reasons, yet there is still much potential for wearable devices in the consumer market. The ATAP team has reportedly strengthened its engineering ranks and has started getting outside engineers intrigued to help the organization leap past some of the current limitations.


Although the Google ATAP team demonstrated the first genuinely working devices this year, Project Soli has yet to prove it can be adapted to other commercially marketable devices. Its future road will depend on the level of interest from hardware manufacturers in this radar-based gesture technology.

There is still a long way to go before Soli is on the market for sale. Nevertheless, considering the advances over previous versions, something can be expected fairly soon, and the beta dev kit scheduled for next year should help progress move at a relatively faster rate.