Project Soli by Google is set to become a new benchmark in sensing technology. It is built on the concept of Virtual Tools, which imitate familiar hand gestures. It is like imagining an invisible button between the thumb and index finger, then making different hand movements to perform different tasks.

What can be expected from Project Soli?

The Soli chip packs the entire sensor, including its antenna array, into an 8 mm x 10 mm package.

It is still in the making, as a great deal of research and engineering is required to perfect it, but once it is ready it will use miniature radar to track a user's hand gestures at a distance and perform actions accordingly. The sensor is designed to track sub-millimeter motion at high speed without sacrificing accuracy. Alongside this, a Ubiquitous Gesture Interaction Language is being designed, through which a set of universal gestures will be defined to operate Soli.
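To make the idea of a universal gesture vocabulary concrete, here is a minimal sketch of how recognized gestures might be dispatched to device actions. The gesture names and actions below are hypothetical, chosen only to illustrate the Virtual Tools concept; they are not Soli's actual vocabulary or API.

```python
# Hypothetical mapping from a universal gesture vocabulary to device
# actions. None of these names come from Project Soli itself.
GESTURE_ACTIONS = {
    "button_tap":   "select",        # thumb taps index finger
    "dial_turn":    "adjust_value",  # thumb rubs along index finger
    "slider_swipe": "scroll",        # fingertips slide past each other
}

def dispatch(gesture: str) -> str:
    """Return the action for a recognized gesture, or 'ignore'."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

print(dispatch("button_tap"))  # select
print(dispatch("wave"))        # ignore (unrecognized gesture)
```

Because the gesture set would be universal, the same tiny lookup could drive any Soli-equipped device.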


How does it Work?

The Soli sensor works by emitting electromagnetic waves in a broad beam. When objects enter this beam, the waves scatter and a portion is reflected back to the radar antenna. The frequency shift, time delay and energy of the reflected waves carry information about the object's characteristics, such as its velocity, material, shape, distance, orientation, size and dynamics.
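The two basic radar relationships behind this are simple: the round-trip time delay gives distance, and the Doppler frequency shift gives radial velocity. A minimal sketch of both, assuming a 60 GHz carrier (the band that millimeter-wave radars of this size typically use; not necessarily Soli's exact parameters):

```python
SPEED_OF_LIGHT = 3.0e8   # metres per second
CARRIER_FREQ = 60.0e9    # hertz (assumed 60 GHz carrier)

def distance_from_time_delay(delay_s: float) -> float:
    """Round-trip time delay -> distance to the reflecting object.

    The wave travels out and back, so divide the total path by 2.
    """
    return SPEED_OF_LIGHT * delay_s / 2.0

def velocity_from_doppler(freq_shift_hz: float) -> float:
    """Doppler frequency shift -> radial velocity of the object.

    v = (f_shift * c) / (2 * f_carrier); positive means the object
    is moving toward the sensor.
    """
    return freq_shift_hz * SPEED_OF_LIGHT / (2.0 * CARRIER_FREQ)

# A hand 30 cm away produces a 2 ns round-trip delay:
delay = 2 * 0.30 / SPEED_OF_LIGHT
print(f"distance: {distance_from_time_delay(delay):.2f} m")  # 0.30 m

# A fingertip moving toward the sensor at 0.1 m/s shifts the
# reflected wave by 40 Hz at a 60 GHz carrier:
print(f"velocity: {velocity_from_doppler(40.0):.2f} m/s")    # 0.10 m/s
```

The 40 Hz figure shows why tiny finger motions are detectable: even sub-millimeter movements produce measurable frequency shifts at millimeter-wave carrier frequencies.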

Unlike conventional radar sensors, Soli does not require high spatial resolution or large bandwidth, because it is built on a novel radar sensing paradigm with specially designed algorithms, hardware and software.

Potential Applications of Soli

As Soli has no moving parts, consumes little energy and is tiny enough to fit on a single chip, it can be embedded in IoT devices, phones, computers and wearables. Imagine how convenient that will be: we will be able to operate our devices with simple hand gestures, with no physical controls to fumble with at all!