A revolutionary new technology is in the pipeline. Work is underway in Google’s labs to develop screens that can be manipulated by mid-air gestures. Gone are the days of typing errors, swiping mistakes, and screens stained by grubby fingers. The project that’s going to change it all? Project Soli.
Google, whose parent company, Alphabet, recently became the world’s most valuable company, is developing a radar-based technology that allows for the mid-air manipulation of ‘no-touch’ screens. “We want to break the tension between the ever-shrinking screen sizes used in wearables, as well as other digital devices, and our ability to interact with them,” a message on the company’s website said.
The project website provides a video that shows the various gestures being tested. Swiping, twisting, sliding, clicking, grabbing: the range of gestures is endless, as is the range of apps and technologies that could emerge as a result of its use. This could signal the end of remote controls, light switches, phones, dials and buttons, to name but a few.
“The reason why we’re able to interpret so much from this one radar signal is because of the full gesture recognition pipeline that we’ve built. The various stages of this pipeline are designed to extract specific gesture information from this one radar signal that we receive at a high frame rate,” said Project Soli’s lead research engineer, Jaime Lien.
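To give a flavour of what a staged pipeline like this might look like, here is a deliberately simplified sketch: radar frames go in, a gesture label comes out. Everything in it is invented for illustration — the per-frame “energy” values, the single motion feature, the threshold, and the gesture names bear no relation to Soli’s actual signal processing.

```python
# Illustrative toy pipeline only -- NOT Google's Soli pipeline.
# Frames are plain floats standing in for per-frame radar signal energy.

def extract_feature(frames):
    """Stage 1: reduce a sequence of frames to one motion feature,
    the average frame-to-frame change in signal energy."""
    deltas = [b - a for a, b in zip(frames, frames[1:])]
    return sum(deltas) / len(deltas)

def classify(feature, threshold=0.5):
    """Stage 2: map the motion feature to a coarse gesture label.
    The threshold and labels here are arbitrary placeholders."""
    if feature > threshold:
        return "swipe-away"
    if feature < -threshold:
        return "swipe-toward"
    return "idle"

def pipeline(frames):
    """Run the two stages end to end on a burst of frames."""
    return classify(extract_feature(frames))

# Rising energy across frames reads as motion away from the sensor:
print(pipeline([0.0, 1.0, 2.0, 3.0]))  # swipe-away
print(pipeline([3.0, 2.0, 1.0, 0.0]))  # swipe-toward
print(pipeline([1.0, 1.0, 1.0, 1.0]))  # idle
```

The real system presumably runs many more stages at a much higher frame rate, but the shape is the same: each stage distils the raw radar signal into progressively more gesture-specific information.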
While the release date of this technology is still unknown, Google is said to be inviting a small number of developers to try out the technology. Check out the video here.