LiDAR Combined with Voice Assistant for Novel Interface Device

Consider smart home assistants. While useful, their interfaces generally consist of responding to voice commands, sometimes supplemented by a small touchscreen. They normally have no awareness of the surrounding area, such as where you are standing or whether an object is sitting on the table next to them.

SurfaceSight provides smart assistants with increased contextual awareness. (📷: Gierad Laput)

SurfaceSight, a new project from Gierad Laput and Chris Harrison of Carnegie Mellon University's Future Interfaces Group, puts a (literal) new spin on things, placing a rotating LiDAR unit underneath an Amazon Echo. LiDAR, best known for its use in autonomous vehicles, detects when something is in front of the sensor, and the system groups its readings into contiguous objects. With this data, it uses machine learning to classify the type of item sensed, and it can even track hand movements and respond to gestures.
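The paper's exact pipeline isn't reproduced here, but the idea of grouping a rotating scan's readings into contiguous objects can be sketched with a simple gap-based clustering pass: convert each (angle, range) reading to a Cartesian point, then start a new cluster whenever the distance to the previous point exceeds a threshold. All names and the `gap` value below are illustrative assumptions, not SurfaceSight's actual code.

```python
import math

def scan_to_points(scan):
    """Convert (angle_deg, range_m) LiDAR readings to Cartesian (x, y) points."""
    return [(r * math.cos(math.radians(a)), r * math.sin(math.radians(a)))
            for a, r in scan]

def group_contiguous(points, gap=0.05):
    """Group consecutive scan points into clusters: a new cluster begins
    whenever the gap to the previous point exceeds `gap` meters."""
    clusters = []
    for p in points:
        if clusters and math.dist(clusters[-1][-1], p) <= gap:
            clusters[-1].append(p)
        else:
            clusters.append([p])
    return clusters

# Toy sweep seeing two separate objects on the surface
scan = [(0, 0.30), (1, 0.30), (2, 0.31),      # object A, ~30 cm away
        (40, 0.80), (41, 0.80), (42, 0.81)]   # object B, ~80 cm away
clusters = group_contiguous(scan_to_points(scan))
print(len(clusters))  # 2
```

A real system would also have to filter sensor noise and track clusters across successive rotations, but the same distance-gap intuition underlies most contiguous-segment extraction on scan data.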

SurfaceSight can also track people and estimate which way they are facing. (📷: Gierad Laput)

The video below shows off some truly impressive abilities: the system is capable of discriminating between different types of similarly sized objects. It can even roughly sense which direction a person is facing and modify its interactions in response. Possible applications for such tech include a smart wall or desk surface, or perhaps the ability to tell you where you left your keys or sunglasses!

https://medium.com/media/99efb05232f2a35abefa8cb7d673fc42/href
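To give a feel for how a clustered blob of points might be labeled as a particular object, here is a deliberately tiny nearest-prototype classifier over hand-picked geometric features. Everything here is hypothetical: SurfaceSight uses a learned machine-learning model, not this toy scheme, and the prototype values are made up for illustration.

```python
import math

def cluster_features(cluster):
    """Toy features for a point cluster: footprint extent (m) and point count.
    A real classifier would use many more features, learned from data."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    extent = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    return (extent, len(cluster))

def nearest_label(features, prototypes):
    """Assign the label whose prototype is closest in (unnormalized) feature space."""
    return min(prototypes, key=lambda lbl: math.dist(features, prototypes[lbl]))

# Hypothetical per-class prototypes: (extent_m, n_points)
prototypes = {"mug": (0.08, 6), "laptop": (0.30, 25)}

# A small cluster of points, roughly mug-sized
cluster = [(0.20, 0.00), (0.20, 0.02), (0.21, 0.04), (0.20, 0.06), (0.21, 0.07)]
print(nearest_label(cluster_features(cluster), prototypes))  # mug
```

Note the toy nature of this sketch: the two features are on different scales and unnormalized, which a practical classifier would correct for before measuring distances.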


LiDAR Combined with Voice Assistant for Novel Interface Device was originally published in Hackster Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

