Sonar-enabled glasses read wearers' silently spoken voice commands

Published April 6, 2023


A New Atlas article about Cornell University's experimental eyewear, which can read wearers' silently spoken voice commands, mentions the EarCommand system developed in the lab of Zhanpeng Jin, associate professor in the Department of Computer Science and Engineering.

The EarCommand system reads silently spoken words via an earbud that detects distinctive ear canal deformations produced by specific mouth movements.

Read the story here.