Facebook Closer to Augmented Reality Glasses With Brain Implant That Decodes Dialogue From Neural Activity

The project is the first to decode question-and-answer speech from brain signals in real time

In 2017, Facebook’s Mark Chevillet gave himself two years to determine whether it was feasible to build a non-invasive technology that could read out 100 words per minute from brain activity.

It’s been two years, and the verdict is in: “The promise is there,” Chevillet told IEEE Spectrum. “We do think it will be possible.”

As research director of Facebook Reality Labs’ brain-computer interface program, Chevillet plans to push ahead with the project—and the company’s ultimate goal to develop augmented reality (AR) glasses that can be controlled without having to speak aloud.

Chevillet’s optimism is fueled in large part by a first in the field of brain-computer interfaces, published this morning in the journal Nature Communications: a team at the University of California, San Francisco, funded by Facebook Reality Labs, has built a brain-computer interface that accurately decodes dialogue (words and phrases both heard and spoken by the person wearing the device) from brain signals in real time.
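The article does not detail the decoding method, but the general recipe in this line of work is to classify short windows of neural features (such as high-gamma band power from cortical electrodes) into a small vocabulary, and to use the question the participant just heard as a prior over likely answers. The sketch below is a hypothetical illustration of that idea on simulated data; the electrode counts, vocabulary, and prior are invented for demonstration and are not taken from the UCSF study.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical setup: 64 electrodes x 10 time bins of high-gamma power
# per utterance, flattened into one feature vector.
N_CHANNELS, N_BINS = 64, 10
ANSWERS = ["yes", "no", "cold", "hot", "bright", "dark"]

def simulate_utterance(label):
    """Simulated neural features: noise plus a label-dependent offset."""
    x = rng.normal(size=N_CHANNELS * N_BINS)
    x[label::len(ANSWERS)] += 2.0  # crude class-dependent signal
    return x

# Training data (in a real system: recordings made while the participant
# hears questions and speaks answers aloud).
X = np.array([simulate_utterance(i % len(ANSWERS)) for i in range(300)])
y = np.array([i % len(ANSWERS) for i in range(300)])
clf = LogisticRegression(max_iter=1000).fit(X, y)

# "Real-time" decoding: classify each incoming window, then combine the
# classifier's probabilities with a prior over answers implied by the
# detected question (here, a yes/no question favors "yes"/"no").
question_prior = np.array([0.4, 0.4, 0.05, 0.05, 0.05, 0.05])
window = simulate_utterance(0)
posterior = clf.predict_proba([window])[0] * question_prior
posterior /= posterior.sum()
print("decoded answer:", ANSWERS[int(posterior.argmax())])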
