This Strange Headset Lets You Interact with Digital Devices Simply By Reading Your Mind

Researchers at MIT have developed a headset that can read the words you're picturing in your mind so that you can interact with digital devices, virtual assistants—like Siri and Alexa—and other people, without actually speaking.

When you think about saying a word, your brain sends signals to your facial muscles to prepare them for the upcoming vocalization. The device works by reading these subvocalizations, also known as "silent speech."

Electrodes in the headset track these neuromuscular signals in the jaw and face. They are then deciphered by a machine-learning system—which has been taught to associate certain signals with certain words—and sent to a connected device as a set of instructions.
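MIT has not published the model behind AlterEgo, but the pipeline the researchers describe — signal in, word out, command to a device — can be sketched in miniature. In this toy illustration, the feature vectors, the three-word vocabulary, and the nearest-centroid classifier are all invented stand-ins for the real machine-learning system:

```python
# Toy sketch of the described pipeline:
# electrode readings -> feature vector -> classifier -> word -> device command.
# Everything here (features, vocabulary, centroids) is hypothetical.
from math import dist

# Pretend "trained" centroids: one averaged feature vector per word.
WORD_CENTROIDS = {
    "play":  [0.9, 0.1, 0.2],
    "pause": [0.1, 0.8, 0.3],
    "next":  [0.2, 0.3, 0.9],
}

# Each recognized word maps to an instruction for the connected device.
COMMANDS = {"play": "media.play()", "pause": "media.pause()", "next": "media.next()"}

def classify(features):
    """Pick the word whose centroid lies closest to the measured features."""
    return min(WORD_CENTROIDS, key=lambda w: dist(features, WORD_CENTROIDS[w]))

def to_command(features):
    """Translate a raw feature vector into a device instruction."""
    return COMMANDS[classify(features)]

print(to_command([0.85, 0.15, 0.25]))  # closest to "play" -> media.play()
```

The real system presumably uses a far richer signal representation and a neural network rather than nearest-centroid matching; the sketch only shows the shape of the mapping from signal to instruction.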

The MIT researchers presented a paper describing the device, known as AlterEgo, at the Association for Computing Machinery's Intelligent User Interfaces (IUI) 2018 conference in Japan.

The motivation for creating AlterEgo was to develop an "intelligence-augmentation device," according to Arnav Kapur, the project's lead researcher at the MIT Media Lab.

Arnav Kapur, a researcher at the MIT Media Lab, demonstrates the AlterEgo device. Lorrie Lejeune/MIT

"The goals of AlterEgo are to cognitively augment humans, change the way people communicate with one another, and enable a discreet gateway to digital information [services and applications] where the interaction is intrinsic rather than something extrinsic," Kapur told Newsweek.

"Our current interfaces are a barrier to effortless and private human-machine communication. People either have to shift their attention away from their surroundings to type, or they have to say their private messages out loud, in public," he said. "AlterEgo overcomes these barriers by allowing users to silently and seamlessly interface with a computer without the need for explicit actions. It enables a method of human-computer interaction without obstructing the user's usual perception."

The device also includes bone-conduction headphones that transmit sound through the skull, rather than through the ears, enabling the wearer to receive audio messages from a connected device, while still being able to hear the world around them.

Getting the device to understand speech involves teaching the machine-learning system to associate certain neuromuscular signals with certain words. So the team asked volunteers to wear the device while carrying out simple tasks that only required a limited vocabulary of around 20 words. For example, in one task, a user was asked to report chess moves by issuing subvocal commands.
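The training step described above — recording signals while volunteers subvocalize words from a known vocabulary, then learning an association between the two — can be sketched as follows. The feature vectors and the chess-move vocabulary are hypothetical, and "training" here is reduced to averaging the labeled examples for each word:

```python
# Hypothetical sketch of the training step: each recorded signal is labeled
# with the word being subvocalized, and the "model" is just the average
# (centroid) of the feature vectors seen for each word.
from collections import defaultdict

def train(labeled_samples):
    """labeled_samples: iterable of (word, feature_vector) pairs."""
    grouped = defaultdict(list)
    for word, features in labeled_samples:
        grouped[word].append(features)
    # Average each word's feature vectors column-by-column into a centroid.
    return {
        word: [sum(col) / len(col) for col in zip(*vectors)]
        for word, vectors in grouped.items()
    }

# e.g., signals captured while a volunteer silently reports chess moves
samples = [
    ("knight", [0.9, 0.1]), ("knight", [0.8, 0.2]),
    ("bishop", [0.1, 0.9]), ("bishop", [0.2, 0.8]),
]
centroids = train(samples)
print([round(v, 2) for v in centroids["knight"]])  # [0.85, 0.15]
```

A small, fixed vocabulary keeps the learning problem tractable, which is presumably why the team started with roughly 20-word tasks; scaling to full conversations would require far more data and a more expressive model.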

The prototype is still in its early stages, so its comprehension remains limited. But the more training the system undergoes, the more capable it will become, and the researchers hope it will soon be able to understand full conversations.

Once this is achieved, the device could potentially be used by people to communicate with each other without speaking aloud, using only subvocalized words. Aside from consumer products, this could make the technology useful for communication in high-noise environments or situations where silence is necessary. It could also enable people who are unable to speak, for various medical reasons, to communicate.

"The platform opens up a wide range of possibilities," Kapur added. "What is key is that the user does not have to disconnect from her surroundings to use computer services."

"This platform allows a human user to access the knowledge of the web in real-time as an extension of the user's self; a user could internally vocalize a Google query and get a resultant answer through bone conduction without any observable action at all. The system has implications for telecommunications, where people could communicate with the ease and bandwidth of vocal speech with the addition of the fidelity and privacy that silent speech provides."

Kapur also said an eventual goal of AlterEgo is to combine humans and artificial intelligence in some capacity.

"Could we imagine a future where humans and AI collaborate on tasks, and where artificial intelligence would help improve our own cognition through such an interface? We imagine a possible future scenario where doctors, for example, might internally and silently consult with a clinical decision-making AI agent through AlterEgo in order to improve the provision of medical care."