For those who lose or lack the ability to speak, communication can be slow and painstaking. For example, towards the end of his life, famed cosmologist Stephen Hawking communicated solely through the movement of a single cheek muscle as a result of his motor neuron disease, amyotrophic lateral sclerosis (ALS). With the aim of finding a solution to this problem, a team at the University of California, San Francisco (UCSF) has coupled neural implants with artificial intelligence to translate brain activity into spoken sentences.

The team, led by Professor Edward Chang, used these implants to record the brain activity of five patients as they read simple sentences aloud. The implant consists of a small sheet of electrodes placed directly on the surface of the brain, where they record electrical activity. The team then used this recorded activity to predict the vocal instructions being sent to the lips, tongue, jaw and throat, combinations of which generate individual sounds. Finally, they used these predicted oral movements to create computer-generated speech (examples of the speech can be found here). Incredibly, they found that listeners were able to understand roughly 70% of the synthesized words when asked to choose from 25 options per word.
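The decoding thus happens in two stages: brain activity is first translated into estimated vocal tract movements, and those movements are then translated into sound. For readers curious what that looks like computationally, below is a minimal sketch in Python. It is a toy illustration only: the channel and feature counts are assumptions, random arrays stand in for real recordings, and simple linear maps trained by least squares replace the recurrent neural networks reported in the original article.

```python
import numpy as np

# Illustrative two-stage decoding pipeline, loosely mirroring the approach
# described above. All data here are random placeholders, not real recordings,
# and the linear maps are stand-ins for the study's neural networks.

rng = np.random.default_rng(0)

T = 1000             # time steps
n_electrodes = 256   # cortical recording channels (assumed)
n_kinematic = 33     # articulatory features: lips, tongue, jaw, throat (assumed)
n_acoustic = 32      # spectral features used to synthesize audio (assumed)

# Placeholder "recorded" training data.
neural = rng.standard_normal((T, n_electrodes))
kinematics = rng.standard_normal((T, n_kinematic))
acoustics = rng.standard_normal((T, n_acoustic))

# Stage 1: decode vocal tract movements from brain activity.
W1, *_ = np.linalg.lstsq(neural, kinematics, rcond=None)

# Stage 2: map decoded movements to acoustic features, which a
# vocoder would then turn into an audible waveform.
W2, *_ = np.linalg.lstsq(kinematics, acoustics, rcond=None)

def decode_speech_features(brain_activity: np.ndarray) -> np.ndarray:
    """Brain activity -> articulatory movements -> acoustic features."""
    movements = brain_activity @ W1
    return movements @ W2

synthesized = decode_speech_features(neural)
print(synthesized.shape)  # (1000, 32)
```

The key design choice, reflected even in this toy version, is the intermediate articulatory stage: rather than decoding sound directly from brain activity, the system decodes the movements the brain was commanding and generates sound from those.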

An important caveat is that this approach relies on being able to map brain activity to the execution of mouth movements. As such, it is unclear how feasible this method would be for individuals who have not only lost the ability to physically generate sound, but are also beyond the point of even thinking about speaking – meaning that this method is still not viable for those who need it most. Nevertheless, these findings serve as an exciting proof-of-principle for brain-machine interface technologies and brain-wave-based therapies for neurological disorders.

Managing Correspondent: Rory Maizels

Original article: Speech Synthesis from Neural Decoding of Spoken Sentences – Nature

Media coverage: New Device Translates Brain Activity into Speech. Here’s How. – National Geographic

Image Credit: UCSF
