
Scientists Decode Brain Waves to Understand What We Hear

According to researchers at UC Berkeley, hearing a sound and imagining that sound activate similar areas of the brain. "If you can understand the relationship well enough between the brain recordings and sound, you could either synthesize the actual sound a person is thinking, or just write out the words with a type of interface device," said Brian Pasley, a post-doctoral researcher at UC Berkeley.

The research may lead to technology that would let us listen to patients who are unable to speak because of stroke or paralysis. "If you could eventually reconstruct imagined conversations from brain activity, thousands of people could benefit," said Robert Knight, a UC Berkeley professor of psychology and neuroscience.

The findings are based on 15 people who underwent brain surgery to locate the source of intractable seizures via electrodes placed on the brain's surface, or cortex. In this case 256 electrodes were used, and the research group leveraged them to measure brain activity while each patient listened to 5 to 10 minutes of conversation. According to the scientists, "the brain breaks down sound into its component acoustic frequencies – for example, between a low of about 1 Hertz (cycle per second) to a high of about 8,000 Hertz – that are important for speech sounds."
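The decomposition the scientists describe, splitting a sound into its component acoustic frequencies over time, is what a spectrogram captures. The sketch below is purely illustrative (it is not the study's code, and the window and hop sizes are arbitrary choices): it computes a simple short-time Fourier transform of a synthetic tone and recovers the tone's dominant frequency.

```python
# Illustrative sketch (not the study's code): breaking a signal into its
# component acoustic frequencies with a short-time Fourier transform,
# yielding the kind of time-frequency representation (a spectrogram)
# described in the article.
import numpy as np

def spectrogram(signal, sample_rate, window_size=256, hop=128):
    """Return (freqs, power) where power[t, f] is the energy of
    frequency bin f in the window starting at sample t * hop."""
    window = np.hanning(window_size)
    frames = []
    for start in range(0, len(signal) - window_size + 1, hop):
        frame = signal[start:start + window_size] * window
        spectrum = np.fft.rfft(frame)          # frequency content of one slice
        frames.append(np.abs(spectrum) ** 2)   # power per frequency bin
    freqs = np.fft.rfftfreq(window_size, d=1.0 / sample_rate)
    return freqs, np.array(frames)

# A one-second 440 Hz test tone, sampled at 16 kHz (a speech-relevant band).
rate = 16000
t = np.arange(rate) / rate
tone = np.sin(2 * np.pi * 440 * t)

freqs, power = spectrogram(tone, rate)
peak = freqs[np.argmax(power.mean(axis=0))]
print(f"dominant frequency: {peak:.1f} Hz")
```

With a 256-sample window at 16 kHz, frequency bins are 62.5 Hz apart, so the recovered peak lands on the bin nearest 440 Hz.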

Computational models were used to match actual spoken words against patterns of brain activity, which enabled Pasley to predict the words the patients had heard. Pasley did not reveal the success rate of his predictions.
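The article does not say which computational models were used, but a common approach to this kind of stimulus reconstruction is a linear decoding model: regression weights that map multichannel recordings back to the stimulus features that produced them. The sketch below runs on synthetic data (the electrode counts, noise level, and ridge penalty are all illustrative assumptions, not the study's values) just to show the shape of the idea.

```python
# Hedged sketch on synthetic data (not the study's recordings or method):
# a linear stimulus-reconstruction decoder. Simulated electrode activity
# is a linear mix of stimulus features plus noise; ridge regression learns
# weights that map recordings back to those features.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_electrodes, n_features = 600, 32, 8

# Hypothetical stimulus features (e.g., power in 8 acoustic frequency
# bands over time) and simulated electrode recordings derived from them.
features = rng.standard_normal((n_samples, n_features))
mixing = rng.standard_normal((n_features, n_electrodes))
recordings = features @ mixing + 0.1 * rng.standard_normal((n_samples, n_electrodes))

# Fit ridge regression on the first 500 samples: W minimizes
# ||X @ W - Y||^2 + alpha * ||W||^2, solved in closed form.
X, Y = recordings[:500], features[:500]
alpha = 1.0
W = np.linalg.solve(X.T @ X + alpha * np.eye(n_electrodes), X.T @ Y)

# Reconstruct held-out features and score each band by correlation
# between the predicted and true feature time courses.
pred = recordings[500:] @ W
true = features[500:]
corr = [np.corrcoef(pred[:, i], true[:, i])[0, 1] for i in range(n_features)]
print(f"mean reconstruction correlation: {np.mean(corr):.2f}")
```

On held-out samples the decoder's predictions can be compared with the true features, which is the same logic as testing predicted words against the words a patient actually heard.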