When you hear someone else speak, specific neurons in your brain fire. Brian Pasley and his colleagues at the University of California, Berkeley discovered this, and not only that, but those neurons all appeared to be tuned to specific sound frequencies. So Pasley had a thought: "If you're reading text in a newspaper or a book, you hear a voice in your own head" — so why can't we decode that internal voice simply by monitoring brain activity? It's similar to the idea that led to the creation of BrainPort, which lets you "see" with your tongue. Your eyes, ears or vocal cords don't really do the heavy lifting; it's your brain. And if you can give the brain another source of input or output, you might be able to train it to approximate a lost ability like speech.
Building the thought decoder began with developing an algorithm tailored to each individual subject. The participant was asked to read a passage, for instance John F. Kennedy's inaugural address, aloud to get a base reading. Then they were asked to read it to themselves, and finally to just sit and do nothing. That allowed the team to isolate which neurons fired when the subject vocalized the text. A visual representation of the sound waves was then created, and those sounds were matched with particular brain patterns. Once trained, the decoder could reconstruct the words while participants read silently to themselves, based purely on which neurons were firing.
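To make the idea concrete, here is a minimal sketch of that kind of decoding pipeline. Everything here is a simplified stand-in, not the Berkeley team's actual method: the "neural" data is simulated as noisy, frequency-tuned responses to a spectrogram, and the decoder is plain ridge regression mapping electrode activity back to sound frequencies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 32 electrodes, 16 sound-frequency bins, 500 time steps.
n_electrodes, n_freq_bins, n_samples = 32, 16, 500

# Simulated spectrogram of overt speech (time x frequency bins).
spectrogram = rng.random((n_samples, n_freq_bins))

# Simulated neural activity: each electrode responds (noisily, linearly)
# to a random mix of frequency bands -- a toy stand-in for the
# frequency tuning observed in real auditory neurons.
tuning = rng.normal(size=(n_freq_bins, n_electrodes))
neural = spectrogram @ tuning + 0.1 * rng.normal(size=(n_samples, n_electrodes))

# "Training" on the overt-speech baseline: fit a linear decoder from
# neural activity back to the spectrogram via ridge regression.
lam = 1e-2
W = np.linalg.solve(
    neural.T @ neural + lam * np.eye(n_electrodes),
    neural.T @ spectrogram,
)

# "Decoding": reconstruct the spectrogram from neural activity alone.
reconstruction = neural @ W

# How well does the reconstruction match the original sound?
corr = np.corrcoef(spectrogram.ravel(), reconstruction.ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")
```

The real problem is far harder — imagined speech produces weaker, noisier signals than overt speech, and real decoders must handle nonlinearity and timing — but the core logic is the same: learn the mapping from brain activity to sound while the subject speaks aloud, then apply it when they only imagine speaking.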
Of course, the technology is far from perfect. While the results were described as "significant," a reliable device that can translate thoughts into words is a long way off. But the team from Berkeley is optimistic that one day it will be able to give the gift of speech to someone who is paralyzed or "locked-in."
[Image credit: Science Photo Library - SCIEPRO/Getty Images]