When you hear someone else speak, specific neurons in your brain fire. Brian Pasley and his colleagues at the University of California, Berkeley discovered this, and not only that: those neurons all appeared to be tuned to specific sound frequencies. So Pasley had a thought. "If you're reading text in a newspaper or a book, you hear a voice in your own head," he reasoned, so why can't we decode that internal voice simply by monitoring brain activity? It's similar to the idea that led to the creation of BrainPort, which lets you "see" with your tongue. Your eyes, ears and vocal cords don't really do the heavy lifting; your brain does. And if you can give the brain another source of input or output, you might be able to train it to approximate a lost ability like speech.