The goal of the NESD program is to develop "an implantable system able to provide precision communication between the brain and the digital world," according to a DARPA release. Also known as "wetware," these brain-computer interfaces would effectively convert the brain's chemical and electrical signals into machine-readable data, and vice versa. Ultimately, the program's operators hope that neural interfaces will be able to communicate with up to 1 million neurons in parallel (still a far cry from the 86 billion that our brains use in total).
Phase I of the program will center on developing the basic hardware and software needed to actually interface with the brain and should take about a year. Phase II will refine and miniaturize that technology, as well as begin basic studies ahead of seeking FDA approval.
The team from Columbia is focusing on the visual cortex and is looking to develop "a non-penetrating bioelectric interface" that could eventually enable computers to see what we see -- or potentially allow human brains to tap directly into video feeds.
The team from the Seeing and Hearing Foundation is also focusing on the visual cortex. They're working on a camera-based, external artificial retina worn over the eyes like Geordi LaForge's visor that would effectively "see" for the blind. Similarly, the team from the JBP lab is developing "modified neurons capable of bioluminescence and responsive to optogenetic stimulation" that would communicate with "an all-optical prosthesis for the visual cortex." Basically, again, a giant artificial eye that plugs directly into your brain's vision center. Finally, you'll actually have eyes in the back of your head.
The Paradromics team is taking a slightly different tack. They're developing a Strange Days-style neural interface that will use "arrays of penetrating microwire electrodes" to record and stimulate neurons. Eventually, the team wants to develop an implantable device that can help stroke victims relearn how to speak.
UC Berkeley's plan is especially wild. They want to build a "light field" microscope that can modulate as many as a million neurons simultaneously. With it, they want to determine the firing patterns of specific neuron groupings in response to external stimuli, which, in turn, will be used to "elicit sensory percepts in the visual or somatosensory cortices." Basically, they'll use these firing patterns to restore vision in the blind and give prosthetic limb users back their sense of touch.
Image: Brown University
For its part, Brown University is building what it calls a "cortical intranet." Instead of a microwire array like Paradromics', Brown's plan is to scatter as many as 100,000 salt-grain-sized "neurograins" that will be able to both record and stimulate neurons on a one-to-one basis.
"What we're developing is essentially a micro-scale wireless network in the brain enabling us to communicate directly with neurons on a scale that hasn't previously been possible," Arto Nurmikko, a professor of engineering at Brown, said in a statement. "The understanding of the brain we can get from such a system will hopefully lead to new therapeutic strategies involving neural stimulation of the brain, which we can implement with this new neurotechnology."