MIT's Brainput reads your mind to make multi-tasking easier

With so much information readily available at our fingertips, a multitude of devices to access it from and increasing outside demands on our divided attention, it's easy to short-circuit on the productivity front. But there's a bright spot on the horizon, as emerging research out of MIT is poised to help offload the burden shouldered by our overtaxed grey matter with a much-needed and intuitive assist from human-robot systems. The Brainput project -- as the collaborative effort is known -- combines functional near-infrared spectroscopy (fNIRS) with an input system designed to read changes in a user's brain state and translate those signals into an adaptive multi-tasking interface.

Sounds like heady stuff, but if successfully implemented in high-stress environments like air traffic control, the low-cost, experimental tech could go a long way toward boosting individual performance and reducing overall stress levels. For now, the team still has a ways to go before the system, presently capable of interpreting three distinct mental states, could make its way into end-user applications. Curious for a more in-depth, jargony journey through the project's ins and outs? Then click on the source below for your daily dose of scientific head candy.
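To make the idea concrete, here's a minimal sketch of the Brainput concept: classify a user's workload from an fNIRS-style signal into one of three states, then adjust how much work the human-robot system routes to the human. Every name, threshold, and value below is an illustrative assumption, not MIT's actual implementation.

```python
# Hypothetical sketch of the Brainput idea. All thresholds, state names,
# and autonomy values are assumptions for illustration only.
from statistics import mean

def classify_state(oxygenation_samples):
    """Map a window of (simulated) fNIRS oxygenation readings
    to one of three workload states via simple thresholds."""
    level = mean(oxygenation_samples)
    if level < 0.3:
        return "low_workload"
    if level < 0.7:
        return "normal"
    return "multitasking_overload"

def adapt_autonomy(state):
    """Raise robot autonomy when the user is overloaded, so fewer
    decisions land on the human operator."""
    return {
        "low_workload": 0.2,           # human handles most tasks
        "normal": 0.5,                 # shared control
        "multitasking_overload": 0.9,  # robots mostly self-directed
    }[state]

window = [0.82, 0.78, 0.91, 0.85]  # simulated sensor window
state = classify_state(window)
print(state, adapt_autonomy(state))  # → multitasking_overload 0.9
```

In a real system the thresholding step would be replaced by a trained classifier over filtered fNIRS channels, but the control loop -- sense, classify, adapt -- is the same shape.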