brain control

Latest

  • Facebook is working on its own operating system

    by Christine Fisher
    12.19.2019

    In an attempt to free itself from other tech giants like Google, Facebook is developing its own operating system (OS), The Information reports. In the future, Facebook's hardware products, like Oculus and Portal devices, could run on the OS, Facebook exec Ficus Kirkpatrick said.

  • New system lets you type with your brain using MRIs

    by Terrence O'Brien
    06.29.2012

    This isn't mind reading, per se. Instead, Bettina Sorger, Joel Reithler, Brigitte Dahmen and Rainer Goebel at Universiteit Maastricht have figured out a way to monitor the flow of blood in the brain and associate the images captured using an MRI with the letters of the alphabet. The whole system takes about an hour to learn and configure for each individual. Trials focused on healthy individuals, but clearly it's the paralyzed and people suffering from diseases like ALS that have the most to gain. Sorger hopes to enable "locked-in" patients to finally communicate with the outside world by thinking out one letter at a time. Obviously, patients aren't going to be able to install an MRI in their homes, much less lug one around with them. The data collected could be used to finely tailor less accurate but more portable systems for patients that monitor electrical or light signals. If you're interested in the real nitty-gritty, you can check out the complete research paper at the source link.
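
    For the curious, here's a rough Python sketch of the general idea -- the published approach reportedly encodes each character by which mental task the subject performs, when they start it, and how long they hold it, but the task names, timings and matching rule below are illustrative stand-ins, not the Maastricht group's actual protocol.

    ```python
    # Toy illustration (not the actual protocol): encode each character as a
    # combination of mental task, onset delay and duration, then decode a noisy
    # observation by nearest match over the codebook.
    import itertools
    import random

    TASKS = ["motor_imagery", "mental_calculation", "inner_speech"]  # hypothetical task set
    ONSETS = [0, 5, 10]        # seconds after the cue (illustrative values)
    DURATIONS = [10, 15, 20]   # seconds the task is sustained (illustrative values)

    # 3 tasks x 3 onsets x 3 durations = 27 codes: 26 letters plus a space.
    CODEBOOK = dict(zip("ABCDEFGHIJKLMNOPQRSTUVWXYZ ",
                        itertools.product(TASKS, ONSETS, DURATIONS)))

    def decode(task, onset, duration):
        """Return the character whose code best matches the measured parameters."""
        def mismatch(code):
            t, o, d = code
            return (t != task) * 100 + abs(o - onset) + abs(d - duration)
        return min(CODEBOOK, key=lambda letter: mismatch(CODEBOOK[letter]))

    # Simulate spelling "HI" with slightly noisy timing estimates from the scanner.
    for letter in "HI":
        task, onset, duration = CODEBOOK[letter]
        print(decode(task, onset + random.uniform(-2, 2), duration + random.uniform(-2, 2)))
    ```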

  • IBM says mind control next big thing in human-computer interaction; GLaDOS offers cake reward

    by Jason Hidalgo
    12.20.2011

    From the ongoing Google-Apple turf war on voice recognition to Microsoft's gesture tracking with Kinect, humans continue to push the envelope on how to interact with computers and devices. Now IBM says mind control will be the next field to see a big leap, predicting breakthroughs within the next five years. Keep in mind that they're not talking about controlling humans a la Gorilla Grodd ... yet. Instead, they're talking about controlling computer actions and devices via brain waves. IBM software guru and potential Borg recruit Kevin Brown (pictured right) has already been using a headset to move cubes on a computer screen at will. Given the ongoing progress with mind-controlled cars and BrainGate, IBM's prediction might not be too far-fetched.

  • Test subjects with electrode implants use mind control to move a cursor

    by Dana Wollman
    04.08.2011

    As trippy as mind-control still seems to us, we've already seen it implemented in everything from wheelchairs to pricey gaming (and car driving!) headsets. But the problem is that they measure brain activity outside the skull -- you know, the thing we've evolved to shield the murky goings-on in our minds from prying EEG sensors. Now, though, a team of Washington University researchers appears to have happened upon a more effective -- albeit invasive -- approach. The researchers got some brave specimens to move a mouse cursor by implanting plastic pads containing electrodes underneath their skulls, with the sensors sitting on the surface of the brain. That, they say, gives them access to more telling, high-frequency waves that say a lot more about cognitive intentions. In the end, the subjects moved the cursors by thinking of one of these sounds: "ee," "ah," "oo," and "eh." Brain-computer interfaces ain't new, of course, but the scientists say the subjects with electrode implants had more success than people wearing electrode-studded EEG caps, which could translate to less frustration for people with severe disabilities.
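
    A toy sketch of the decoding step, for the curious: classify a vector of high-frequency band-power features into one of the four imagined sounds and map each sound to a cursor direction. The feature values, templates and direction mapping below are invented for illustration; the researchers' actual classifier isn't described in this post.

    ```python
    # Illustrative only: nearest-centroid classification of a band-power feature
    # vector into an imagined sound, then a fixed sound-to-direction mapping.
    import math

    TEMPLATES = {               # hypothetical per-sound band-power centroids
        "ee": [0.9, 0.1, 0.2, 0.1],
        "ah": [0.1, 0.9, 0.1, 0.2],
        "oo": [0.2, 0.1, 0.9, 0.1],
        "eh": [0.1, 0.2, 0.1, 0.9],
    }
    DIRECTION = {"ee": (0, 1), "ah": (0, -1), "oo": (-1, 0), "eh": (1, 0)}  # (dx, dy)

    def classify(features):
        """Match a feature vector to the closest imagined-sound template."""
        return min(TEMPLATES, key=lambda sound: math.dist(TEMPLATES[sound], features))

    def step(cursor, features, gain=5):
        """Move the cursor one step in the direction tied to the decoded sound."""
        dx, dy = DIRECTION[classify(features)]
        return (cursor[0] + gain * dx, cursor[1] + gain * dy)

    print(step((100, 100), [0.85, 0.15, 0.25, 0.05]))  # closest to "ee" -> (100, 105)
    ```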

  • BrainGate hits 1,000 day mind-control milestone, nearly three years of pointing and clicking

    by Christopher Trout
    03.28.2011

    Aspiring Svengalis rejoice! For BrainGate has reached a significant landmark in computational thought-control -- the 4 x 4-mm implantable chip has given a woman with tetraplegia the ability to point and click with her brain for 1,000 days. An article recently published in the Journal of Neural Engineering said the woman, known simply as S3, performed two easy tasks every 24 hours, using her mind to manipulate a cursor with 90 percent accuracy. Each day she was monitored, S3 would post up in front of a computer and continuously command the thing with her thoughts for 10 minutes. Functionality reportedly deteriorated over time, but the paper points to the chip's durability, not sensor-brain incompatibility, as the culprit. Research is currently underway to incorporate BrainGate into advanced prosthetics that could get tetraplegics like S3 up and moving again. Now, how's that for the power of positive thinking?

  • Emotiv EEG headset hacked into VR trapeze act, lets you fly like Superman (video)

    by Sean Hollister
    03.01.2011

    Last year, Rensselaer Polytechnic Institute students built a virtual reality contraption that let them soar through the sky, held aloft by a trapeze harness and seeing through HMD-covered eyes. This year, they're controlling it with the power of their minds. For his master's thesis, project leader Yehuda Duenyas added an Emotiv headset -- the same one controlling cars and the occasional game -- to make the wearer seemingly able to levitate themselves into the air by carefully concentrating. Sure, by comparison it's a fairly simple trick, but the effect is nothing short of movie magic. See it after the break. [Thanks, Eric]

  • German researchers take mind-controlled car for a carefully-controlled spin

    by Donald Melanson
    02.19.2011

    Emotiv's mind-reading EPOC headset may not have changed the face of video games, but it looks like it's proven to be more than adequate for a team of German researchers, who've used it as the key component in their BrainDriver project. Yes, that's a mind-controlled car and, after a bit of training, it does appear to have performed reasonably well -- albeit with a slight delay that makes any real-world test a worse idea than it already was. Interestingly, this latest effort actually follows some previous attempts at a completely autonomous car by the same group of researchers at the Freie Universität Berlin, and they say that the two could eventually be combined at some point in the distant future -- for instance, in a taxi that's able to drive itself but also responds to the thoughts of its passengers. Head on past the break for the video.

  • Japan plans mind-reading robots and brain interface devices 'by 2020'

    by Vlad Savov
    04.23.2010

    Our grandparents did warn us that laziness would get us in trouble. The Japanese government and private sector are, according to the Nikkei, all set to begin work on a collaborative new project to develop thought-controlled gadgets, devices ... and robots. The aim is to produce brain-to-computer interfaces that would let you change channels or pump out texts just with your almighty brain power, while also facilitating artificial intelligence that would be capable of detecting when you're hungry, cold, or in need of assistance. Manufacturing giants Toyota, Honda and Hitachi get name-dropped as potential participants in this 10-year plan, though we wonder if any of them will have the sense to ask what happens when an ultra-precise and emotionless bot is given both intelligence and mind-reading powers. Would it really stick to dunking biscuits in our tea, or would it prefer something a little more exciting?

  • Mind-controlled wheelchair prototype is truly, insanely awesome

    by Laura June Dziuban
    05.04.2009

    We've seen brain-controlled wheelchairs in the past, but we've never seen them in action. This one, developed and built at the University of Zaragoza in Spain, combines an EEG cap worn on the head, a P300 neurophysiological protocol, and automated navigation. The user sees a real-time visualization of his surroundings on the screen in front of him, and then concentrates on the space he wants to navigate to. The EEG detects that selection, which is transmitted to the autonomous navigation system; the system then drives the chair to the desired location, avoiding any obstacles that might be in the way. Once the location has been chosen, the user can sit back and relax while the chair does all the work, making the system far less mentally exhausting than some previous iterations, which demand constant concentration on the target. Although there is no information about commercial availability of the wheelchair, it has been successfully tested by five different participants in a study. There's a video with a more detailed explanation of its impressive operation after the break.
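
    Here's a bare-bones Python sketch of that two-stage pipeline -- a P300-style selection stage that picks whichever flashed candidate produces the strongest averaged response, followed by a stand-in for the autonomous navigation stage. The candidate names, numbers and toy "planner" are illustrative, not the Zaragoza team's software.

    ```python
    # Simplified sketch: 1) pick the attended target by averaging EEG epochs
    # time-locked to each flashed candidate and taking the strongest response;
    # 2) hand the chosen target to an autonomous navigation stage.
    from statistics import mean

    def select_target(epochs_by_candidate):
        """epochs_by_candidate: {target_name: [epoch_amplitude, ...]} around ~300 ms."""
        averaged = {c: mean(amps) for c, amps in epochs_by_candidate.items()}
        return max(averaged, key=averaged.get)

    def plan_path(start, goal, obstacles):
        """Placeholder for the navigation stage: step toward the goal on a grid,
        refusing to enter obstacle cells (a real system would plan around them)."""
        path, pos = [start], start
        while pos != goal:
            step = tuple(p + (g > p) - (g < p) for p, g in zip(pos, goal))
            if step in obstacles:
                break  # a real planner would reroute; this toy version just stops
            pos = step
            path.append(pos)
        return path

    # The attended candidate ("doorway") shows the largest average deflection.
    epochs = {"doorway": [4.2, 3.8, 4.5], "desk": [1.1, 0.9, 1.3], "window": [1.0, 1.2, 0.8]}
    goal = {"doorway": (5, 0), "desk": (2, 3), "window": (0, 4)}[select_target(epochs)]
    print(plan_path((0, 0), goal, obstacles={(2, 3)}))
    ```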

  • Twitter-brain interface offers terrifying vision of the future

    by Nilay Patel
    04.20.2009

    We'll be honest, we're always on the lookout for faster and better ways to annoy our Twitter followers with hopelessly mundane status updates, and this brain-control interface from the University of Wisconsin's Adam Wilson seems to be the perfect way to get all Scoble on it with a minimum of effort -- you think it, you tweet it. Okay okay, we kid -- it's actually just the usual brainwave-control setup you've seen everywhere, and the average user can only do ten characters a minute, but think of the potential, people. Soon everyone will know that you are "Walking on sidewalk, LOL" almost the second you think it, and all it will take is a mindreading cap paired with a sophisticated computer running an advanced signal processing algorithm connected to the massive infrastructure of the internet via a multibillion-dollar mobile broadband network. That's progress. Video after the break. [Via Hack A Day]

  • Honda's ASIMO could be thought controlled in Spaceballs 2

    by Thomas Ricker
    03.31.2009

    Sorry, that's not actually Dark Helmet, it's a researcher demonstrating the latest Brain Machine Interface (BMI) cooked up for robotics. While it's not looking too portable, it's a far nimbler setup than the original MRI scanner first concocted by Honda to control robots in near real-time back in 2006. This time, Honda Research Institute, in coordination with Advanced Telecommunications Research (ATR) and Shimadzu Corporation, has achieved robotic thought control using a sensor cap to measure electrical potential on the scalp and cerebral blood flow. While we've seen much of this BMI tech applied to video games in the past, Honda claims its technology achieves the world's highest accuracy at 90% without special training. Impressive, even though it's clearly R&D work for now. Check the video after the break. [Via Akihabara News]
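
    For a sense of how combining two signal sources might work, here's an illustrative Python sketch that averages per-command probability estimates from an electrical (EEG) channel and a blood-flow channel and picks the winner. The command set, weights and numbers are made up; Honda hasn't detailed its algorithm here.

    ```python
    # Illustrative sensor fusion: weighted average of two probability estimates
    # over the same command set, then pick the most likely command.
    COMMANDS = ["right_hand", "left_hand", "tongue", "feet"]   # hypothetical command set

    def fuse(eeg_probs, flow_probs, eeg_weight=0.5):
        """Combine per-command probabilities from the two sensing channels."""
        combined = {
            c: eeg_weight * eeg_probs[c] + (1 - eeg_weight) * flow_probs[c]
            for c in COMMANDS
        }
        return max(combined, key=combined.get), combined

    eeg_probs  = {"right_hand": 0.55, "left_hand": 0.25, "tongue": 0.10, "feet": 0.10}
    flow_probs = {"right_hand": 0.60, "left_hand": 0.15, "tongue": 0.15, "feet": 0.10}
    decision, scores = fuse(eeg_probs, flow_probs)
    print(decision, scores[decision])   # -> right_hand 0.575
    ```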

  • NeuroSky and Square Enix set to demo mind-controlled gaming

    by Joseph L. Flatley
    10.07.2008

    The last time we saw NeuroSky's MindSet brainwave-controlled gaming headset, the company was partnering with Sega -- now the peripheral-maker has teamed with Square Enix to produce what we hope will be a "mind-blowing" (groan) demo at this year's Tokyo Game Show. If you'll recall, NeuroSky has been pushing its unique brand of mind-controlled gaming since way back in 2005, but it appears the technology has become increasingly attractive to notable game-makers as of late. On Thursday, the two companies plan a demonstration of the in-game powers of relaxation at TGS in what appears to be a perfect example of Neo-like godliness. The whole thing goes down October 9th, so be there or be Square. Enix.

  • OCZ's Neural Impulse Actuator gets reviewed, mice everywhere safe for now

    by Darren Murph
    07.21.2008

    We tried to take OCZ's Neural Impulse Actuator seriously, we really did. But unable to suppress those recurring images of Geordi La Forge, we simply couldn't help ourselves from having a laugh at this thing's expense. Nevertheless, the way-more-solemn dudes and dudettes over at HotHardware managed to give this brain-computer interface a fair shake, and overall, they came away pretty impressed. Still, the bottom line is this: "the NIA is a very unique input device and possibly the first true brain-computer interface to hit the retail market," but it's not "a replacement for traditional input methods." Granted, critics did point out that it would supplement current devices quite well, but only after "slogging through" hours upon hours of training. The hardcore among us may be willing to put in the time necessary to really get a lot out of this; for everyone else, just continue to point and laugh while masking your ignorance.

  • New brain control development could help quadriplegics get around

    by Darren Murph
    03.11.2008

    Sure, we've seen brain power used to give mobility back to the immobile, but a new development in Europe is one-upping current efforts by adding in a hint of artificial intelligence to the tried-and-true brain-computer interface. The MAIA BCI not only converts signals emitted by the brain into actions -- such as moving a wheelchair forward -- it also thinks for itself when needed in order to assist the user in getting where he / she wants to go. Essentially, the individual need only think about going left or forward (for example), and the machine itself will automatically detect obstacles and potential barriers in order to move more efficiently. As it stands, there's still quite a bit of testing to be done before MAIA-based wheelchairs would be available to the public, but researchers are already hoping to integrate said technology into artificial limbs and the like. [Via Physorg]
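
    Here's a minimal Python sketch of that shared-control idea: the user supplies a coarse intent, and the controller only overrides it when its sensors flag an obstacle. The direction set and fallback rules are a simplification for illustration, not the MAIA project's code.

    ```python
    # Toy shared control: follow the user's intent when safe, otherwise pick a
    # safe detour, otherwise stop.
    OBSTACLE_FREE_ALTERNATIVES = {
        "forward": ["left", "right"],
        "left": ["forward", "right"],
        "right": ["forward", "left"],
    }

    def shared_control(user_intent, blocked_directions):
        """Return the direction the chair should actually take this step."""
        if user_intent not in blocked_directions:
            return user_intent
        for alternative in OBSTACLE_FREE_ALTERNATIVES[user_intent]:
            if alternative not in blocked_directions:
                return alternative
        return "stop"   # nothing is safe; a real system would also brake smoothly

    print(shared_control("forward", blocked_directions={"forward"}))  # -> left
    print(shared_control("forward", blocked_directions=set()))        # -> forward
    ```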

  • OCZ set to launch Neural Impulse Actuator "brain mouse"

    by Donald Melanson
    03.03.2008

    It's certainly not the first to toy around with mind control as a means of fun and games, but OCZ looks like it may be among the first to actually get a product out the door, with it now set to launch its Neural Impulse Actuator "brain mouse," or NIA for short. According to Daily Tech, the device makes use of a combination of EEG readings, muscle movement, and eye movement to control a given application which, in this case, is mainly intended to be games. Needless to say, the contraption will take a little getting used to, but OCZ says that most users will get the hang of it "within hours" after a little practice, and that they'll eventually even be able to improve their reaction times compared to a standard mouse. You'll also not surprisingly need a fairly decent PC, as the NIA has been designed specifically for multi-core systems, and a good bit of cash to spare, with it set to run $300 when it launches sometime in the not-too-distant future (it's going into production next week).
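
    As a rough illustration of how blended inputs like these could be turned into game commands, here's a toy Python mapping from thresholded channel values to key presses -- the channel names, thresholds and bindings are pure guesswork, not OCZ's actual scheme.

    ```python
    # Illustrative mapping only: fire a (made-up) key binding whenever a
    # (made-up) signal channel crosses its threshold.
    BINDINGS = [
        # (channel, threshold, key)
        ("jaw_clench",  0.7, "SPACE"),   # e.g. jump
        ("eye_glance",  0.6, "A"),       # e.g. strafe
        ("alpha_power", 0.5, "W"),       # e.g. move forward when focused
    ]

    def poll(sample):
        """Turn one sample of {channel: value} into a list of key presses."""
        return [key for channel, threshold, key in BINDINGS
                if sample.get(channel, 0.0) > threshold]

    print(poll({"jaw_clench": 0.9, "eye_glance": 0.2, "alpha_power": 0.55}))  # -> ['SPACE', 'W']
    ```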

  • Sega Toys, NeuroSky team up for "mind-controlled" toys

    by Donald Melanson
    12.11.2007

    Details are still light on this one, but Sega Toys (makers of freaky robots) and the brain-reading folks at NeuroSky have announced that they've teamed up in an effort to develop what they're only describing as "mind-controlled tech toys," which they say will "take 'play' to the next level." Those unspecified toys will apparently make use of NeuroSky's ThinkGear bio-sensor technology which, according to the company, uses "dry active sensors" that eliminate the need for contact gels while also maintaining a small form factor. Given the vagueness of the announcement, however, we wouldn't expect the toys to be hitting store shelves anytime soon, but you can be sure that there'll be plenty of attempts to "repurpose" them whenever they do. [Via Gadget Lab]

  • U of W researchers show off brain-controlled humanoid bot

    by Paul Miller
    12.15.2006

    We were fairly certain we'd reached the apex of human-computer interfacing when that 14-year-old kid started blasting Space Invader baddies using only his mind, but we were oh so very wrong. Some University of Washington researchers have managed to jack a grad student into their humanoid robot and have him perform minor tasks. The big news is that this time, unlike that MRI-based Asimo control we saw earlier this year, the brain waves are being read by a mere cap with 32 electrodes on it, meaning the project uses much messier brain data to control the bot. Because of the type of brain readings they're getting, the bot is semi-autonomous, relying on human control for the decisions based on video camera feeds, but managing the actual mechanics of the motions on its own. Right now the bot can only manage to pick up simple shapes and move them to another location, but the eventual goal is a human-controlled robot that can function in human environments, learn from its surroundings and perform meaningful tasks for its human masters.
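
    A quick sketch of that "human decides, robot executes" split, with everything below (option names, scores, motion primitives) invented for illustration rather than taken from the UW software:

    ```python
    # Toy split of responsibilities: the decoded brain signal only picks which
    # object to fetch; canned robot-side primitives handle the mechanics.
    def decode_choice(eeg_scores):
        """Pick the on-screen option with the strongest decoded response."""
        return max(eeg_scores, key=eeg_scores.get)

    def execute_pick_and_place(obj, destination):
        """Robot-side primitives; in a real system these wrap motion planning."""
        return [f"walk_to({obj})", f"grasp({obj})",
                f"walk_to({destination})", f"release({obj})"]

    choice = decode_choice({"green_block": 0.81, "red_block": 0.42})
    print(execute_pick_and_place(choice, "table_2"))
    ```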

  • Hitachi creates brain-controlled model railroad

    by Donald Melanson
    11.18.2006

    Compared to other advances in the world of brain-controlled interfaces, Hitachi's latest development may seem a tad unimpressive, basically amounting to a thought-controlled switch. But connect that switch to a model train set and you've suddenly got something that's a heckuva lot more impressive -- at least at first glance. To get that train rolling, Hitachi uses optical topography to map changes in blood flow in areas of the brain associated with mental activity, translating those changes into voltage signals to flip the switch on and off. Of course, Hitachi eventually sees the tech allowing for a much greater degree of control, with one of the goals being to help paralyzed patients become more independent. They also seem to think they're on the fast track towards commercializing the technology, saying it could be available as soon as five years from now. [Via Pink Tentacle]
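
    To make the "thought-controlled switch" idea concrete, here's a toy Python version that compares a smoothed blood-flow reading against a baseline and toggles on and off with a bit of hysteresis -- the filter, margins and readings are illustrative, not Hitachi's implementation.

    ```python
    # Toy brain switch: low-pass filter a blood-flow proxy and flip the output
    # with hysteresis so small fluctuations don't make it chatter.
    def brain_switch(samples, baseline, on_margin=0.2, off_margin=0.05):
        state, states = False, []
        smoothed = baseline
        for x in samples:
            smoothed = 0.6 * smoothed + 0.4 * x          # simple low-pass filter
            if not state and smoothed > baseline + on_margin:
                state = True                              # sustained mental effort detected
            elif state and smoothed < baseline + off_margin:
                state = False                             # signal back near baseline
            states.append(state)
        return states

    # Blood-flow proxy rises while the subject does mental arithmetic, then relaxes;
    # the switch flips on during the effort and back off once the signal settles.
    readings = [1.0, 1.0, 1.3, 1.5, 1.6, 1.4, 1.1, 1.0, 1.0, 1.0, 1.0]
    print(brain_switch(readings, baseline=1.0))
    ```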