All of those fantastical possibilities promised by burgeoning brain-computer interface technology come with the unavoidable cost of needing potentially hackable hardware to ride shotgun in your skull. Given how often our personal data is already mishandled online, do we really want to trust the Tech Bros of Silicon Valley with our most personal of biometrics, our brainwaves? In her new book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology, Nita A. Farahany, Robinson O. Everett Professor of Law at Duke University, examines the legal, ethical, and moral threats that tomorrow's neurotechnologies could pose.
From The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology by Nita A. Farahany. Copyright © 2023 by the author and reprinted by permission of St. Martin’s Publishing Group.
“Passthoughts” as a Gateway to Brain Surveillance
Assume that Meta, Google, Microsoft, and other big tech companies soon have their way, and neural interface devices replace keyboards and mice. In that likely future, a large segment of the population will routinely wear neural devices like NextSense’s bio-sensing EEG earbuds, which are designed to be worn twenty-four hours a day. With wide-scale adoption of wearable neurotechnology, adding our brain activity to nationwide identification systems is a near-term reality.
One of the most extraordinary discoveries of modern neuroscience is the uniqueness of each person’s functional brain connectome (its physical wiring), especially in the brain areas devoted to thinking or remembering something. Because of this, algorithms can analyze our brain activity and extract features that are both unique to each person and stable over time. How your brain responds to a song or an image, for example, is highly dependent upon your prior experiences. The unique brain patterns you generate could therefore be used to authenticate your identity.
Nationwide identification systems vary by country but generally involve the assignment of unique identification numbers, which can be used for border checks, employment screenings, health-care delivery, or to interact with security systems. These ID numbers are stored in centralized government databases along with other significant personal data, including birth date and place, height, weight, eye color, address, and other information. Most identification systems have long included at least one piece of biometric data, the static photo used in passports and driver’s licenses. But governments are quickly moving toward more expansive biometric features that include the brain.
Biometric characteristics are special because they are highly distinctive and have little to no overlap between individuals. As the artificial intelligence algorithms powering biometric systems have become more powerful, they can identify unique features in the eyes and the face, or even in a person’s behavior. Brain-based biometric authentication has security advantages over other biometric data because it is concealed, dynamic, non-stationary, and incredibly complex.
The promise of greater security has led countries to invest heavily in biometric authentication. China has an extensive nationwide biometric database that includes DNA samples, and it also makes widespread use of facial recognition technology. Chinese authorities in the Xinjiang Uyghur Autonomous Region have conducted mass collections of biometric data from the Uyghur people and used it for targeted discrimination.
The United States has also massively expanded its collection of biometric data. A recent report by the US Government Accountability Office detailed at least eighteen different federal agencies that have some kind of facial recognition program in place. US Customs and Border Protection includes facial recognition as part of its pre-boarding screening process, and an executive order signed by President Trump in 2017 required the United States’ top twenty airports to implement biometric screening on incoming international passengers.
Increasingly, governments are investing in developing brain biometric measurements. The US Department of Defense recently funded SPARK Neuro, a New York–based company that has been working on a biometric system combining EEG brain wave data, changes in sweat gland activity, facial recognition, eye-tracking, and even functional near-infrared spectroscopy (fNIRS) brain imaging. fNIRS is a particularly promising (if expensive) technology for brain authentication: it is wearable, can monitor individuals over time, works indoors or outdoors while a person is moving or at rest, and can be used on infants and children. China has been funneling substantial investments into EEG and fNIRS as well.
For biometric features to be successfully used for authentication, they must be universal, permanent, and unique, and they must be secure against fraud. Over time, static biometrics like facial IDs and fingerprints have become prone to spoofing. Functional biometrics, such as brain activity, are less prone to attack. That feature has motivated researchers like Jinani Sooriyaarachchi and her colleagues in Australia to develop scalable brain-based authentication systems. In one of their most recent studies, they recruited twenty volunteers and asked them to listen to both a popular English song and their own favorite song while their brain wave activity was recorded with a four-channel Muse headset (an electrode capturing brain wave activity is called a channel). Afterward, the researchers analyzed the recorded brain wave activity using an artificial-intelligence classifier algorithm. Remarkably, they achieved 98.39 percent accuracy in identifying the correct participant from the popular song, and 99.46 percent accuracy from the participant's own favorite song. Using an eight-channel EEG headset on thirty research subjects, another group achieved a similar 98 percent accuracy in authenticating participants by their brain wave data after they’d looked at novel images. It might not even take eight or four electrodes to achieve the same result: even with a single-channel EEG headset, researchers have achieved 99 percent accuracy in distinguishing between participants performing the same mental tasks.

Most of these studies had a small number of participants; it is not yet clear whether neural signatures will be as accurate at scale, when billions rather than dozens of people must be authenticated. EEG is also inherently noisy: the signals the electrodes pick up can come from eye-blinking or other movement, which can make it hard to tell the difference between brain activity and interference.
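In broad strokes, the pipeline these studies describe (record a few channels of EEG, reduce it to numerical features, then match each person against an enrolled template) can be sketched as follows. This is only an illustrative sketch: the band-power features, the nearest-template matching, and the synthetic "EEG" are assumptions for demonstration, not the published methods or real brain data.

```python
import numpy as np

# Hypothetical sketch of brain-wave authentication: band-power features
# from a few EEG channels plus a simple nearest-template matcher. Real
# systems use far richer features and trained AI classifiers.

FS = 256  # assumed sampling rate in Hz, typical of consumer headsets
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(eeg):
    """eeg: (channels, samples) array -> log power per channel per band."""
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1 / FS)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.log(np.concatenate(feats))  # log scale stabilizes variance

def enroll(recordings):
    """Average each user's feature vectors into a stored template."""
    return {user: np.mean([band_power_features(r) for r in recs], axis=0)
            for user, recs in recordings.items()}

def authenticate(templates, eeg):
    """Return the enrolled user whose template is nearest to this sample."""
    f = band_power_features(eeg)
    return min(templates, key=lambda u: np.linalg.norm(templates[u] - f))

# Toy demo: synthesize two "users" with different dominant rhythms.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS

def fake_eeg(freq):
    """Four channels of a sinusoidal rhythm plus noise (stand-in for EEG)."""
    return (np.sin(2 * np.pi * freq * t)[None, :].repeat(4, axis=0)
            + 0.3 * rng.standard_normal((4, t.size)))

templates = enroll({
    "alice": [fake_eeg(10) for _ in range(3)],  # alpha-dominant
    "bob": [fake_eeg(20) for _ in range(3)],    # beta-dominant
})
print(authenticate(templates, fake_eeg(10)))  # expected: alice
```

The design point the studies make is that the distinguishing signal is functional rather than static: it depends on how a particular brain responds to a stimulus, which is why the same matching idea works across songs, images, and mental tasks.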
But researchers have made substantial progress in developing pattern classifiers that filter out that noise, allowing them to discriminate between individuals based on their EEG brain wave activity both at rest and while performing tasks. As noted previously, EEG devices have been used to recover sensitive information from a person’s brain, such as their PIN codes and their political and religious ideologies. Obviously, this poses clear risks to our digital and physical security.
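One simple illustration of such noise filtering: eye-blinks appear in EEG as large, slow deflections (below roughly 4 Hz), so even a crude band-pass filter that keeps only the frequency range of interest suppresses much of that artifact. The sketch below is an assumption-laden toy (an FFT-based filter on a synthetic signal), not what production EEG pipelines actually do; real systems use more careful methods such as ICA-based artifact removal.

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz

def bandpass(signal, lo=4.0, hi=40.0):
    """Zero out FFT components outside [lo, hi] Hz and invert the FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1 / FS)
    spectrum[(freqs < lo) | (freqs > hi)] = 0
    return np.fft.irfft(spectrum, n=signal.size)

# Demo: a 10 Hz "alpha" rhythm contaminated by a slow, blink-like 1 Hz drift.
t = np.arange(2 * FS) / FS
clean = np.sin(2 * np.pi * 10 * t)
blink = 5 * np.sin(2 * np.pi * 1 * t)
filtered = bandpass(clean + blink)
print(np.max(np.abs(filtered - clean)))  # residual is near zero
```

In practice the classifiers described above go further, learning which spectral and spatial patterns are stable for a given person and discarding the rest as interference.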
Governments can already tap our phone conversations and snoop on us digitally. Will they similarly tap our brain activity data without our knowledge or consent? Will they deploy AI programs to search our brains for terrorist plots? Will they gather neural data to make inferences about individuals’ political beliefs to predict and prevent peaceful protests? China is reportedly already doing so.