"We do see the technology starting to be used in some ways that are more like involuntary neural surveillance."
We're closer than ever to a dystopian future in which corporations can read our thoughts without our permission, and a leading legal theorist thinks we should head off that possibility before it becomes reality.
In an interview with The Guardian, Duke Law professor and neurotechnology critic Nita Farahany said that although brain-computer interface tech "can’t literally read our complex thoughts" yet, it comes close enough to give her serious pause.
"There are at least some parts of our brain activity that can be decoded," Farahany told the British newspaper. "There have been big improvements in the electrodes and in training algorithms to find associations using large datasets and AI."
"More can be done than people think," she added.
Speaking about the concept of "freedom of thought," the legal expert emphasized that we should start thinking about our rights before brain-hacking technologies like Elon Musk's Neuralink have a chance to become mainstream.
"Applications around workplace brain surveillance and use of the technology by authoritarian governments including as an interrogation tool I find particularly provocative and chilling," Farahany told The Guardian. "We do see the technology starting to be used in some ways that are more like involuntary neural surveillance."
To avoid the most Orwellian of outcomes, the professor proposes the creation of a new civil right to "cognitive liberty," accompanied by updates to related freedoms like "privacy, freedom of thought and self-determination."
Given that the digital surveillance of workers is already becoming common business practice, and that current brain-monitoring tech can detect "your level of fatigue, engagement, focus, boredom, frustration and stress... with high accuracy," according to recent research, such dystopian outcomes are increasingly plausible.
Cognitive liberty standards would, Farahany told The Guardian, "protect our freedom of thought and rumination, mental privacy, and self-determination over our brains and mental experiences."
"It would change the default rules so we have rights around the commodification of our brain data," she added. "It would give people control over their own mental experiences and protect them against misuse of their brain activity by corporate and government actors, weighed against societal interests."
As freaky as that kind of future sounds, it's one we need to take seriously, especially given corporations' vested interest in getting inside our heads.
More on brain-computer interfaces: Brain Chips Like Neuralink Cause Strange Cognitive Changes, Doctors Say