In Brief
Brain-scanning technology has advanced to the point where people now need to consider its implications for their privacy. The technology has already been used as evidence in a woman's murder trial.
Your thoughts are your own, are they not? Whatever you say or do in public, whatever you share with others privately or post online, the thoughts and ideas contained within your own mind should be yours, and yours alone. However, advances in brain-imaging and brain-scanning technology, developed to study and interpret the human mind, may be encroaching on that assumption, raising the question: do we have the right to control our own thoughts and mental processes?
The belief that we should be the ones in control of our own minds is known as cognitive liberty, or the right to mental self-determination. Ask just about anyone the question posed above, and most would say the answer is an obvious yes. Ask the same question in the context of brain scanning's potential benefits in medicine and law, however, and the answer may change. If a brain scan could prove someone committed a crime, wouldn't it be in everyone's best interest to use it?
In 2008, India did that very thing, using such neurotechnology in a murder case. The judge, citing the accused woman's brain scan, concluded that she had "experiential knowledge" of the crime that only the killer could have, and sentenced her to life in prison. At the time, the decision to admit the brain scan was heavily debated; the technology was still new and not well proven.
Now, nine years later, the technology has advanced even further, and we as a society are more willing than ever to give over parts of ourselves…provided the results are worth it.
Forgoing Our Rights
Shopping would certainly be easier if companies knew your likes, dislikes, and desires; applying for a job would probably be easier as well, at least from the company's perspective. Santa Fe Institute (SFI) CEO and complexity theorist David Krakauer, speaking with Forbes about such technological advances, noted how lazy people can be when it comes to making decisions or expending effort, and how readily they will cede some measure of control to technology to make things easier for themselves.
“What I worry about almost more than anything else is a certain kind of mental laziness, and an unwillingness to engage with the difficult issues…. It’s somehow more pressing in a time where there are systems out there willing to make the decisions for you,” said Krakauer.
Of course, brain-scanning tools wouldn't be used only on the general public, or in the medical and legal contexts mentioned above. Scientific American notes that even the military is experimenting with brain monitoring, in this case to increase soldiers' alertness and perceptual acuity. That is another benefit of neurotechnology, but the risks and unintended consequences need to be addressed as well. Only last month, Chinese neuroscientists revealed that they could make mice more dominant or more submissive by promoting activity in different parts of the brain. Should brain-scanning proliferate, misuse is sure to follow.
A Conversation About Cognitive Liberty
Brain-scanning and the tools that make it possible can genuinely benefit society, but the potential dangers cannot be ignored. An open conversation needs to be had about cognitive liberty, and it needs to include legal experts, neuroscientists, and, most importantly, the people whose lives will be affected by the technology's application.
It’s not difficult to imagine scenarios in which neurological data related to an illegal event could work against a defendant. Members of oppressed populations will have very different, and more extensive, chains of mental associations with, say, police and violence than a non-minority charged with the same crime. This is to say nothing of how the technology would be applied, or of the intent (or bias) of those whose adjudicative process admits brain scans as evidence. Clearly the implications go far beyond science.
Neuroethicist Paul Root Wolpe said it best at the 2015 World Science Festival: “This is something, as a society, we’re going to have to work out, but I really believe that it’s going to all happen very soon, and that it’s very important for everybody to think about where they would want [the limits of cognitive privacy] set.”