Are you ready for "brain transparency"? That's the question posed in a lecture given by Duke University professor Nita Farahany at this year's annual meeting of the World Economic Forum in Davos, Switzerland. And she doesn't mean your head looking like one of those see-through fish at the bottom of the ocean.

Instead, Farahany, a high-profile scholar and legal ethicist focused on emerging tech, rather glibly predicts a future in which corporations and governments will be able to read your mind. In fact, that technology — the "ability to decode brainwave activity" — is already here, she claims.

"We're not talking about implanted devices of the future," she tells her audience. "I'm talking about wearable devices that are like FitBits for your brain," that can pick up your mind's emotional states, simple shapes you may be thinking of, or even faces.

Farahany adds, though, that "we can't literally decode complex thoughts just yet."

To illustrate her vision for the tech, she invokes a tragic vehicular accident caused by a trucker falling asleep at the wheel. If only he had been wearing a fancy hat with embedded electrode sensors that told his employer, on a scale of one through five, how alert he was, they could've avoided an accident that was "disastrous for the company and cost many lives" (note the order of priorities).

"Which is why in 5,000 companies across the world, employees are already having their brainwave activity monitored to test for their fatigue levels," Farahany says. She cites mining operations — including one of the biggest mining companies in the world — that have their employees wear hardhat and baseball cap-like devices that detect fatigue. No mention, of course, of alleviating the conditions that lead to overfatigued workers in the first place.

But never mind safety — she quickly pivots into the all-important metric of productivity.

"Surveillance for productivity is part of what has become the norm in the workplace — and maybe with good reason," she avers, citing a survey that found nine out of ten employees admitted to the cardinal sin of wasting "at least some time" at work each day — ample justification for the growing ubiquity of bossware, a type of software that's typically used to surveil what employees (especially those that work from home) do on their computers.

And don't worry: the tech to monitor employees' thoughts already exists, she notes, like ear pods that purport to detect whether an employee's mind is wandering and can even distinguish between the types of tasks they're focusing on, e.g., doing work versus idly browsing the web.

Farahany believes the optimal path forward is a "responsive" workplace where "humans, robots, and AI work seamlessly together." One example she offers: Penn State researchers who created an overlord-like robot AI that monitors a worker's stress levels via brainwaves and other metrics and calibrates the rate at which it assigns them new tasks.

"Done well, neurotechnology has extraordinary promise. Done poorly, it could become the most oppressive technology we've ever introduced in a wide scale across society. We still have the chance to make it right." She acknowledges that it "also has a dystopian possibility."

"But we can make a choice to use it well," Farahany proclaims. "We can make a choice to have it be something that empowers individuals."

Her enthusiasm for this nightmarish tech is off-putting, but befitting an economic forum. Yet perhaps the most sinister thing Farahany presents us with is a false dichotomy, as if our only choices were between employers using brain monitoring technology in an evil way and employers using it in a good way that "empowers individuals." In her framing, if employees choose to opt into invasive brain tech to hold themselves more accountable, rather than being formally required to by their employer, the ethical dilemma is averted. But if employees don't get to make those decisions for themselves now, what makes her think they will be able to in the future?

Ultimately, her rhetoric and the dichotomy she presents serve to placate us into accepting a future where the widespread use of increasingly invasive surveillance devices is the norm. Accept it now, on naive and vague promises of accountability, and we can supposedly avoid a dystopian future. The "choice" doesn't matter. All that matters is that you're willing to embrace the technology, one way or another.

Of course, she's not addressing the working-class masses here, but a highly select group of businesspeople, investors, economists, and world leaders who will want to make that "choice" for you. And whether through their own mouths or carefully orchestrated marketing, it'll likely be sold to you with the same rhetoric she uses here. Better to recognize it now, in the hope of one day making a third choice for ourselves.

More on brain scanning: Companies Already Investing in Tech to Scan Employees’ Brains

