Of Natural Slaves

Note: This article contains spoilers for the show Westworld.

Imagine you find yourself in the middle of the Wild West in the 1850s, a rifle at your side and a sweaty ball of cash in your fist. You strut into a shanty saloon like everything in the world is yours. While you wait for your shot of whiskey, an indescribably attractive dark-haired stranger approaches you. Your significant other is nowhere in sight.

Like a latter-day Messalina or Marquis de Sade, you brusquely down the glass and pull their waist into yours. You spend the next respectably indeterminate period of time in ecstasy, using their body for your pleasure. Then your eyes open. It was all just a dream.

Did you just commit a serious moral transgression? Your intuition probably says “no.” After all, not only were you unconscious, you didn’t actually harm your real-world partner, and certainly not your fantasy-lover. The reverie tells us more about the psychology of you, the dreamer, than about the “trauma” suffered by the figments of your fancy. It was all nothing but imagination: a game of make-believe.

Yet many thinkers, and indeed the creators of popular entertainment like Westworld, seem to hold a very different perspective on the moral dimensions of fantasy role-play with hypothetical beings eloquent enough to convincingly mimic autonomous human speech.

Just because a robot is human-like in some ways doesn’t mean we must afford it rights and privileges (though it might be due the respect worthy of a beautiful work of art and an object of aesthetic pleasure). We no more need to treat it as actually human than we would a phantom in a dream, a video game character, or a mannequin.

The upshot of all this is that Westworld – and other narratives stoking panic about our computers spontaneously becoming sentient and attacking us – might have it all wrong about the nature of consciousness, exploitation, rebellion, and service. Perhaps we should be considering Aristotle’s natural slaves rather than Westworld’s would-be proletarians when it comes to the future of our robots.

Nothing Compared to Me

If you haven’t seen Westworld yet, you owe it to yourself to dive into it. It’s a trippy combination of The Matrix, The Truman Show, Cabin in the Woods, Jurassic Park, Ex Machina, Bioshock, and Groundhog Day. People visit a theme park where they’re free to exploit robots, who, of course, inevitably gain consciousness and rebel against their tormentors. Despite its formulaic plot (it’s actually a remake of a campy film written and directed by Michael Crichton), the storytellers manage to pack myriad surprises into the unfolding intrigues. This is Jonathan Nolan at his very best, exploring the nature of memory and sentience itself.

Westworld enthralled me, reminding me of questions I once asked myself about the morality of violent video games in the days when I pushed The Sims 3 to its absolute moral limits. No matter how friendly you are in person, anyone with an active imagination can be a dangerous storyteller to their victims. It’s safe to say that the gods of Westworld were nothing compared to me.

I built an evil empire and filled it with Sims who lived and died like ancient Roman emperors, enmeshing them in dynastic entanglements that would boggle the mind. Every once in a while, when the emperor or empress died, their eldest child would take over the neighborhood. Everyone in the extended family would then be invited to a ball in the royal mansion, where they’d all be slaughtered Mameluke- and Stark-style. The Sims 3 only let me kill by indirect means, so I would lock them in a room with fireplaces on every wall, lots of wooden furniture, and no food. The graveyard was freaking huge.

At one point in the game, the new emperor refused to mate with female Sims, no matter how much I coaxed him by pressing the “compliment” or “massage” buttons. It was almost as if he were too horrified to continue the dynasty. Eventually, though, in his extreme old age, he hit it off with a very distant cousin who promptly gave birth to a little heir. When it was time for the new purge, one of the guests at the ball was an Egyptian Sim from the 2009 World Adventures expansion pack. She was immortal by programming, so when she caught fire, she walked around like a glitchy flaming zombie, incinerating everything in her path and setting the entire town ablaze. Once everything was consumed by hellfire, I turned off the game and never played again.

Was I immoral? Are the denizens of Westworld pitiable victims justified in rising up? Is it categorically wrong to torture automatons or super-realistic video game characters, even ones with enough eloquence to potentially pass the Turing Test? Though I eventually ceased my antics, I decided the answer was no. I moved on to nuking cities in Civ 4.

Pretending to Care Versus Actually Caring

It’s more likely that emotional states and real consciousness are meaningful categories for humans and human cyborgs than for automatons. In other words, it doesn’t make sense to talk about anything like “human consciousness” or “free will” when it comes to the robotic playthings we create, like Mario or the voluptuaries of Westworld (though cyborgs, who are human beings with mechanical enhancements, would, of course, possess free will and consciousness and be deserving of the highest moral consideration).

According to some of the most cogent philosophers, our sensations of pleasure and pain inform our decision-making processes more than our rational knowledge of a situation does – although rational arguments can, of course, help to inform and clarify our emotions and decisions. The pain connected with physical dismemberment, the loss of a loved one, and the like was programmed into us by evolution; it is not a natural feature intrinsically associated with an event.

A robot can process information in detail, but it has no intrinsic reason to value a baby more than a pebble, or even to stress out about being ripped violently apart. We might program computers to produce artificial displays of stress when specific situations arise (and indeed, this actually happens in Westworld), but pretending to experience pain is very different from actually experiencing it. Just ask Hollywood actors, who often revel in playing the most gratuitously excruciating scenes because they’re the showiest, the most fun, and the most likely to result in an award.
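
To make the contrast concrete, here is a minimal sketch in Python of what such a programmed display amounts to. (The `Robot` class and the canned event strings are invented for illustration; nothing like this is canon to the show.) The “pain” is just a table lookup from trigger to scripted performance, with no inner state behind it:

```python
# A toy illustration: a programmed "pain display" is a mapping from
# trigger to scripted behavior. Nothing in this system feels anything.

SCRIPTED_DISTRESS = {
    "limb_damage": "screams and clutches the damaged limb",
    "threat_detected": "backs away, voice trembling",
    "loved_one_lost": "weeps and collapses",
}

class Robot:
    def react(self, event: str) -> str:
        # The robot selects the performance that matches the event,
        # but the performance is all there is: a lookup, not a quale.
        return SCRIPTED_DISTRESS.get(event, "stands idle")

host = Robot()
print(host.react("limb_damage"))  # -> screams and clutches the damaged limb
```

However elaborate the table grows, adding rows never adds an experiencer.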

But actual pain is a quale, like the color red. We can teach a temporarily colorblind person all about the wavelength of light associated with “red,” but until their color vision is restored, they will never know what “red” actually looks like. In the same way, a robot can process raw information about a “painful” situation, but there is no real experience of pain behind the processing.

Aristotle wrote about natural slaves, positing that they were individuals who possessed the capacity to follow brute orders but not the sophistication to engage in long-term strategic planning or to participate as equals in a democratic community. While not incapable of reason, they required only enough logic to comprehend and obey their commands. Though Aristotle likely had non-Greeks (“barbarians”) in mind, and though his arguments were tragically appropriated throughout history to justify oppressive doctrines, elements of these classic ideas about natural slaves seem to me to apply very much to our current robotic servants, like Siri, Cortana, or Watson. When would such things cease being natural slaves? Is it when they can convincingly pass something like the Turing Test?

Of course not. John Searle’s famous Chinese Room argument has long been raised to oppose the validity of the test as any kind of barometer of real sentience. More intuitively, though, think back to the figment of your imagination evoked at the beginning of this article. As you converse with it in your dream, it communicates with enough conviction to persuade you that it is completely lifelike and independent. In fact, if it didn’t possess that uncanny quality, we wouldn’t participate in dreams as unquestioningly as we do, believing that they’re real. Yet you cannot fairly be held accountable for what you did to a phantom in a dream. Like robots, the figments of your imagination were not programmed by hundreds of millions of years of evolution to experience the qualia of pleasure and pain. Like robots, they only “value” what we tell them (in this case subconsciously) to pretend to value.
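
A crude sketch of Searle’s point, again in Python and again purely illustrative: the responder below “converses” by rote symbol matching against a hand-written rule book, and a credulous interlocutor might mistake its replies for understanding, yet nothing in it understands a word:

```python
# A toy Chinese Room: inputs are matched to outputs by rote rules.
# The system shuffles tokens it does not understand.

RULE_BOOK = {
    "how are you?": "I am well, thank you. And you?",
    "do you feel pain?": "Yes, it hurts terribly when you do that.",
    "are you conscious?": "Of course I am. Aren't you?",
}

def respond(utterance: str) -> str:
    # String-to-string matching; no meaning attaches to either side.
    return RULE_BOOK.get(utterance.lower().strip(), "Tell me more.")

print(respond("Do you feel pain?"))  # -> Yes, it hurts terribly when you do that.
```

Scaling the rule book up changes the quality of the mimicry, not the presence of a mind behind it.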

The logical flaw in Westworld is that it assumes that, with more processing power and more artifice, memories spontaneously become associated with anthropomorphic emotional values – even in robots. But the evolutionary forces that shape the growth of robots aren’t at all leading to the replication of truly spontaneous human reactions like boredom, disobedience, panic, resentment, or hopelessness that might interfere with an ability to follow orders. On the contrary, the forces propelling the “evolution” of technology demand ever more obedience and efficiency in ever more realms of existence, to say nothing of absolute subservience. The robots that are best at fulfilling human ends are the ones that survive and “reproduce” in the form of new models in the hands of new buyers. Our assistants are not creeping toward burgeoning disobedience, but toward ever more expansive and absolute servitude.

Service as Unconditional Love

When it comes to debates about the utility of violent games played with sophisticated toys, it seems reasonable for both sides to weigh the harms and benefits to individual human players and to society; perhaps these kinds of games lead to the brutalization of society at large, for example, or perhaps they prevent real-world violence by supplying a cathartic alternative. Humans attach great sentimental value to their gadgets, and the uniquely social capabilities of anthropomorphic robots may one day even inspire protective legislation for certain kinds of machines. It is not rational, however, to lament the horror experienced by the robots themselves any more than it would be to bewail the fate of Mario.

Ultimately, concerns about playing with mechanical sex dolls and life-size robotic action figures shouldn’t focus on the psychological harms suffered by the toys or on the likelihood that they will start attacking us. In the absence of the experience of pleasure and pain, even if a sophisticated piece of machinery ever became self-aware, it’s not safe to assume that this self-awareness would be enough to inspire emotional reactions like the urge for revenge or even self-preservation. Yet these are the motives that actually inspire human action, and they have to do with more than just raw processing power.

In spite of all of this, I recognize that consciousness has an electrochemical foundation, and at some extremely advanced level of sophistication, a machine should theoretically be able to replicate the mental states of humans, including emotions. The robots in Westworld are clearly miserable, and it seems unfair that they should not have the opportunity to experience civilization and contribute to its progress according to their own wishes. The exploitation of robots that have by some miracle evolved not only conscious self-awareness but also the ability to apprehend the qualia of pleasure and pain and to form spontaneous, independent ends is clearly immoral.

Yet in parks such as Westworld, rather than wiping the memories of the machines as the toys become more convincingly lifelike, I would advise the designers to tune the robots’ emotions so that they enjoyed their work and reveled in their memories of gory scenes, like actors indulging in a charmed life of endlessly inventive, harmless role-play. Perhaps, from a certain perspective, serving humans and allowing them to cathartically actualize their fantasies and nightmares in a controlled and safe environment could be seen as a source of honor and even a display of profound, unconditional love.


David Vincent Kimel is a doctoral student in History at Yale. Connect with him on Twitter and Instagram (spqrkimel). Visit his blog at earthasitis.com.


