Driven Nuts

Amazon Accused of Trapping Drivers in AI Panopticon

"Amazon would get rid of drivers altogether if they could."
By Joe Wilkins
Amazon is accused of controlling workers through a surveillance dragnet, while skirting responsibility for their wellbeing.
Illustration by Tag Hartman-Simkins / Futurism. Source: Kevin Carter / Getty Images

As shifts go, it was a pretty typical one for Johnny, an Amazon delivery driver whose route that day took his small cargo van through the wooded hills of Big Bear Lake, in the rural outskirts of San Bernardino, California. There was just one catch: his deliveries were bringing him dangerously close to the maw of a raging wildfire.

Assessing his itinerary on the Amazon Flex app as smoke billowed around him, Johnny realized that his entire delivery route was ablaze. He radioed in to dispatch about the obvious dilemma, assuming they’d call him back to the warehouse.

Instead, Amazon told him to buck up and do his best.

“Amazon said, ‘I know it’s not safe, deliver what you can,’” he later recalled. “But [Amazon] still sent us to an active fire to deliver.”

If he refused, Johnny knew he could face repercussions: being forced to undergo training on his own time, having his hours cut, or even losing his job altogether. The suite of software on his phone — made up of the Amazon Flex app, eDriving Mentor app, and the DSP Workplace app — combined with the van’s onboard sensors, routing software, and AI-powered cameras, constantly monitor his work, alerting Amazon to any deviation that might require correction.

Yet when it comes to reporting safety violations, hazards, or routing errors like the one that sent him into harm’s way, the tech panopticon surrounding the nearly 400,000 drivers like Johnny falls decidedly flat.

The issue of Amazon driver surveillance and worker welfare was the subject of a lengthy investigation by the Distributed AI Research Institute (DAIR), a research group led by former Google ethicist Timnit Gebru.

In its latest report, DAIR interviewed Amazon delivery drivers from around the United States — including Johnny, though all names were changed to protect them from retribution — to investigate how algorithmic management and workplace surveillance allow the company to exploit drivers in an uncompromising surveillance dragnet while skirting responsibility for their safety and wellbeing.

One of the key problems facing Amazon’s bottom line is its reliance on human labor. Until it can completely automate delivery — a goal it’s currently throwing tons of money at — Amazon works through a network of Delivery Service Partners (DSPs), third-party contractors who do the job of hiring, managing, and firing delivery drivers. This shifts Amazon’s cost structure, relieving it of the burden of paying for driver benefits, uniforms, or equipment.

For as much money as it saves, this arrangement also presents a challenge for Amazon when it comes to exercising total authority over those who don the blue vests. Speaking to Futurism, DAIR director of research Alex Hanna said Amazon’s tech panopticon is “central to keeping control” over its workers.

“Amazon is more interested in treating workers like automata rather than human beings,” Hanna said. “Amazon would get rid of drivers altogether if they could.”

For now, its arsenal of paternalistic devices gives it the next best thing.

***

As technology has improved over the years, so too has Amazon’s deployment of invasive surveillance. One of its more recent additions to DSP vehicles, for example, is the Netradyne AI-camera system. Amazon says the gadgets are installed for employees’ safety, but drivers tell a different story.

For one thing, the Netradyne cameras have a habit of penalizing drivers at random. “Sometimes the camera would glitch out and tell me I’m distracted when I’m looking straight at the road,” Johnny said, according to the DAIR paper.

Even when Netradyne does work as intended, it can feel arbitrary. Drivers interviewed by DAIR reported not being told how to mitigate automatic infractions, so learning the system’s quirks becomes an aggravating game of trial-and-error. Many of the lessons are dangerous.

“You’ll be going through a light that turns yellow and they want you to immediately slam on your brakes so that you don’t risk running a red light, because if you run the red light you get hit on Netradyne,” a driver named Madeline told researchers. “No matter what cars are behind you, they want you to slam your brakes.”

In many DSP vehicles, the number of cameras onboard means workers have virtually no privacy. Because drivers are under intense pressure to hit a huge number of delivery stops, they often face difficult situations when they need to use the restroom or change a tampon during a typical shift. Amazon’s ever-watching cameras have worsened a well-known issue in which workers resort to urinating in bottles in order to hit their delivery quota.

“It takes girls even longer when it’s that time of the month,” Elizabeth said of Amazon’s notorious bathroom problem. “So we just kind of deal with it in the back, especially if we have a heavier route. So when I figured that out [cameras in the back of the vans] I was like Goddamn it! I actually have to go to the bathroom now.”

***

With so many devices keeping watch, it’s not uncommon for the tech to give drivers conflicting feedback.

“There was a semi truck. I guess the guy fell asleep,” a driver named Pablo told DAIR in the report. “He was driving the wrong way in my lane, towards me. I had to brake really hard, really, really hard. And the guy, I guess he finally opens his eyes. He made his truck get out of the way and he rolled over. Next thing I know, I got dinged for hard braking.”

Pablo immediately received a scolding text from his DSP, whom he told to check the Netradyne camera. “That’s what saved me,” he said. But another system — the Mentor app — penalized him anyway.

“My [Mentor] score [went] from 850 to 650 or something,” Pablo continued. For drivers, Mentor is the app responsible for scoring road safety based on GPS. “I said to my manager, ‘So what do I do? How do I change my score? Because it wasn’t my fault,’” Pablo said. His DSP manager — who’s also liable to lose payment or even the entire Amazon contract if their employees’ scores dip too low — had no idea.

“So next time I’m just gonna crash to keep my score [at] 850,” Pablo recalled. “That’s basically what you are telling me.”

Other drivers also reported being dinged for perfectly reasonable behavior, like drinking water in the cabin, or braking suddenly when traffic hazards pop up out of nowhere.

Johnny, for instance, says he was eventually fired after the seatbelt sensors went down on his DSP’s fleet of vans. Though his DSP made Amazon aware of the problem, the resulting hits to his overall safety score apparently pushed him beyond the company’s cut-off point.

“The seatbelt sensor went off on me seven times supposedly,” he said. “But I had my seatbelt on. They fired me a few days later through a phone call and voicemail.”

The gulf between these systems highlights a contradiction between Amazon and DSPs. When drivers deliver packages, Amazon expects blazing-fast speeds and precision. DSPs, however, have an incentive to keep turnover low and drivers safe — if not for the drivers themselves, then at least for their insurance. DSPs that push back are often overruled by Amazon, or end up being dropped altogether.

“Our DSP owners will put rules, and sometimes Amazon wouldn’t like it. They would go over our DSP and fire us,” Johnny told DAIR. “They would say, ‘Oh, Amazon fired this person for some sort of reason.’ And, we would be like okay that’s BS, but all right. We kind of had to be a savage to survive this job.”

***

That these technological devices often make drivers’ lives harder isn’t lost on Amazon, which has a few channels for workers to voice their concerns or complaints. The problem, explains DAIR’s Adrienne Williams, the lead author of the report and a former Amazon driver herself, is that those systems aren’t designed to be taken seriously.

“They did have a question that came up on our Rabbit scanner phones at the end of our shift that asked if we experienced any problems or dangers on our routes that day,” Williams explains. “I always filled out that question. I never saw any response or any change to dangers discussed.”

Not long ago, Amazon began leaning on a system called Project Cheetah, a webform meant to escalate driver and DSP complaints for the company to address. Williams had left the job before she could experience Cheetah, which she described as a “red herring” designed to keep tabs on disgruntled workers interested in unionizing (Amazon routinely drops entire DSPs’ worth of drivers for committing that sin). Workers familiar with the system say she isn’t missing much.

“You would scan your badge and it would ask you questions,” Johnny said. “I never did it. It never worked.” Elizabeth told DAIR it can take months for Amazon to solve an issue flagged to Cheetah, if they ever do at all.

While workday complaints can take the form of faulty addresses or poor routing, they can also encompass extremely reasonable demands for drivers’ personal safety. Either way, Amazon’s response tends to be the same.

“We’ve dealt with Nazis on route and they pulled guns on us many times,” Johnny recalled in the report, speaking of his time delivering in the hills of California, which have a history of playing host to extreme right-wing groups. “Amazon didn’t blacklist their house. They told us they would, but they didn’t. We had a guy. He was Jewish. He wore his [Star of David]… and the Nazi saw that, and walked out with a gun and held him at gun point.”

There are countless other stories, all of which stem from the inherent contradiction bubbling within Amazon: it has a material interest in completely automating labor, yet current technological limitations prevent it from doing so.

Failing that, its next most lucrative move is to harangue drivers into working at the whim of algorithms, one layer removed from Amazon via the DSP arrangement, yet constantly monitored and harassed by an ever-expanding arsenal of company tech. 

“Everything Amazon does when it comes to workers is to either maximize efficiency to physically dangerous, unrealistic rates or to manage dissent and reduce pushback,” Williams explains. “All while claiming not to be the true employer of their delivery drivers.”

To curb Amazon’s exploitative practices and prioritize worker safety and wellbeing, DAIR offers five recommendations. First, Amazon must be held accountable for its workers directly, as the National Labor Relations Board has also argued in Southern California. Surveillance tech must also be sharply curtailed, and algorithmic route planning should be strictly scrutinized by humans, if not planned by humans altogether.

On top of this, drivers must be able to voice their complaints against problem customers, without fear of penalties, and with assurances that steps will be taken to remove those people from the customer list. Finally, there must be human oversight of customer complaints, as opposed to the current system, in which algorithms determine penalties and drivers have no chance of pleading their case against a multi-faceted panopticon.

Unfortunately, acting on any of those recommendations would mean Amazon putting workers before profit — something it has no incentive to do anytime soon.

More on Amazon: Secret Plans Reveal Amazon Plot to Replace 600,000 Workers With Robot Army


Joe Wilkins

Correspondent

I’m a tech and transit correspondent for Futurism, where my beat includes transportation, infrastructure, and the role of emerging technologies in governance, surveillance, and labor.
