Last month, a Colorado woman named Chrisanna Elser answered her front door and was greeted by a police sergeant named Jamie Milliman, who issued an ominous warning about the neighboring town of Bow Mar.
“You know we have cameras in that jurisdiction and you can’t get a breath of fresh air, in or out of that place, without us knowing, correct?” Milliman began, according to alarming reporting by the local news site Denverite.
Milliman was convinced that Elser had stolen a package off someone’s stoop. As evidence, Milliman had obtained records compiled by Flock, a controversial police surveillance startup that’s taking the United States by storm.
As a display of the department’s technological panopticon, Milliman noted the woman had driven through Bow Mar “20 times the last month.”
“Like I said, we have cameras everywhere in that town,” the officer reiterated.
Specifically, the officer alleged that Elser had been in Bow Mar from “11:52 until 12:09 exactly” on the day the package was stolen. “Like I said, nothing gets in or out of the town without us knowing,” Milliman said. Though the officer refused to show her any concrete evidence, he ended their chat by serving her a December court summons for petty theft.
“From then on, I made it a mission every single day,” Elser told Denverite. “I couldn’t believe [that this was] holding over my head until December, and my bosses and my work — to be worried and also to waste everybody’s time on this.”
Thankfully, Elser had a panopticon of her own: the suite of consumer devices tracking her every move. She obtained surveillance images from the one store she visited on the day in question, along with footage from her truck’s dashcam, which Denverite says showed her driving through Bow Mar twice without stopping. She also compiled GPS records from both the truck and her phone, and provided Denverite with home security footage of her confrontation with Milliman.
As far as Elser could tell, the only reason she was accused was that her vehicle had been logged by Bow Mar’s Flock cameras around the time the incident occurred. After viewing her evidence, the police department dropped the summons, telling her via email: “nicely done btw.”
It’s infuriating that the first time Milliman spoke to Elser, it wasn’t to gather evidence or hear her side of the story, but to accuse her of a crime he had, in his own mind, already finished investigating.
Elser managed to head off the false accusations. But for Black people and other minorities — already the subjects of violent over-policing — this kind of technological monitoring can have devastating consequences.
In one recent example, a Black 16-year-old high school student was violently detained at gunpoint by a small army of police after a school AI security system mistook his bag of Doritos for a gun. In that case, too, police were eager to take the algorithm’s word as evidence enough to terrorize an innocent student.
In New York, meanwhile, a Brooklyn father named Trevis Williams was forced to spend two days in jail after the NYPD’s contentious facial-recognition software flagged him as a “possible match” for a crime that had occurred two months earlier. (He was also miles away when it happened, of course.)
What’s clear is that without regulation, the marriage of error-prone AI and overly aggressive policing will keep producing new victims.
More on policing: Florida Unleashes Autonomous Police Cruisers That Deploy Thermal Imaging Drones