"Shoddy technology makes shoddy investigations."

About Face

Imagine you're in Porcha Woodruff's shoes back in February: heavily pregnant, getting two kids ready for school in the morning, when Detroit police show up at your door and arrest you for robbery and carjacking.

The problem? Woodruff never committed those crimes. She was falsely identified by facial recognition software. The case prompts the usual big questions about technology colliding with civil rights, and the age-old philosophical dilemma of what justice is worth if even a single innocent person ends up wronged. And of course, after being arrested in front of her children, Woodruff spent several hours in jail suffering from contractions and dehydration. She's now suing the city for wrongful arrest, according to The New York Times.

In total, the NYT reported, Detroit has been hit with three lawsuits over false arrests stemming from AI-powered facial recognition software. Woodruff is the sixth person reported to have suffered this fate in America since law enforcement began using the technology in earnest, and all six have been Black.

"Shoddy technology makes shoddy investigations, and police assurances that they will conduct serious investigations do not ring true," American Civil Liberties Union at Michigan senior staff attorney Phil Mayor told the NYT.

Shoddy Tool

Woodruff's false arrest followed a man reporting to Detroit police that he'd been robbed at gunpoint at a gas station, according to court documents. Police found surveillance camera footage of a woman connected to the incident and ran her face through facial recognition software from a vendor called DataWorks Plus.

The software produced Woodruff's name after a search, and the robbery victim picked her out of a photo lineup of six women, the lawsuit states, which also notes that the picture of Woodruff used in the lineup was eight years old. Needless to say, the charges against Woodruff have since been dismissed.

"You've got a very powerful tool that, if it searches enough faces, will always yield people who look like the person on the surveillance image," argued Iowa State University psychology professor Gary Wells, an expert on the reliability of eyewitness identification, to the NYT.

Again, this isn't the first instance (let alone the first kind of instance) in which AI has been used by law enforcement to erroneously target people, mostly minorities. If adoption of the technology continues at its current clip, an increasingly relevant question is just how much money law enforcement departments are willing to spend, not on the wonky technology itself, but on the inevitable litigation brought by the innocent people it will, at this rate, continue to net.

More on facial recognition: Facial Recognition Used to Evict Single Mother for Taking Night Classes
