Sketchy Behavior
In 2017, the sheriff's office in Washington County, Oregon, became the first law enforcement agency to use Rekognition, Amazon's facial recognition tool. Then, for two days in March 2019, the office granted journalists for The Washington Post access to its squad cars and facilities to observe how the technology was transforming cops' operations.
What those journalists discovered was that officers were using Rekognition in ways far beyond what Amazon intended, including to identify unconscious suspects, people who refused to identify themselves, dead bodies, and even police sketches of faces — a startling example of how the tech is already being misused by law enforcement in the wild.
No Confidence
Amazon advises law enforcement officials to use Rekognition’s results only if the tech is 99 percent confident it has found a match. The company also told WaPo that while running police sketches through Rekognition doesn't violate any rules, Amazon expects the human reviewing the results to "pay close attention to the confidence of any matches produced this way."
But according to WaPo's detailed reporting, published Tuesday, the Washington County officers never even see the search-confidence metric when they use the system. Instead, they're presented with the five most likely matches for a search regardless of Rekognition's confidence in the matches — and regardless of whether the input is a photo or a sketch.
Lousy Leads
Long story short, the number of potential problems with the way the Washington County Sheriff's Office is using Rekognition is staggering.
AI experts told WaPo that the use of sketches in lieu of photographs could lead to more false matches. Officers might then try to force their investigations to align with those false matches, according to Marc Brown, chief deputy defender with Oregon's Office of Public Defense Services.
"You’ve already been told that this is the one, so when you investigate, that’s going to be in your mind," he told WaPo. "The question is no longer who committed the crime, but where’s the evidence to support the computer’s analysis?"
Ethical Violations
Besides the practical issues involved in police following dodgy leads — and then potentially making false arrests — there are also civil rights issues to consider.
By using Rekognition on unconscious suspects or on people who refuse to identify themselves — which is not illegal in the state of Oregon — the officers are stripping those people of their ability to consent to the scans.
Ultimately, if there's one thing to be learned from WaPo's investigation, it's that until legislators start regulating facial recognition, law enforcement officials are going to use the technology however they see fit — even if it means arresting innocent people or violating their civil rights.
READ MORE: Oregon became a testing ground for Amazon’s facial-recognition policing. But what if Rekognition gets it wrong? [The Washington Post]
More on Rekognition: Cops Are Using Amazon’s Facial Recognition Software Wrong