"We should all be able to use our public spaces without being subjected to oppressive surveillance."
After a drawn-out legal battle, a London court has ruled it unlawful for police to use automatic facial recognition to surveil crowds.
The court found that the South Wales Police's facial recognition tools had too few controls on them, BBC News reports, and went further, noting that the police failed to ensure the facial recognition algorithm didn't perpetuate racial bias. Even if a proper use of facial recognition theoretically exists somewhere out there, the South Wales Police had too few safeguards in place to convince the courts that this was it.
Playing The Odds
Automatic facial recognition entails setting up surveillance cameras, equipping them with facial recognition software, and cross-checking everyone the cameras see against a database of mugshots or other police files. Unfortunately, it doesn't work very well: during the 2017 Champions League final (yes: soccer), the system made 2,297 errors, flagging people who weren't actually in the mugshot database at all.
"I'm delighted that the court has agreed that facial recognition clearly threatens our rights," plaintiff Ed Bridges told BBC News. "This technology is an intrusive and discriminatory mass surveillance tool."
Read The Room
Meanwhile, the South Wales Police told BBC News that it still intends to keep using the surveillance tech, spinning a court literally calling it unlawful as an "important step in its development."
"For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge," Bridges told BBC News. "We should all be able to use our public spaces without being subjected to oppressive surveillance."
READ MORE: Facial recognition use by South Wales Police ruled unlawful [BBC News]
More on facial recognition: Welsh Police Used Face Scanning Software That Incorrectly Flagged Thousands