Trials of inaccurate facial recognition tech continue in London.
Face Facts
Don't want your face scanned by London police? You might want to rethink where you go in the city.
On Friday, the Metropolitan Police Service (Met) announced it will use facial recognition technology to scan the faces of people in Central London for eight hours on both Monday and Tuesday as part of an ongoing trial. The Met claims that declining to be scanned won't raise suspicion among officers — but it isn't exactly clear how people can decline in the first place.
About Face
According to a Met webpage detailing its ongoing trial of facial recognition tech, mounted cameras scan the faces of passersby and compare those scans against a database of people wanted by the Met or UK courts. If the system detects a match, it alerts a police officer on the ground. That officer then reviews the match and decides whether to stop and speak to the suspect.
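To make that "scan, compare, alert, human review" loop concrete, here is a minimal sketch in Python. It is purely illustrative: the watchlist entries, the embedding numbers, the cosine-similarity comparison, and the 0.9 threshold are all assumptions, not details of the Met's actual system.

# Illustrative sketch of a scan-and-match loop; not the Met's software.
from dataclasses import dataclass

@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # hypothetical numeric "fingerprint" of a face

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def check_face(scan: list[float], watchlist: list[WatchlistEntry], threshold: float = 0.9):
    """Compare one camera scan against the watchlist; return a candidate
    match for a human officer to review, or None if nothing clears the bar."""
    best = max(watchlist, key=lambda entry: cosine_similarity(scan, entry.embedding))
    score = cosine_similarity(scan, best.embedding)
    return (best, score) if score >= threshold else None

# Example with made-up numbers: one watchlist entry, one street scan.
watchlist = [WatchlistEntry("wanted person", [0.1, 0.9, 0.3])]
result = check_face([0.12, 0.88, 0.31], watchlist)
if result:
    entry, score = result
    print(f"Alert officer: possible match for {entry.name} ({score:.2f}), human review required")

The key point the article raises is that nothing in a pipeline like this asks the person on the street for consent before the comparison happens; the "decline" step has no obvious place in the loop.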
"Anyone who declines to be scanned during the deployment will not be viewed as suspicious by police officers," the Met wrote in its announcement of this week's trials. "There must be additional information available to support such a view."
How exactly does a person even decline this scan? The Met claims it's being open about its deployment of the facial recognition technology — police are passing out leaflets about it to the public and displaying posters in the area.
But if the cameras are mounted and set to scan faces automatically, is avoiding the area altogether the only way to "decline" the scan? And if so, are the police simply saying they won't be suspicious of anyone who turns and walks the other way upon seeing the posters announcing the trial — in other words, anyone who was never within the tech's line of sight anyway?
Not Ready
This is the Met's seventh facial recognition trial since 2016, and it plans to conduct a total of 10 before evaluating the technology at the end of 2018.
Privacy groups have decried the trials both for their general infringement on public privacy and for the terrible inaccuracy of the Met's tech — a Freedom of Information request regarding a June trial revealed that 100 percent of matches flagged by the Met's system were false.
This is far from the first example of facial recognition tech's shortcomings, and yet that doesn't seem to be slowing the technology's adoption — even Taylor Swift is using it now. However, if police insist on continuing to test the technology, they really need to figure out a better way for the public to decline to be scanned.
READ MORE: Facial Recognition Technology to Be Deployed on 17 and 18 December [Metropolitan Police]
More on facial recognition: Even Taylor Swift Is Using Facial Recognition Tech Now