Facial recognition: important safety measure or dystopian nightmare?

No matter how you see it, the fact is that police have it at their fingertips. AI-powered algorithms should, in theory, be able to identify persons of interest in a crowd with surgical precision.

But for the South Wales police force, these tools are hardly foolproof. In fact, AI has mistakenly identified thousands — thousands! — of people as past criminals.

Take the final match of the 2017 Champions League (that's "football," or "soccer" in the U.S.). The South Wales police used an AI facial recognition program that compared the faces of people in the crowd to a database of mugshots. On that day alone, the algorithm flagged 2,297 people that it shouldn't have. According to The Guardian, that accounted for 92 percent of all the people the program identified as criminals. The system did allow the police to make one arrest at the game: a man who had violated his parole and skipped out on returning to prison.

The police responded as expected, saying that any developing technology will have some kinks to work out. They conceded that no system is perfect, and pointed out that, since they started using the technology, they have arrested a total of 450 people. They also blamed the false positives on the low-quality mugshots fed into the system, and on the fact that this was their first attempt at using facial recognition software at a major event.

But all of that raises the question of why the department thought it was a good idea to try this in the first place. Their response suggests that, while they took the time to train the algorithm to recognize grainy, low-quality mugshots, no one stopped to think, "Hey, maybe these terrible photos will place thousands of people in the crosshairs of our new algorithm just because they look vaguely similar."

The South Wales police also said that they value privacy — as anyone using AI to scan and recognize the faces of thousands of unsuspecting people clearly does — and that the system was intended to be used when there's a high security risk, not to casually surveil citizens.

It is reasonable to assume that the police were only interested in taking action against wanted criminals, since a crowded sports arena is a potentially devastating target for a terrorist attack. But even if the police weren't intercepting everyone the system flagged, it was still explicitly designed to recognize anyone with a mugshot on file (and there are a lot on file). That's a lot of information for police to have on people they don't plan on arresting.

If a system like this successfully stops a bomber or shooter from wreaking havoc and claiming innocent victims, then great. But for the Johnny Sportsfan who was arrested for something minor years ago and just wants to get on with his life, security systems like these mean he's always going to be a person of interest.
