Government officials knew about the problem and decided it was fine.

"Sufficient"

The U.K. government uses facial recognition AI to check travelers' photos when they apply for passports. It works just fine for white applicants, but like so many algorithms out there, it performs poorly on people with darker skin.

Anti-black bias in tech is nothing new, unfortunately: algorithms trained on biased data have often resulted in software that perpetuates prejudice. What's particularly troubling about this passport photo AI is that the British government knew about the problem, according to New Scientist — but decided it was okay to deploy the system anyway.

Afterthought

Newly released internal documents revealed that the same racial disparities occurred during testing, with the system telling darker-skinned applicants that their pictures didn't comply with passport guidelines, New Scientist reports.

"User research was carried out with a wide range of ethnic groups and did identify that people with very light or very dark skin found it difficult to provide an acceptable passport photograph," read the documents. "However; the overall performance was judged sufficient to deploy."

Algorithmic Racism

If someone applying for a passport is told that their photo isn't acceptable, they can still circumvent the AI system and submit it anyway — but they'll face warnings that it could interfere with their application.

"Even with the user being able to override the selection, it is still creating a — largely racialized — disparity in experience between users," University of Washington engineer Os Keyes told New Scientist.

READ MORE: UK launched passport photo checker it knew would fail with dark skin [New Scientist]

More on facial recognition: Google Contractors Tricked Homeless Black People Into Face Scans

