This is a big win for the tech, but it could be a big blow to travelers' privacy.
CAN'T BEAT THE SYSTEM. Well, that didn't take long. During just its third day in action, a facial recognition system used by Washington Dulles International Airport (IAD) caught its first imposter. While that's a clear win for proponents of the tech, it might also be a major blow to the privacy of the average airline passenger.
On Monday, 14 airports in the U.S. launched a pilot program to test the effectiveness of a biometric scanning system during the security and boarding processes. Passengers simply stand in front of a camera that takes their photo. The system then compares that photo to the one on the person's passport to confirm their identity.
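At its core, that comparison step is a one-to-one verification: does the live photo match the passport photo? A minimal sketch of the idea is below. It's purely illustrative — CBP's actual system uses proprietary face-recognition models — and the embedding vectors and 0.9 threshold are assumptions standing in for what a real model would produce.

```python
import math

# Illustrative sketch only: real face-recognition systems convert each photo
# into a numeric "embedding" vector, then declare a match when the two
# vectors are similar enough. The vectors and threshold here are made up.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(live_embedding, passport_embedding, threshold=0.9):
    """Return True if the live photo's embedding matches the passport's."""
    return cosine_similarity(live_embedding, passport_embedding) >= threshold

# Same traveler: embeddings nearly identical -> match.
print(is_match([0.9, 0.1, 0.4], [0.88, 0.12, 0.41]))  # True
# Different person: embeddings diverge -> flagged for a human officer.
print(is_match([0.9, 0.1, 0.4], [0.1, 0.9, 0.3]))     # False
```

The second case is what reportedly happened at Dulles: the live photo failed to clear the match threshold against the presented passport, prompting manual inspection.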
On Thursday, U.S. Customs and Border Protection (CBP) announced that the facial recognition tech at IAD caught a man trying to enter the U.S. using fake identification papers.
The 26-year-old man, who was traveling from São Paulo, Brazil, presented a French passport to a CBP official, but the system determined that the man's face didn't match the photo on the passport. Officials later discovered that the man was concealing his authentic Republic of Congo identification card in his shoe.
AN ETHICAL DIVIDE. Depending on which side of the facial recognition debate you land on, you probably have pretty different feelings about that.
For proponents, catching the traveler shows that a technology purportedly designed to keep us safe is doing its job.
“Terrorists and criminals continually look for creative methods to enter the U.S., including using stolen genuine documents,” said Casey Durst, CBP's Director of the Baltimore Field Office, in a news release. “The new facial recognition technology virtually eliminates the ability for someone to use a genuine document that was issued to someone else.”
BIG BROTHER, AIRPORT EDITION. However, showing that facial recognition works could worry those who see it as a stepping stone along the path to a Big Brother-esque dystopia. If it works at IAD, other airports might decide to implement their own systems, and it's not hard to imagine other locations — jobs, schools, public spaces — following suit.
This is problematic for a couple of reasons. According to an NBC report, officials say the airport system is 99 percent accurate, but other tests of facial recognition technology have revealed that the systems aren't always as accurate as advertised. Furthermore, they often include racial and gender biases, producing higher error rates for people of color and women than for white men. Sure, the IAD system can help officials identify imposters, but how many other law-abiding travelers might it falsely flag?
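The scale of that question is easy to underestimate. A back-of-the-envelope calculation shows why even a "99 percent accurate" system produces a steady stream of false flags; the passenger volume below is an assumed round number for illustration, not a CBP figure.

```python
# Base-rate sketch: the daily passenger count is an illustrative assumption,
# and "99 percent accurate" is taken at face value as a 1 percent error rate.

daily_passengers = 10_000   # assumed international arrivals screened per day
error_rate = 0.01           # "99 percent accurate" -> 1 percent of checks err

false_flags_per_day = daily_passengers * error_rate
print(false_flags_per_day)  # 100.0 -> roughly 100 travelers wrongly flagged daily
```

Since genuine imposters are rare, nearly all of those flags would fall on law-abiding travelers — which is exactly the concern critics raise, especially given the documented higher error rates for women and people of color.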
Catching one person with fake papers doesn't prove that a facial recognition system works, of course, and the technology is still being tested in a pilot program. But if the technology works as it's supposed to, this could be the first of many cases in which travelers who might otherwise have made it past human immigration officers are caught at the border.
READ MORE: New Facial Recognition Tech Catches First Impostor at D.C. Airport [NBC News]
More on facial recognition: Amazon Rekognition Falsely Matched 28 Members of Congress to Mugshots