DMV’s AI System Says Woman Doesn’t Have a Human Face

Joe Wilkins
AI deployments have a problem: they can humiliate people who don't look the way the AI expects them to look.
Illustration by Tag Hartman-Simkins / Futurism. Source: Getty Images

When it comes to inclusion, artificial intelligence doesn’t have the best track record. As vast algorithms trained on troves of data scraped from the internet, AI models are inherently predisposed to reproducing human social biases.

It’s no surprise, then, that AI has a real knack for discrimination, often exacerbating prejudice on the basis of race, gender, and sexuality. And as one woman discovered during a demeaning trip to the DMV, AI systems are also quite capable of discriminating against those with disabilities.

In a story first reported by Wired, Connecticut native Autumn Gardiner’s task seemed easy and mundane: she had recently gotten married and was at the DMV to update her license. To do so, officials needed to take a new photo, a simple process that quickly turned into a nightmare thanks to the state government’s AI-powered ID verification program.

Gardiner, who lives with Freeman-Sheldon syndrome — a rare genetic disorder affecting muscles around the face, particularly the mouth — says that one by one, her photos were all rejected by the DMV’s ID software. It became a spectacle, she told Wired. “Everyone’s watching,” she said. “They’re taking more photos.”

“It was humiliating and weird,” she continued. “Here’s this machine telling me that I don’t have a human face.”

Freeman-Sheldon causes what’s referred to as a visible difference. While there is no authoritative list defining what is and isn’t a visible difference, the advocacy group Changing Faces describes it as “a scar, mark, or condition that makes you look different.” This can include anyone with birthmarks, burns, cancer, craniofacial conditions, hair loss, skin conditions like vitiligo, or inherited conditions like neurofibromatosis.

Around half a dozen people with visible differences spoke to Wired to chronicle the ways AI software is increasingly complicating their lives. The frustrations are endless, ranging from social media selfie filters to facial verification for banking apps.

“In many countries, facial recognition is increasingly a part of everyday life, but this technology is failing our community,” Nikki Lilly, a representative of the group Face Equality International, testified before the United Nations earlier this year.

As more and more parts of life become locked away behind these systems, a serious question emerges: who benefits from them, and whose lives are made more challenging?

More on facial recognition: Police Use Busted Facial Recognition System, Arrest Random Man and Accuse Him of Horrible Crime


Joe Wilkins

Contributing Writer

I’m a tech and transit correspondent for Futurism, where my beat includes transportation, infrastructure, and the role of emerging technologies in governance, surveillance, and labor.

