Fatal Flaws

Researcher: Facial Recognition Tech Could Get Trans People Killed

"The worst case scenario is...they end up shot or arrested or harassed."

Kristin Houser / April 26th, 2019

Deadly Deployment

Facial recognition technology is still in the nascent stages of development, and we’ve already seen many ways it can go wrong, from China using the tech to track and detain minorities to the numerous examples of it perpetuating racial and gender bias.

Now, in an expansive interview with VentureBeat, AI researcher Os Keyes of the University of Washington has presented several “nightmare scenarios” that could result from the deployment of facial recognition tech — and in some cases, they end with a transgender person being killed.

Who’s There?

In the interview, Keyes noted how some apartment buildings are considering the use of facial recognition tech for entry — the idea is a resident would show their face, and the system would recognize them and unlock the door.

Keyes also pointed out how some people are even suggesting facial recognition systems be used to monitor bathrooms.

Keyes told VentureBeat that because some facial recognition systems have trouble recognizing transgender or gender non-conforming people, those people could be flagged by the system, leading to law enforcement being called to the scene — an encounter that could prove deadly for trans people, particularly those of color.

“To be exceedingly deadpan,” Keyes said, “the police’s record with trans people of color is not great, so yeah — the worst case scenario is someone tries to go to the bathroom because they just want to piss and they end up shot or arrested or harassed, or shot and then arrested and then harassed.”

No Good Use

Keyes isn’t just concerned about how facial recognition tech will affect the trans community, though. Another worry is that it doesn’t benefit any members of society enough to warrant further development.

“I would like to see facial recognition development and usage just made straight-up illegal because I don’t think this is a technology with redeeming features,” Keyes told VentureBeat. “Nobody has been able to point me to a use case that directly benefits humanity that can’t be solved with other means. It’s so obviously ripe for abuse and has already been [so] abused that it’s not worth doing.”

READ MORE: A transgender AI researcher’s nightmare scenarios for facial recognition software [VentureBeat]

More on facial recognition: Americans Built Tech for China’s Sinister “Re-Education Camps”
