Amazon’s Facial Recognition Struggles With Darker Skin

The company disputed the findings, saying it has since updated Rekognition.

1.25.19 by Dan Robitzski
Image by John Cameron/Joe Robles/Tag Hartman-Simkins

We’ve Been Over This

Amazon apparently still hasn’t fixed the problems with its facial recognition software, Rekognition.

The software, which is used by police and Immigration and Customs Enforcement, has drawn criticism for frequent errors, like when it matched Congresspeople’s headshots with photos from a mugshot database. Now, new research shows that Rekognition generally fares well with white men’s faces, but it struggles to identify light-skinned women and anyone with darker skin, according to The Verge.

Gender Bias

Presented with the faces of light-skinned women, Rekognition incorrectly labeled 19 percent of them as men, according to new research from MIT and the University of Toronto. Women with dark skin were incorrectly labeled as men 31 percent of the time. The research will be presented this weekend at the AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society.
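To make those figures concrete, here is a minimal Python sketch of how per-subgroup misgendering rates like the 19 and 31 percent numbers are tallied in an audit of this kind. The subgroup labels and records below are hypothetical stand-ins, not data or code from the study.

```python
# Illustrative sketch only: counting gender-classification errors per subgroup.
# The records and subgroup names are hypothetical, not the researchers' data.
from collections import Counter

# Each record: (subgroup, true_gender, predicted_gender)
predictions = [
    ("darker_female", "female", "male"),
    ("darker_female", "female", "female"),
    ("lighter_female", "female", "female"),
    ("lighter_male", "male", "male"),
]

totals, errors = Counter(), Counter()
for subgroup, truth, predicted in predictions:
    totals[subgroup] += 1
    if predicted != truth:
        errors[subgroup] += 1

# An error rate of 0.31 for "darker_female" would correspond to the
# 31 percent figure reported for darker-skinned women.
for subgroup in totals:
    rate = errors[subgroup] / totals[subgroup]
    print(f"{subgroup}: {rate:.0%} misclassified")
```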

Amazon responded to the research much as it has dismissed past criticisms of Rekognition: the company argued that the researchers hadn't used the latest version of the software, according to The Verge. Amazon also pushed back on the grounds that the facial analysis system, which detects attributes like facial expression and gender, is separate from the facial recognition system that police might use to match a face against a mugshot database. Fair points, but they don't address the actual problem the researchers detected head-on.


Broken System

Ultimately, the researchers hope to accomplish more than encouraging surveillance systems that can accurately identify people of all races and genders. They also want to bring about a society in which such technology is properly regulated, so that police and others can't use it to abuse people.

“Furthermore, while algorithmic fairness may be approximated through reductions in subgroup error rates or other performance metrics, algorithmic justice necessitates a transformation in the development, deployment, oversight, and regulation of facial analysis technology,” the researchers wrote in their paper.

READ MORE: Gender and racial bias found in Amazon’s facial recognition technology (again) [The Verge]

More on Rekognition: Police Surveillance Is Getting a Helping Hand from… Amazon!





