We already knew humans could make biased AIs — but the United Nations says the reverse is true as well.
Millions of people talk to AI voice assistants, such as Apple’s Siri and Amazon’s Alexa. When those assistants talk back, they do so in female-sounding voices — and a new UN report argues that those voices, and the words they’re programmed to say, amplify gender biases and encourage users to be sexist. But it’s not too late to change course.
The report is the work of the United Nations Educational, Scientific, and Cultural Organization (UNESCO), and its title — “I’d blush if I could” — is the response Siri was programmed in 2011 to give if a user called her a “bitch.”
According to UNESCO, that programming exemplifies the problems with today’s AI assistants.
“Siri’s submissiveness in the face of gender abuse — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products,” the report’s authors wrote.
Apple only changed Siri’s response to “I don’t know how to respond to that” after UNESCO shared a draft of its report with the company in April 2019.
The fact that Apple was willing to make the change is encouraging, but that’s just one phrase uttered by one assistant. According to UNESCO’s report, to truly make a difference, the tech industry will need to enact much more comprehensive changes.
A good starting place, according to the authors, would be for tech companies to hire more female programmers and to stop making their assistants female by default, opting instead for gender-neutral voices.
“It is a ‘Me Too’ moment,” Saniye Gülser Corat, Director of UNESCO’s Division for Gender Equality, told CBS News. “We have to make sure that the AI we produce and that we use does pay attention to gender equality.”
READ MORE: Is it time for Alexa and Siri to have a “MeToo moment”? [CBS News]
More on biased AI: A New Algorithm Trains AI to Erase Its Biases