United Nations: Siri and Alexa Are Encouraging Misogyny

Today's voice assistants are facing their own "Me Too" moment.

May 23, 2019, by Kristin Houser
Image by Victor Tangermann

Two-Way Street

We already knew humans could make biased AIs. According to the United Nations, the influence runs the other way, too: biased AIs can make their human users more biased.

Millions of people talk to AI voice assistants such as Apple’s Siri and Amazon’s Alexa. When those assistants talk back, they typically do so in female-sounding voices by default, and a new UN report argues that those voices, and the words they’re programmed to say, amplify gender biases and encourage users to be sexist. It’s not too late to change course, though.

Gender Abuse

The report is the work of the United Nations Educational, Scientific, and Cultural Organization (UNESCO), and its title — “I’d blush if I could” — is the response Siri was programmed in 2011 to give if a user called her a “bitch.”

According to UNESCO, that programming exemplifies the problems with today’s AI assistants.

“Siri’s submissiveness in the face of gender abuse — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products,” the report’s authors wrote.

It was only after UNESCO shared a draft of its report with Apple in April 2019 that the company changed Siri’s response to “I don’t know how to respond to that.”

“Me Too” Moment

The fact that Apple was willing to make the change is encouraging, but that’s just one phrase uttered by one assistant. According to UNESCO’s report, to truly make a difference, the tech industry will need to enact much more comprehensive changes.

A good starting place, according to the authors, would be for tech companies to hire more female programmers and stop making their assistants female by default, instead opting for gender-neutral voices.

“It is a ‘Me Too’ moment,” Saniye Gülser Corat, Director of UNESCO’s Division for Gender Equality, told CBS News. “We have to make sure that the AI we produce and that we use does pay attention to gender equality.”

READ MORE: Is it time for Alexa and Siri to have a “MeToo moment”? [CBS News]

More on biased AI: A New Algorithm Trains AI to Erase Its Biases
