The News Today
12.12.19
Gender Woes

Google: We Weren’t Being Sexist Giving Assistant a Female Voice

September 19th, 2019 | Kristin Houser | Filed Under: Artificial Intelligence
Image credit: Google/Victor Tangermann

Ladies First

In May, the United Nations released a troubling report, arguing that female-sounding voices for AI assistants such as Apple’s Siri and Amazon’s Alexa perpetuate gender biases and encourage users to be sexist.

Now, Google has come out to explain why it chose to give its Assistant a female-sounding voice — and the search giant says it has nothing to do with gender biases and everything to do with the available technology.

Tech Troubles

According to Google Assistant product manager Brant Ward, Google initially planned to launch Assistant with a male voice. The problem, he told Business Insider, was that the audio produced by text-to-speech systems was easier to understand if delivered in a higher-pitched, female-sounding voice.

“At the time, the guidance was the technology performed better for a female voice,” Ward said. “The [text-to-speech] systems in the early days kind of echoed [early systems] and just became kind of more tuned to female voices.”

The Voice

Thankfully, text-to-speech tech has come a long way since then, meaning Assistant can now offer a wider range of voices.

In fact, Google just added new voices for the AI in nine languages on Wednesday, using colors rather than names to differentiate between them. English-speaking users, meanwhile, can already choose from 11 different voices for Assistant — including that of crooner John Legend.

READ MORE: A UN study suggested it’s sexist for voice assistants like Siri or Alexa to have female voices — but Google says it wanted to use a male voice from the very beginning [Business Insider]

More on gender bias: United Nations: Siri and Alexa Are Encouraging Misogyny