It's not just visual artists who are feeling the heat of AI's encroachment — now, professional voice actors are beginning to be affected, too.

Last week, Vice reported on a troubling trend gaining traction in the voice acting industry: actors being "asked," sometimes less than honestly, to sign contracts that let clients synthesize their voices with AI, allowing those clients to use an actor's voice for as long as they want, to say whatever they want, and often without any additional compensation.

Another distressing aspect of these contracts? The AI clauses in them tend to be deceptively embedded.

"The language can be confusing and ambiguous," Tim Friedlander, president of the National Association of Voice Actors, told Vice, describing the practice as "very prevalent."

"Many voice actors may have signed a contract without realizing language like this had been added ... Some actors are being told they cannot be hired without agreeing to these clauses," he noted.

In the wake of the news, a number of voice acting heavyweights chimed in on the use of AI to mimic their voices, as Gizmodo spotted. Some had only just discovered that their voices were being synthesized on AI apps and websites without their permission (though they have not disclosed which specific platforms).

"Hey friends, I know AI technology is exciting, but if you see my voice or any of the characters that I voice offered on any of those sites, please know that I have not given my permission, and never will," wrote Steve Blum, the iconic gravelly voice behind Spike Spiegel from the hit anime series "Cowboy Bebop," in a tweet on Friday.

"This is highly unethical," he added.

Many other notable voice actors, like Matthew Mercer, Stephanie Sheh, and Cristina Vee, echoed Blum's sentiment.

"I know people have been using AI of my voice to have fun, make my characters cuss or do other out of pocket things, etc," Vee tweeted. "This is all done without my consent and it does feel extremely weird. If you see clips out there, please know it’s without my permission."

For now, voice actors can still choose whether to sign these contracts, assuming they realize what kind of devilish deal they're getting into in the first place.

But it's undeniable that if this becomes common practice, the pressure to sign these contracts will quickly become overwhelming when the choice is between paid work and no work at all.

The monetary implications for voice actors are worrying enough on their own, but the prospect of their voices being used in content they never consented to be a part of is even more disturbing.

"What happens when we happily agree to a role, and, once in the booth, we see a particular line in the script that doesn't feel right, and express unambiguous discomfort? " asked voice actor Sarah Elmaleh, in her comments to Vice. "What happens if the producer doesn't comprehend or accept the seriousness of that objection?"

"Normally, we are able to refuse to read the line, to prevent it from being used," Elmaleh added. "This technology obviously circumvents that entirely."

More on generative AI: David Guetta Faked Eminem’s Vocals Using AI for New Song

