Veritone / Futurism
Self-Automation

Startup Lets Checked-Out Influencers Deepfake Themselves for Product Endorsements

by Dan Robitzski
May 14

Why bother recording a commercial or cameo when a deepfake can do it for you?

Influencer Automation

A startup called Veritone is launching a new AI platform that will let celebrities, influencers, and other prominent figures create, control, and license deepfakes of their own voice.

The idea is that these self-deepfakers may want to leverage their celebrity status to make more money recording commercials, endorsements, or any other monetizable audio but simply don’t have the time. But with Veritone’s Marvel.AI, they can license out the sound of their own voice without ever entering a studio.

“People want to do these deals but they don’t have enough time to go into a studio and produce the content,” Veritone president Ryan Steelberg told The Verge. “Digital influencers, athletes, celebrities, and actors: This is a huge asset that’s part of their brand.”

Reclaiming Clones

It sounds mildly silly to approach advertisers and say "hey, we can't get you this celebrity, but how about the next best thing: an artificial intelligence mimicking them as best it can?"


However, if deepfakes are going to exist at all, it's nice to at least see them created and used on the subject's own terms rather than as a form of misinformation or exploitation, as all too commonly happens with AI mimicry technology. And, as the recordings that Steelberg shared with The Verge show, the technology does seem to work decently well.

Transparently Deceptive

The problem with deepfakes, even when used for legitimate purposes, is that there’s a degree of deception inherent to every synthesized recording.

When it comes to advertising, Steelberg told The Verge that he's worried that deception might cheapen the synthesized sound bites, or leave a bitter taste in the audience's mouth when they learn they're only hearing an imitation. So by working in the advertising space, he's trying to establish a new standardized signal, perhaps a tone that plays before an audio deepfake, to let listeners know what's going on.

“It’s not just about avoiding the negative connotations of tricking the consumer,” Steelberg said, “but also wanting them to be confident that [this or that celebrity] really approved this synthetic content.”


READ MORE: Veritone launches new platform to let celebrities and influencers clone their voice with AI [The Verge]

More on deepfake tech: Professor Warns of “Nightmare” Bots That Prey on Vulnerable People


