"When the scammer calls you, he'll sound just like your loved one."

Clone Con

Anecdotal accounts already abounded of people losing money to scammers cloning the voices of their relatives. Now, it's apparently become a prevalent and serious enough issue that federal regulators feel the need to step in.

On Monday, the US Federal Trade Commission (FTC) published a consumer alert on emerging voice cloning scams, warning people that their desperate friend or relative on the other end of the phone asking for money may actually be an AI simulacrum of their voice wielded by a scammer.

"All [a scammer] needs is a short audio clip of your family member's voice — which he could get from content posted online — and a voice-cloning program," the FTC wrote.

"When the scammer calls you, he'll sound just like your loved one."

Grandma Grifter

Here at Futurism, we've covered how these voice cloning scams have recently targeted senior citizens. In one case, the ostensible voice of a grandkid asked their grandma for bail money.

In another scam, a caller claiming to be a lawyer representing an elderly couple's son put the "son" on the line, who asked his parents to send Bitcoin to cover legal fees.

As you might've guessed, both proved to be nothing more than nefarious voice cloning.

Seniors in Canada appear to be a particularly vulnerable demographic, with CBC News reporting that scammers targeted at least eight senior citizens with these voice cloning scams in just a three-day window, making away with a total of $200,000 CAD.

No One's Safe

Voice cloning scams have existed for years now, used in several high-profile corporate swindles.

But it's only recently that these scams have begun to target the average Joe, thanks to the rise of cheap and easy-to-use generative AI tools available to consumers.

"Before, it required a sophisticated operation," Subbarao Kambhampati, a professor of computer science at Arizona State University and an expert in AI, told NPR. "Now small-time crooks can use it."

Notably, an AI voice synthesis tool from ElevenLabs has come under the spotlight, both for its ability to impersonate presidents and celebrities and for its use in breaking into bank accounts. Using it requires little to no technical expertise, making it easier than ever to clone someone's voice, so long as a scammer has samples of it, which can easily be gleaned from social media.

To protect yourself from these scams, the FTC recommends reaching out to the supposed relative using a phone number you know belongs to them. And be wary if the voice asks you to send money through hard-to-trace means like cryptocurrency, gift cards, or wire transfers.

More on voice cloning: Voice Actors Enraged By Companies Stealing Their Voices With AI

