Future robocallers might sound just like your friends and family.

Call Hating

Robocalls are on the rise, with nearly a third of all phone calls now made by automated dialing machines programmed to play a prerecorded message if someone answers.

Now, experts are predicting that the annoying calls are destined for a high-tech — and highly disturbing — next level of evolution: scammers using voice-mimicking AIs to make their pre-recorded messages sound like your friends and family.

Who's This?

In a story published on Saturday, academic Tarun Wadhwa told CNN that scammers could one day use the swaths of data available online to find out who a person is close to and then use that information to create robocall messages that sound like someone the target knows.

"It's going to be like Photoshop — something so easy, widespread, and well known that we stop tracking how it's being used against people personally and don't find it surprising," Wadhwa said, adding that he could "easily imagine situations in which these sorts of voice-mimicry technologies are used to sow confusion, extort people, and make fraud and scams far more precise."

The Hang Up

According to Alex Quilici, CEO of robocall-prevention app YouMail, we thankfully don't have to worry about robocallers pretending to be our grandmothers just yet.

"Building a fake computer voice right now is a decent amount of work," he told CNN. "If I wanted to build one that sounded like you, for example, I'd need to get a ton of samples of you saying specific phonemes, and train a computer model on that."

READ MORE: The frightening future of robocalls: Numbers and voices you know [CNN]

More on robocalls: Rise of the Robocallers: Here’s How We’ll Avoid a Future of Scammers

