He transferred a whopping $622,000 to the scammer.

Skin Walker

Cops say a scammer in northern China has managed to cheat a man out of his money, Reuters reports, by masquerading as his friend with the use of AI-powered face-swapping and voice-cloning software.

If the tale holds up to scrutiny, the incident would go to show how easy it has become for scammers to deepfake their way to a payday using new AI tools.

While we've come across examples of scammers using cloned voices to extort money from their victims over the phone, this new incident offers a glimpse of an uneasy future in which a perpetrator can even take on the physical appearance of a victim's friend. As such, it raises an old specter of modernity: is technology drawing us closer, or tearing us apart?

Scam Link

According to law enforcement in the city of Baotou, in Inner Mongolia, the scammer posed as the victim's friend during a video call and convinced him to transfer $622,000 to their account, claiming the money was needed for a deposit during a bidding process.

Shortly after, the victim contacted his real friend, who had no idea about the conversation.

Fortunately, the victim was able to recover most of the stolen funds, according to Reuters, and is in the process of tracking down the rest.

AI-powered scams are on the rise worldwide, with the US Federal Trade Commission (FTC) publishing a consumer alert on emerging voice cloning scams earlier this year. Seniors in particular are increasingly becoming targets of scammers who use voice-cloning tech to impersonate relatives, saying they need money for emergencies.

Last month, a mom even got a call from the purported kidnapper of her daughter, with what sounded like her daughter's cloned voice in the background.

Griftpocalypse Now

It's a troubling new trend worldwide. Users on the Chinese social media network Weibo expressed worry about AI scams "exploding across the country."

In response, regulators have been attempting to crack down on fraud by increasing scrutiny of deepfake apps that can change somebody's voice and appearance, Reuters reports.

But whether these actions will allow law enforcement to stay on top of the trend remains to be seen. As the tech improves, cloned voices and faces are only going to become more believable.

More on AI scams: Mom Says Creeps Used AI to Fake Daughter's Kidnapping
