All hail the security question!

Sotto Voce

A scammer attempting to trick Ferrari executives was thwarted by a stalwart safety measure: common sense.

As Bloomberg reports, the scammer in question reached out to a Ferrari executive via WhatsApp earlier this month. Messaging from an account with the name and profile of Ferrari CEO Benedetto Vigna — though not from the real CEO's usual number, of course — they attempted to convince the C-suiter that a major acquisition was soon to be underway.

"Hey, did you hear about the big acquisition we're planning? I could need your help," the Vigna impersonator wrote, adding that they should be ready to sign a non-disclosure agreement ahead of the big deal.

"Italy's market regulator and Milan stock exchange have been already informed," the scammer continued. "Stay ready and please utmost discretion."

That's when things got even shadier, according to Bloomberg. The unnamed executive then jumped on the phone with the phony Vigna, who used a deepfaked voice to "speak" in a live conversation with the targeted scam-ee. In Vigna's voice, the scammer fibbed that the CEO was using a super-secret number to conduct super-secret business, hence the strange messages from the new number. Still not quite convinced, the executive asked the scammer a question only the real Vigna could answer: what book did the exec just lend to his high-powered boss?

And with that, the scammer hung up the phone. The enduring security question tactic stands strong!

Scamly Reunion

AI-powered deepfakes of human voices and even faces are getting increasingly convincing, which means it's getting harder to tell whether someone is real just by scrutinizing how they look or sound. So with deepfake-abetted scam sprees on the rise, the Ferrari executive's time-tested strategy is a great example of how to defend yourself against a similar scam (and yes, this stuff can happen to anyone, not just executives of billion-dollar Italian companies).

In the same vein, security experts have encouraged families to come up with secret "code words" to use with each other in case, say, a scammer impersonates a family member in a ploy to abscond with some cash.

While the attempted Ferrari scammer was foiled, other corporate executives should remain vigilant. Earlier this year, a CEO was fooled into handing over tens of millions of dollars to deepfake-assisted scammers. And in an incredible twist of irony, the security firm KnowBe4 recently revealed that it was tricked into hiring a North Korean hacker as a remote worker, who used an AI-generated headshot to conceal their identity.

In other words, scammers stay scamming, and they're using AI to do it. Stay safe out there, kids!

More on AI scams: Security Firm Alarmed to Discover Their Remote Employee Is a North Korean Hacker

