The victim of the deepfake was almost always a female actor or musician.

Fake Nudes

On Monday, cybersecurity firm Deeptrace released "The State of Deepfakes," a report that takes a deep dive into the AI-doctored videos many say threaten to destroy democracy.

But by analyzing all 14,678 deepfakes it could find online, the company discovered that the people creating the clips seem far less concerned with manipulating elections — and really, really into the idea of inserting actresses and musicians into porn.

Face Swap

According to the report, 96 percent of the manipulated videos fell into the category of "non-consensual deepfake pornography" — in other words, the deepfakes' creators replaced the face of an actress in a pornographic video with that of another woman.

Usually, the targets are celebrities. The report noted that in all but 1 percent of those porn deepfakes, the person swapped in was an actress or a female musician.

Money Shot

The deepfake pornography videos aren't just strewn about the internet, either. The report's authors discovered a whole "established ecosystem" of sites hosting the clips — and monetizing them.

"The fact that those websites all contained advertising and there was a clear financial or business incentive for running these websites is also very important to recognize," lead author Henry Ajder told Fortune, "because it shows they aren’t going away any time soon."

READ MORE: Most Deepfakes Are Used for Creating Non-Consensual Porn, Not Fake News [Vice]

More on deepfakes: Deepfake Pioneer: “Perfectly Real” Fake Vids Are Six Months Away
