
Predators Are AI-Generating "Really Evil" Child Sex Abuse Images, Experts Warn

by Maggie Harrison Dupré
6.21.23, 4:31 PM EDT

It's an absolute nightmare.

Virtual CSAM

The reality of an AI-driven internet is coming into sharper focus. And that reality, in some cases, is both disturbing and destructive.

According to a report from The Washington Post, experts are finding that AI image generators are being used to create and share troves of child sexual abuse material (CSAM).

Worse yet, the spread of this material may ultimately make it harder for law enforcement to help victims.

"Children's images, including the content of known victims, are being repurposed for this really evil output," Rebecca Portnoff, the director of data science at the nonprofit child-safety group Thorn, told the WaPo, adding that the group has seen a month-over-month growth in AI-generated CSAM since last fall.

"Victim identification is already a needle in a haystack problem, where law enforcement is trying to find a child in harm's way," Portnoff continued. "The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge."

Needle in a Haystack

According to the report, most predators appear to be using open-source image generators such as Stability AI's Stable Diffusion model to create the disturbing imagery.

While Stable Diffusion does ship with a few built-in safety precautions, including a CSAM filter, those safeguards can easily be stripped out with the right know-how and a few lines of code, according to the report.

Identifying these images could prove difficult. Existing systems to stop CSAM were built to detect the proliferation of known images, not newly generated ones.

Layers of Hell

The news is alarming on several levels. It's troubling to see society's worst actors, who have reportedly been sharing tips on forums for creating this content, latch onto the burgeoning tech as a means to engage in abusive and illegal fantasies.

This latest revelation also fits into a broader and growing trend of AI-abetted sexual abuse.

It's a worrisome example of how synthetic content stands to cause real-world harm. And in this case, that harm is wrought on society's most vulnerable.

More on AI and sexual abuse: There's a Problem With That AI Portrait App: It Can Undress People Without Their Consent

