So Gross

True Crime Ghouls Are Using AI to Resurrect Murdered Children

Seriously, do not watch any of these videos.
Noor Al-Sibai
There's a burgeoning TikTok true crime subgenre that uses AI to digitally resurrect child murder victims in disturbing tell-all videos.
Image: Getty Images

Ghoul Kids Club

There is, apparently, a TikTok subgenre of the already-problematic true crime fandom that’s using artificial intelligence to digitally resurrect the victims of heinous crimes and have them tell the stories of how these real-life children were killed.

As Rolling Stone reports, most of these accounts change the appearance — and sometimes the names — of the actual victims when using AI to digitally resurrect them, likely as a means to get around TikTok’s recent rule banning “deepfake” depictions of young people and requiring all use of the tech be labeled as such.

Still, the AI-generated characters purport to tell the stories of terrible crimes that happened to actual children, though they often tweak key details, resulting in an indistinguishable morass of fact and fiction that leaves us wondering what, exactly, the purpose of these videos really is.

“They’re quite strange and creepy,” Paul Bleakley, an assistant professor of criminal justice at the University of New Haven, told Rolling Stone. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.”

Literally Why

While some of these accounts do include disclaimers saying that they avoid real images out of respect for the dead, the effect of watching them is nonetheless the same: uncanny, unsettling videos that were presumably made without the consent of the victims' families.

“Imagine being the parent or relative of one of these kids in these AI videos,” Bleakley told Rolling Stone. “You go online and in this strange high-pitched voice, here’s an AI image [of] your deceased child, going into very gory detail about what happened to them.”

While there is, of course, a legal aspect to this — the professor compared it to new laws cropping up that ban deepfake porn, which he notes is a “very sticky, murky gray area” — on an emotional and ethical level, the whole endeavor feels unconscionable.

There is, however, one small silver lining: both of the accounts mentioned by Rolling Stone have since been taken down, so maybe TikTok's in-house content moderation is working after all.

More on the creepy side of AI: Programmer Creates Grim Tool to Clone Anyone as an “AI Girlfriend”

Noor Al-Sibai

Senior Staff Writer

I’m a senior staff writer at Futurism, where my work covers medicine, artificial intelligence and its impact on media and society, NASA and the private space sector.