"At the end of the day, if they’re not real, who really cares?"

AI Nudes

Folks have begun using artificial intelligence software to generate explicit images of people who don't exist — and, believe it or not, it seems to be a seller's market.

"The average person who’s looking at this stuff, I don’t think they care," one such creator, who's sharing explicit images of adult women wearing diapers, told the Washington Post. "I don’t expect the person I’m looking at online to be the person they say they are. I’m not going to meet this person in real life."

"At the end of the day, if they’re not real, who really cares?" the creator added.

While deepfake porn has long been a problematic staple of the adult industry, recent and rapid advancements in AI image generation have made it easier than ever to produce believable images of people who don't exist. Unsurprisingly, that's been a boon for those looking to make money selling "nudes."

Say It AI'nt So

As the report notes, Reddit is home to plenty of purported adult models whose images, according to AI experts and in some cases the people behind the accounts, show telltale signs of being AI-generated, such as birthmarks that appear in some photos but not in others.

While some creators think the people who consume their AI-generated porn don't mind that it's fake, others who interacted with likely AI-generated porn avatars sang a different tune.

One Reddit user told the WaPo that they "feel a bit cheated" after learning from reporters that Claudia, the fake porn model in question, was probably not who she was pretending to be.

They Admit It

As Rolling Stone confirms, the Claudia account is run by two computer science students who started the project as a joke. The pair created the avatar using the AI image generator Stable Diffusion, and said they made about $100 from unsuspecting users on subreddits like r/NormalNudes and r/AmIHot before getting rightfully called out for AI catfishing.

"You could say this whole account is just a test to see if you can fool people with AI pictures," the students, who declined to give their real names, told Rolling Stone. "We honestly didn’t think it would get this much traction."

Expectations aside, these inventive students have unwittingly waded into some pretty murky territory, and there's no doubt that things are only going to get more confusing going forward.

More on image generation: The Company Behind Stable Diffusion Appears to Be At Risk of Going Under

