Extremely Disturbing
Civitai, one of the internet's largest AI platforms, is incentivizing users to make deepfakes of real people, a disturbing report from 404 Media has revealed.
Per 404, the AI model marketplace — which has reportedly received millions in funding from Andreessen Horowitz's a16z fund — recently introduced a "bounties" feature, a monetized system in which users compete to build LoRA image models in exchange for digital currency. Basically, a user posts the "bounty," a call for an AI model that can generate hyperspecific images. Other Civitai users attempt to build the requested model, and the original bounty poster chooses a winner. That winner gets paid in something called "Buzz," virtual cash that can be purchased from Civitai with real money.
Civitai is already known as a platform where nonconsensual pornographic deepfakes are easily created and disseminated. It sadly comes as no surprise, then, that the bulk of these bountied creations appear to be nonconsensual pornographic imagery, almost entirely of women.
According to 404's reporting, this includes unwanted deepfakes of public figures like celebrities and influencers, as well as at least one person with no semblance of a public presence — the posted bounty, according to 404, included just a handful of photos taken from the person's social media accounts. It's a troubling sign of the times, given the availability of the tech, the brazenness of its misuse, and the readiness of major financial bodies to fund a platform like this regardless.
The Lowest Bar
To be clear: nonconsensual deepfaked porn of anyone, be it a celebrity or a completely private individual, is wrong and disturbing. But the fact that regular, non-public people are at risk is alarming in its own right; indeed, in a burgeoning internet era dominated by easy-to-use generative AI tools, it increasingly feels like everyone's control over their public image is quickly slipping away.
Worse, there are currently few avenues of recourse for victims. Civitai, which has confusing rules around sexual imagery to begin with, has stated that its bounties tool shouldn't be used for deepfaked pornography. But Civitai doesn't appear to be expending many enforcement resources, and though victims can report this kind of material themselves, that sets the bar incredibly low for the platforms that host — and, in this case, monetize — the content, not to mention the funding bodies powering the platform in the background.
"I am very afraid of what this can become," Michele Alves, an Instagram influencer with an unwanted Civitai bounty, told 404. "I don't know what measures I could take, since the internet seems like a place out of control."
More on nonconsensual AI-generated porn: Google Is Sending Users Straight to Nonconsensual Deepfake Porn