Lawmakers on both sides of the aisle are calling for stronger laws on nonconsensual AI-generated pornography after superstar Taylor Swift became the disgusting practice's latest victim.

"Intimate deepfake images like those targeting Taylor Swift are disturbing, and sadly, they’re becoming more and more pervasive across the internet," Rep. Joe Morelle, a New York Democrat who earlier this month introduced a bill that would outlaw nonconsensual AI-generated porn, told Futurism in a statement. "I’m appalled this type of sexual exploitation isn’t a federal crime—which is why I introduced legislation to finally make it one."

Though Swift is now the most famous person to be targeted by these repulsive faux-smut peddlers, whose fabricated images of the singer spread far and wide online this past week, she's far from the only one.

Indeed, in a press release announcing the bill, Morelle cited the story of Francesca Mani, a 14-year-old New Jersey girl who, along with several of her classmates, was the victim of this kind of gross image manipulation last fall.

"Just because I'm a teenager doesn't mean my voice isn't powerful," Mani said in the congressman's announcement. "Staying silent? Not an option. We are given voices to challenge, to speak up against the injustices we face. What happened to me and my classmates was not cool, and there's no way I'm just going to shrug and let it slide."

While the NJ teen's case is particularly appalling because of her age, the harm these kinds of AI-spoofed images cause doesn't end when girls turn 18.

"Try to imagine the horror," Morelle said, "of receiving intimate images looking exactly like you — or your daughter, or your wife, or your sister — and you can’t prove it’s not."

The new bill, titled the "Preventing Deepfakes of Intimate Images Act," would, if passed, "finally make this dangerous practice illegal and hold perpetrators accountable," Morelle said.

A representative for the New York congressman told Futurism that the bill is co-sponsored by Rep. Tom Kean, a New Jersey Republican, and that it currently has 20 co-sponsors and counting from both parties.

While this isn't the first attempt to regulate so-called "deepfake" imagery, meaning images or videos manipulated to make a specific person appear to say or do things that never actually happened, this bill stands apart because it targets the sexual exploitation at the heart of the degrading phenomenon of deepfaked porn.

"The images may be fake, but their impacts are very real," the New York congressman said. "Deepfakes are happening every day to women everywhere in our increasingly digital world, and it’s time to put a stop to them. My legislation would be a critical step forward in regulating AI and ensuring there are consequences for those who create deepfake images."

More on AI abuse: George Carlin Estate Sues Podcasters Over AI "Comedy Special"
