Will this be enough to keep child abuse off Pornhub?

No Consent

In an effort to combat non-consensual content — up to and including sexual videos of minors — on its platform, Pornhub is making sweeping changes to its operations, Vice reports.

The website will ban downloads of any content, restrict uploads only to verified users, and expand its moderation process. That’s a stark departure from Pornhub’s previous policy of allowing any user to upload content without verification.

Content Partners

The news comes after a damning opinion piece published by The New York Times described a much darker side to a smut site that has often prided itself on making charitable donations.

“Effective immediately, only content partners and people within the Model Program will be able to upload content to Pornhub,” a statement reads.

The platform is also hoping that new “fingerprinting technology” will block re-uploads of content that has already been hosted on and flagged by the site.

Red Team

Pornhub will also establish a dedicated “Red Team” tasked solely with “proactively sweeping content already uploaded for potential violations and identifying any breakdowns in the moderation process that could allow a piece of content that violates the Terms of Service.”

The company will also use Google’s Content Safety API, an AI tool, to help detect illegal imagery.

Pornhub is also establishing a “Trusted Flagger Program” that will allow 40 non-profit child safety organizations to “have a direct line of access to our moderation team.”

READ MORE: In a Huge Policy Shift, Pornhub Bans Unverified Uploads [Vice]

More on content moderation: Peloton — Yes, the Exercise Bike — Is Fighting Hate Speech on Its Platform
