Pervert Platform

On Sunday, YouTube personality Matt Watson posted a video to the site exposing what he called a "soft-core pedophilia ring." Based on his research, pedophiles are congregating on suggestive videos of children — and, once connected, using the comments sections to share contact info and links to child pornography.

Watson's video quickly generated more than 2 million views, and major brands including Disney, Nestle, and Epic Games rushed to pull their advertisements from YouTube — the latest shocking example of how difficult it is for Silicon Valley platforms to police their own communities at scale, even in the face of egregious abuse.

Nefarious Network

Many of the videos drawing these YouTube pedophiles don't violate any of the platform's policies — they might feature young girls playing games or doing gymnastics. Others feature illegal content, such as one video in which a prepubescent girl flashes her bare pubic area.

The comments sections of the videos feature lewd messages, suggestive emoji, and timestamps to the moments most likely to appeal to pedophiles. Some commenters leave questions for the children in the videos, and horrifically, the children occasionally respond.

Watson claims in his video that YouTube's algorithms facilitate this network of pedophiles. Watch one video of children innocently playing, and the recommendation algorithm will suggest more videos that, judging by their comments sections, are popular with the same pedophiles.

YouTube's search function plays a role as well — a WIRED story noted that the site's autocomplete will add the words "young" and "hot" if a user searches the phrase "girl yoga."

Disturbed Déjà Vu

Since the release of Watson's video, YouTube has shared the following statement with various media outlets: "Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments."

A number of brands have pulled their advertising from YouTube, too, issuing statements denouncing child exploitation. That's great. But advertisers have staged the same kind of boycott before, and the problem persists.

That's due at least in part to how tough this problem is to solve. Every minute, users upload 450 hours of content to YouTube, and the combination of human reviewers and machine learning systems the company currently uses to review that content clearly isn't capable of preventing illegal material from slipping through.

A major loss of advertisers over a long period of time could spur YouTube to find a better system for filtering out this content — assuming one exists — and government fines for facilitating the spread of child pornography might provide additional incentive.

And what about the content that doesn't violate any rules, like videos of children playing in their backyards? Should YouTube ban all videos featuring minors? How would it enforce that? Should it disable its search autocomplete so that problematic videos are harder for pedophiles to find?

These are hard questions, but YouTube has been grappling with child exploitation on its platform since at least 2013. Clearly, the policies it has already enacted to address the problem aren't enough — and strongly worded statements about how "abhorrent" it finds this content won't be enough either.

READ MORE: Advertisers Boycott YouTube After Pedophiles Swarm Comments on Videos of Children [The New York Times]

More on YouTube: New Research: YouTube Caused the “Flat Earther” Epidemic
