It's still actively promoting pedophilic content.
Basic Issues
Facebook's parent company Meta is seriously struggling to keep pedophiles off its social media platforms.
Worse yet, as The Wall Street Journal reports, Instagram and Facebook have in some instances actively promoted pedophile accounts, compounding an already dangerous situation.
Earlier this year, the newspaper teamed up with researchers at Stanford and the University of Massachusetts Amherst and found that Instagram's recommendation algorithms were connecting and promoting a sprawling network of accounts distributing underage sex content.
At the time, Meta formed a child-safety task force to address the glaring issue. But five months on, the WSJ found, the company still has a lot of work to do: its platforms are still actively promoting pedophilic content.
Unmet Expectations
And it's not just Instagram. There are still entire Facebook groups dedicated to sharing content that sexualizes children.
Despite Meta's best efforts, which included having its task force of more than 100 employees ban pedophile-related hashtags, the problem remains massive and will require drastic changes to fix.
The sheer scale of the issue is damning. The Canadian Centre for Child Protection came across Instagram accounts with as many as 10 million followers that livestreamed videos of child sex abuse.
Other Facebook groups with hundreds of thousands of users still actively celebrate incest and sex with children, per the WSJ's own investigation.
Perhaps most astonishingly, even after the newspaper flagged these groups, Meta claimed they didn't violate the company's "Community Standards," despite the word "Incest" appearing in the name of one Facebook group.
So what's the solution? Meta said it's still refining the tools that limit the spread of pedophilic accounts and, to its credit, has finally started disabling accounts whose suspicious behavior exceeds a certain threshold. The company claims to have removed 16,000 accounts for violating its child safety policies since July, per the report.
But limiting the tools that algorithmically recommend content to users on Meta's platforms simply isn't in the cards, a spokesperson told the WSJ. After all, those recommendations likely drive a significant share of the company's revenue.
Making matters worse, the company has also been laying off huge numbers of workers, including staff responsible for reviewing suspected inappropriate sexual content.
In short, despite being caught red-handed, Meta has seemingly done very little to address the issue — and that doesn't bode well, considering the company's ever-shrinking headcount and already struggling content moderation teams.
More on Meta: Zuckerberg's Metaverse Is Bleeding Billions of Dollars, Documents Reveal