The Wall Street Journal called it “the worst job in technology” in 2017.
Content moderators at Facebook have the gruesome job of sifting through hundreds of videos of violent murders, hate speech, and even suicides — and that’s bound to take a heavy toll.
On Friday, two former Facebook content moderators signed on to a lawsuit in a California superior court, alleging that they also suffered from symptoms of post-traumatic stress disorder and psychological trauma, CNET reports.
The original lawsuit dates back to September, stating that contractors have to view thousands of “videos, images and live-streamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder” every day, according to an official press release — and Facebook is not doing enough to protect them.
“This case has uncovered a nightmare world that most of us did not know about,” Steve Williams, a lawyer for the firm representing the content moderators, said in a statement, as quoted by CNET. “The fact that Facebook does not seem to want to take responsibility, but rather treats these human beings as disposable, should scare all of us.”
Facebook has some 15,000 content reviewers, none of whom actually work directly for Facebook; instead, they are employed through third-party contractors like Accenture and Cognizant.
Friday’s news comes after The Verge reported on the horrible and traumatic working conditions for content moderators at the social media company.
“Part of the reason I left was how unsafe I felt in my own home and my own skin,” an unnamed employee told The Verge, adding that they started carrying a gun to protect themselves after being accosted by other employees.
Others resorted to doing drugs or even having sex as a way to cope with the trauma. “I can’t even tell you how many people I’ve smoked with,” one employee told The Verge.
In a November 2018 court filing, Facebook argued that the original lawsuit filed in September should be dismissed.
Bloomberg reported this week that Facebook is working with Accenture, a staffing firm that employs many of Facebook’s content moderators, to ensure that their practices comply with Facebook’s policies.
Messages circulating via internal message boards tried to dispel concerns over the working conditions. In a post on Feb. 25, Justin Osofsky, VP of Global Operations, wrote: “We’ve done a lot of work in this area and there’s a lot we still need to do.”
“After a couple of years of very rapid growth, we’re now further upgrading our work in this area to continue to operate effectively and improve at this size,” he added.
But whether Facebook’s actions will be enough is still uncertain.
READ MORE: Facebook faces complaints from more former content moderators in lawsuit [CNET]
More on content moderators: Facebook Mods Are so Traumatized They’re Getting High at Work