As a content moderator for Facebook, it was literally Dublin resident Chris Gray's job to watch child abuse, animal torture, and executions — disturbing imagery that left his mental health in shambles.
"You would wake up and you’re remembering the video of someone machine-gunning people in the Middle East somewhere," he told The Guardian, "trying to think whether there was an ISIS flag, and so whether it should be marked as terrorism-related or not."
"It took me a year after I left to realize how much I’d been affected by the job," Gray continued. "I don’t sleep well, I get in stupid arguments, have trouble focusing."
A doctor diagnosed Gray with post-traumatic stress disorder as a result of his 10 months working for CPL Resources, one of the firms Facebook pays to handle its content moderation — and now, he's spearheading a lawsuit against Facebook Ireland and CPL for causing that psychological trauma.
"There are 40,000 people doing this shit," Gray told Vice News, a reference to the number of both contractors and direct employees moderating content for Facebook. "If I can get them better working conditions, better care, then that also improves the quality of the content moderation decisions and the impact on society."
Gray is one of 12 former content moderators listed as plaintiffs in the lawsuit. Another, Sean Burke, also spoke to Vice about how the experience affected his mental health.
"I've had to go on antidepressants because of working on the job," Burke told Vice. "At times I was solving my problems with alcohol to get to sleep because at least I wasn't dreaming when I slept after having a few drinks on me."
In March, two former moderators filed a similar lawsuit against Facebook in California, alleging that their work for the company had caused them psychological trauma.
However, as Vice pointed out, this latest lawsuit could cause bigger problems for Facebook, because Europe's workplace-safety rules are stricter than California's.
Additionally, Irish courts might be more willing than United States courts to force Facebook to disclose key details about the content that moderators view, which could help the plaintiffs' case that the company should be doing more to protect their mental health.
"They are going to need to disclose just how much toxic content people are exposed to on a daily basis," Cori Crider, director of United Kingdom-based advocacy group Foxglove, which is assisting the plaintiffs in the lawsuit, told Vice. "How many beheadings? How much child pornography? How much animal torture? It's awful, but this is real stuff moderators are dealing with all the time."
READ MORE: Bestiality, Stabbings and Child Porn: Why Facebook Moderators Are Suing the Company for Trauma [Vice]
More on Facebook moderation: Former Content Moderators Are Suing Facebook Over PTSD and Trauma