Beyond Disturbing

Former Content Moderators Sue Facebook for Giving Them PTSD

"You would wake up, and you’re remembering the video of someone machine-gunning people in the Middle East somewhere..."

12.5.19 by Kristin Houser
Image by Victor Tangermann

As a content moderator for Facebook, it was literally Dublin resident Chris Gray’s job to watch child abuse, animal torture, and executions — disturbing imagery that left his mental health in shambles.

“You would wake up and you’re remembering the video of someone machine-gunning people in the Middle East somewhere,” he told The Guardian, “trying to think whether there was an ISIS flag, and so whether it should be marked as terrorism-related or not.”

“It took me a year after I left to realize how much I’d been affected by the job,” Gray continued. “I don’t sleep well, I get in stupid arguments, have trouble focusing.”

A doctor diagnosed Gray with post-traumatic stress disorder as a result of his 10 months working for CPL Resources, one of the firms Facebook pays to handle its content moderation — and now, he’s spearheading a lawsuit against Facebook Ireland and CPL for causing that psychological trauma.

“There are 40,000 people doing this shit,” Gray told Vice News, a reference to the number of both contractors and direct employees moderating content for Facebook. “If I can get them better working conditions, better care, then that also improves the quality of the content moderation decisions and the impact on society.”

Gray is one of 12 former content moderators listed as plaintiffs in the lawsuit. Another, Sean Burke, also spoke to Vice about how the experience affected his mental health.

“I’ve had to go on antidepressants because of working on the job,” Burke told Vice. “At times I was solving my problems with alcohol to get to sleep because at least I wasn’t dreaming when I slept after having a few drinks on me.”

In March, two former moderators filed a similar lawsuit against Facebook in California, alleging that their work for the company had caused them psychological trauma.

However, as Vice points out, this latest lawsuit could cause bigger problems for Facebook, because Europe's workplace-safety rules are stricter than California's.

Additionally, Irish courts might be more willing than United States ones to force Facebook to disclose key details about the content that moderators view, which could help the plaintiffs' case that the company should be doing more to protect their mental health.

“They are going to need to disclose just how much toxic content people are exposed to on a daily basis,” Cori Crider, director of United Kingdom-based advocacy group Foxglove, which is assisting the plaintiffs in the lawsuit, told Vice. “How many beheadings? How much child pornography? How much animal torture? It’s awful, but this is real stuff moderators are dealing with all the time.”

READ MORE: Bestiality, Stabbings and Child Porn: Why Facebook Moderators Are Suing the Company for Trauma [Vice]

More on Facebook moderation: Former Content Moderators Are Suing Facebook Over PTSD and Trauma


