Disturbing Content

YouTube Content Moderator Sues Over Psychological Trauma

by Dan Robitzski
September 22, 2020
Image credit: Victor Tangermann

She claims she had to watch videos of beheadings, shootings, child abuse and more.

Workplace Trauma

YouTube is under fire: a new lawsuit claims the company failed to support the content moderators tasked with watching and removing inappropriate or violent videos uploaded to the site.

The lawsuit was filed Monday by a former content moderator who says she had to watch videos of beheadings, shootings, child abuse, and other disturbing content, according to NBC News. As a result — and because YouTube didn’t offer medical support — she experienced nightmares, panic attacks, and found herself unable to stay in crowded areas.

Overworked, Undersupported

The anonymous plaintiff, who like other moderators was staffed by a third-party agency, alleges that moderation teams were understaffed to the point that workers often had to exceed the recommended four hours per day of scanning violent videos, NBC reports. That comes out to between 100 and 300 videos per day, with very little room for error.

Meanwhile, the suit claims, the YouTube “Wellness Coaches” available to content moderators only worked limited hours and weren’t actually licensed to offer medical advice, leaving moderators to find and pay for their own mental healthcare.


Big Problem

The lawsuit echoes similar content moderation suits filed against Facebook, further illustrating the tension over how the tech industry polices user-generated content.

The problem isn’t going away any time soon. YouTube plans to rely more heavily on human moderators after its algorithmic content moderation system made too many errors, the Financial Times reports.

At the same time, it’s tough to prescribe a better strategy for tech giants. Sure, they could hire their content moderators directly rather than through third-party agencies, pay them well, and provide them with benefits and mental health support. But artificial intelligence isn’t yet good enough to replace human judgment in policing horrific content — so, for now, someone has to do it.

READ MORE: Former YouTube content moderator describes horrors of the job in new lawsuit [NBC News]


More on content moderation: Facebook Moderators Are Dying at Their Desks
