Moderating content for Facebook is traumatic. That's not an opinion — it's a fact.

Thousands of people spend their work days deciding whether posts violate Facebook's content policies. And a growing number have spoken to the media about the terrible toll of viewing countless images and videos depicting violence, sexual abuse, child pornography, and torture. In March 2018, one moderator in Tampa, Florida, died at his desk.

That man, Keith Utley, was employed by a firm called Cognizant, which had reportedly signed a two-year, $200 million contract with Facebook to keep the platform free of objectionable content. Now, in a major blow to Facebook's moderation strategy, Cognizant has announced it will cut ties with the social media company when that contract runs out.

"We have determined that certain content work in our digital operations practice is not in line with our strategic vision for the company, and we intend to exit this work over time," Cognizant told BBC News. "This work is largely focused on determining whether certain content violates client standards — and can involve objectionable materials."

"In the meantime, we will honor our existing obligations to the small number of clients affected and will transition, over time, as those commitments begin to wind down," the firm later added. "In some cases, that may happen over 2020, but some contracts may take longer."

BBC News reported that the decision would lead to the loss of an estimated 6,000 jobs and affect both the Tampa moderation site and one in Phoenix, Arizona.

"We respect Cognizant's decision to exit some of its content review services for social media platforms," Facebook's Arun Chandra told BBC News. "Their content reviewers have been invaluable in keeping our platforms safe — and we'll work with our partners during this transition to ensure there's no impact on our ability to review content and keep people safe."

Cognizant isn't Facebook's sole source of content moderators — the company has 20 review sites employing approximately 15,000 people across the globe. But even that army of moderators hasn't been enough to prevent policy-violating content from slipping through the cracks.

Perhaps most notably, an Australian man used Facebook to livestream an assault on two mosques in New Zealand in March that led to the deaths of 51 people. Not only did the bloody video rack up thousands of views before Facebook's moderators took it down, but the company struggled to remove copies of the footage from its platform in the aftermath of the slaughter.

If Facebook considers its platform "safe" now, it's hard to imagine what it could look like if the social network doesn't quickly replace the Cognizant employees who currently make up more than a third of its moderation workforce.

Editor's Note: This article was updated to correct the citizenship of the man who attacked the New Zealand mosques.

READ MORE: Facebook moderation firm Cognizant quits [BBC News]

More on Facebook moderation: Facebook Moderators Are Dying at Their Desks

