Image by Aline Berry/Simon Steinberger/Tag Hartman-Simkins

Making Sausage

The people who moderate Facebook spend their days reading and watching conspiracy theories, violent murders, and hate speech, often reviewing hundreds of disturbing posts per day.

Many of them — who technically work for a professional services company called Cognizant — don't last a full year of consuming the absolute worst of the internet before quitting, The Verge's Casey Newton found in a chilling exposé. In the meantime, some moderators cope by getting high and having sex at work.

The revelations raise the question: who should be tasked with keeping everyday users safe from the internet's most toxic content, and how can those moderators themselves be supported?

Closed Doors

Newton spoke to a number of current and former Facebook content moderators under the condition of anonymity. They painted a picture of a deeply flawed system and a workplace environment that takes a heavy personal toll on employees' mental health.

"Part of the reason I left was how unsafe I felt in my own home and my own skin," one pseudonymous employee told The Verge after describing how he was frequently accosted by other employees and started carrying a gun to protect himself.

Other employees developed various trauma-related disorders as a result of having to watch murders, hate crimes, and other graphic acts that people posted to Facebook or Instagram, according to The Verge.

Some began to believe the very content they were tasked with removing — various employees told The Verge how they and their colleagues fell prey to conspiracy theories. One moderator mentioned colleagues who deny that the Holocaust or the Parkland shooting ever happened, and one former employee who spoke to The Verge came to believe that 9/11 was an inside job.

Coping Mechanisms

To cope with their jobs, many of the employees admitted to doing drugs or having sex at work.

"I can’t even tell you how many people I’ve smoked with," one employee told The Verge. "It's so sad, when I think back about it — it really does hurt my heart. We’d go down and get stoned and go back to work. That’s not professional."

Not surprisingly, the system doesn't seem to work very well. Moderators are judged on the accuracy of their decisions about whether to remove posts, and accuracy is determined entirely by whether an auditor would have made the same decision and cited the same policy.

That means both could be wrong, or one of them could be working from an outdated version of the rulebook — a common problem, since Cognizant employees told The Verge that internal policies could change several times per day, especially in the wake of breaking news. Those updates arrive in an algorithmically sorted feed, so posts and rule changes are often presented out of order, much like a typical Facebook timeline.

"Accuracy is only judged by agreement," one employee said. "If me and the auditor both allow the obvious sale of heroin, Cognizant was 'correct,' because we both agreed. This number is fake."

READ MORE: The Trauma Floor [The Verge]

More on Facebook: Facebook Needs Humans *And* Algorithms To Filter Hate Speech

