"For too long we, the workers powering the AI revolution, were treated as different and less than moderators."
In a historic step for outsourced tech labor, over 150 African content moderators — whose work has underpinned AI systems at Facebook, TikTok, and OpenAI — have voted to unionize, Time reports.
The newly minted African Content Moderators Union is entirely made up of current and former employees of third-party moderation contractors, according to the report. Those contractors include a popular firm called Sama, which has held lucrative contracts with both Facebook and OpenAI.
"For too long we, the workers powering the AI revolution, were treated as different and less than moderators," Richard Mathenge, a former ChatGPT content moderator, told Time. "Our work is just as important and it is also dangerous."
"We took a historic step today," he added. "The way is long but we are determined to fight on so that people are not abused the way we were."
From AI to social media, content moderation is one of the most difficult and burdensome jobs in Silicon Valley. Many of those who work in the field are subjected to graphic and traumatizing content.
But it's also one of the most essential positions in the space right now — without moderation, other users could be unwittingly exposed to problematic content.
Some of these workers are also helping the likes of OpenAI train their AI models, turning them from racist nonsense-spewing bots into helpful assistants.
And yet, reports throughout the years have painted a grim picture of moderators' work conditions and compensation.
Sama, for example, has been accused of paying workers less than $2 a day to moderate OpenAI's training data, while Facebook has been shown to pay its moderators far less than its highly compensated engineers.
All to say: this historic union is an important and necessary step forward for contracted moderators, and will hopefully help to protect them as we move further into an AI-powered future.
"It takes a village to solve a problem, but today the Kenyan moderators formed an army," Martha Dark, co-director of a non-profit NGO called Foxglove, told Time in a statement. "From TikTok to Facebook, these people face the same issues. Toxic content, no mental health care, precarious work — these are systemic failures in content moderation."
More on AI content moderation: OpenAI Apparently Paid People in the Developing World $2/Hour to Look at the Most Disturbing Content Imaginable