Swing and Miss

TikTok Censored the Words “Black Lives Matter” But Not “White Supremacy”

by Dan Robitzski
Jul 13

TikTok's algorithmic moderation has a long history of harming marginalized groups.

TikTok Ban

The short-form video and social media app TikTok has once again been caught banning terms or unfairly flagging content in a way that harms already marginalized groups.

Most recently, users on the platform found out that the word “Black,” when used in phrases such as “Black Lives Matter,” “supporting Black voices,” and “supporting Black people,” was being flagged as inappropriate content, MIT Technology Review reports. Meanwhile, one user demonstrated in a video that pledging support to neo-Nazis and white supremacists seemed to be allowed.

History Repeats

This isn’t TikTok’s first experience with wildly misfired content moderation. As MIT Tech reports, the app seems to go through the same cycle on a surprisingly regular basis: marginalized groups suffer, prominent users draw media attention to the error, and then TikTok claims it was all a technical error before issuing a quick fix.

In the past, TikTok has also blocked phrases including “Asian women” and “intersex,” essentially denying the people in those communities a platform to talk about their issues.

“Hate speech and talking about hate speech can look very similar to an algorithm,” University of Colorado, Boulder tech ethicist Casey Fiesler told MIT Tech.

Basic Training

That’s a hard problem to fix, Fiesler conceded, but it would be a whole lot easier if TikTok, and other platforms that run into similar issues, would seek to learn from the communities that use them. As MIT Tech notes, TikTok’s algorithm wouldn’t flag Black creators as inappropriate if it had been trained better.

“I’m often more sympathetic to the challenges of these things than most people are,” Fiesler said. “But even I’m like, ‘really?’ At some point there’s patterns, and you should know what to look for.”

READ MORE: Welcome to TikTok’s endless cycle of censorship and mistakes [MIT Technology Review]


More on content moderation: Facebook Algorithm Flags Onions as “Overtly Sexualized”

