Peloton is cracking down.
Content Moderation
Peloton, the digital exercise bike company that went viral for a deeply uncomfortable ad, is now trying to purge its online community of slurs, political conspiracies, and hate speech.
While riding a preset course on a Peloton bike, users see the usernames of others who are currently taking or have previously taken the same ride pop up on the screen, and a troubling number of those usernames included racial slurs or were otherwise hateful, Axios reports. As a result, the company finds itself in the unexpected position of moderating problematic content, much like the public struggles of Facebook, Twitter, and YouTube.
Tighter Rules
Axios reports that Peloton's online forums and communities were plagued by hate speech, political arguments, and baseless misinformation like QAnon conspiracy theories. In response, Peloton has banned political posts from its community Facebook group, as well as hate speech and bullying — which, to be fair, probably should have been banned from the start.
getting radicalized on Peloton pic.twitter.com/kh9UvfQKg7
— Drew Goins (@drewlgoins) October 6, 2020
Peloton will still allow usernames that promote political causes, Axios reports, as long as they don't break any of the other rules.
New Pledge
Peloton's new rules, as described by Axios, sound more comprehensive than one might expect. They go beyond banning outright hate speech to include dog whistles, which are veiled or coded references to hateful and racist content — basically inside jokes for extremists.
"Peloton was built on community, inclusivity, and being the best version of yourself," a Peloton spokesperson told Axios. "We welcome members from all walks of life to have respectful and thought-provoking discussions... However, we have a zero-tolerance policy against hateful content."
READ MORE: Peloton is figuring out how to moderate extremist content [Axios]
More on content moderation: YouTube Content Moderator Sues Over Psychological Trauma