According to internal reports obtained by The Verge, Facebook engineers warned of a "massive ranking failure" that allowed misinformation to flood users' feeds.
Rather than suppressing fake news, the glitch reportedly caused misinformation to be boosted, spiking views by as much as 30 percent worldwide. Even Russian state propaganda made it through, despite being flagged by fact checkers working with the company.
The incident underlines how tightly Facebook controls what content gets visibility and what doesn't — and how little understanding the public, and seemingly even company insiders, have of the processes that make it all work in the first place.
The issue dated back to 2019 but only became noticeable in October, meaning potentially harmful content had been ranked higher in users' feeds for at least six months.
"We traced the root cause to a software bug and applied needed fixes," Joe Osborne, a spokesperson for Facebook's parent company Meta, told The Verge, but argued the bug "has not had any meaningful, long-term impact on our metrics."
It's an egregious lapse that should give anybody pause. Despite repeated calls for Facebook to shine a light on how it moderates content, the company has held its cards incredibly close to its chest — sometimes to the detriment of its billions of users.