Some accounts uploaded the videos straight to the platform while X turned a blind eye.
Verified Horrible
Verified accounts on Elon Musk's social media platform X-formerly-Twitter are promoting the sale of child sex abuse videos.
As the BBC found, the identified accounts were using Arabic phrases and hashtags to link to websites selling the appalling content. Other associated accounts went as far as to upload child sex abuse material (CSAM) directly to the platform.
It's a harsh reminder of just how meaningless the company's purported "verification" system has become, and of how CSAM flourishes in the absence of effective content moderation.
After the broadcaster flagged the accounts, X promptly removed them. But before the takedown, some had racked up thousands of views while linking to Telegram channels purportedly containing CSAM.
"These accounts have been suspended in accordance with our policies," a spokesperson told the BBC, as translated from Arabic by Google, but refused to elaborate on how X's verification system works.
In case you needed any reminding of how much Twitter's 'verification' system has changed, see this from Fahima Abdulrahman and myself: https://t.co/qYRySGDWwe pic.twitter.com/cWlAoNhWki
— Joe Tidy BBC News (@joetidy) November 21, 2024
Free for All
The situation is particularly problematic given that verified users with blue checkmarks pay a monthly fee to jump the line and gain far more visibility on the Musk-led platform.
In simple terms, X is effectively making money off the sale of CSAM being openly promoted on its platform.
Some of the identified accounts were active for over a year, highlighting the platform's failure to address the issue in a timely manner.
That's despite Meta's Instagram and Facebook having banned related keywords used by these accounts earlier this year, according to the report.
None of this should be particularly surprising. In the first couple of months after Musk took over the platform in late 2022, the company gutted the 20-person team responsible for preventing child sexual exploitation.
According to February 2023 reporting by the New York Times, the company failed to keep child abuse content off the platform, with CSAM widely circulating.
That's despite Musk's lofty promise three months earlier that "removing child exploitation" would be "priority #1."
In May, the European Commission requested information on X's content moderation practices and the resources it was committing to those efforts. The Commission pointed to an X transparency report, which revealed that the company's content moderation team had been cut by almost 20 percent since October 2023.
Given the latest news, its content moderation in other parts of the world, and the Arabic-speaking world in particular, is woefully inadequate as well.
Worst of all, X's content moderation practices remain a black box. In October, an Australian court upheld an order requiring Musk's company to pay a fine of almost half a million dollars after it failed to cooperate with regulators' request for information about its anti-child-abuse practices.
More on CSAM: Elon Musk's Twitter Is Reportedly Failing Miserably to Contain Horrific Child Abuse Content