SoundCloud has altered its platform policies to require opt-ins before artists' music can be used to train generative AI models, following widespread user backlash, the company announced today in a letter from its CEO.

On Friday, Futurism broke the story that SoundCloud had quietly updated its Terms of Use (TOU) in February 2024 with language allowing it to train AI on users' uploaded content, which could include their music.

The updated terms, which were flagged by users on Bluesky and X (formerly Twitter), included some exceptions to account for music and other content licensed from third parties. But the AI provision was otherwise extremely broad, and could feasibly grant the music-sharing site the right to funnel much of its vast content library into generative AI models as training material, whether now or in the future.

Though the change was made back in February 2024, site users seemed largely unaware of it until now. Artists responded with anger and frustration, taking to social media to call out the company and, in many cases, saying they'd deleted and scrubbed their accounts.

In response to the mess, SoundCloud issued a lengthy statement clarifying that, despite the provision's sweeping language, it hadn't used artists' music to train AI models, including generative AI tools like large language models (LLMs) and music generation tools.

Now, it looks like SoundCloud is doubling down on those promises — and changing its policies.

In the letter released today, SoundCloud CEO Eliah Seton conceded that SoundCloud's language around AI training was "too broad." To rectify that, said Seton, the company revised its user terms, which now bar SoundCloud from using artists' music to "train generative AI models that aim to replicate or synthesize your voice, music, or likeness" without the explicit consent of artists.

The new clause adds that should SoundCloud seek to use its artists' music to train generative AI, it would have to earn that consent through opt-in mechanisms — as opposed to opt-outs, which are notoriously slippery.

Seton also reiterated SoundCloud's commitment to blocking third parties from scraping SoundCloud for AI training data, and characterized the changes as a "formal commitment that any use of AI on SoundCloud will be based on consent, transparency, and artist control."

According to Seton, the initial AI policy change reflected SoundCloud's internal use of AI for music discovery algorithms, Pro features, fraud detection, customer service, and platform personalization, among other functions. SoundCloud also uses AI to target opted-in users with advertisements based on their perceived mood. It also allows users to upload AI-generated music, and boasts a slew of partnerships with platform-integrated AI music generation tools.

If there's any moral here, it's that language matters, as do the voices of the artists who power creative platforms — especially in an era where data-hungry AI models and the companies that make them are looking to suck up valuable human-made content wherever they can.

Seton, for his part, promised that SoundCloud would "keep showing up with transparency."

"We're going to keep listening. And we're going to make sure you're informed and involved every step of the way," reads the letter. "Thanks for being a part of the SoundCloud community and for holding us accountable to the values we all share."

More on SoundCloud and AI: SoundCloud Quietly Updated Their Terms to Let AI Feast on Artists' Music

