Weeks after the trending "indie rock band" The Velvet Sundown admitted its Spotify oeuvre was entirely AI-generated, news has emerged that new AI-generated tunes are being published on the streaming platform under the names of dead musicians.

Per reporting by 404 Media, Spotify is populating the profiles of long-dead artists with new AI-generated songs that have nothing to do with the deceased musicians — without the permission of their families or record labels.

The outlet uncovered, for instance, that American singer-songwriter Blaze Foley — who was tragically shot to death in 1989 — had "released" a new song called "Together" on his Spotify page.

The new single featured the sounds of a male country singer, electric guitar, and piano, in what 404 called a "new, slow country song." The art uploaded alongside the song was an AI-generated image of a young singer with spiky long hair and a black leather jacket — nothing like the actual outlaw country singer.

"I can clearly tell you that this song is not Blaze, not anywhere near Blaze’s style, at all," said Craig McDonald, owner of the record label that manages Foley's catalogue. McDonald's wife originally alerted him to the new release, which has since been removed.

Like AI-generated images, AI-generated music is often easy to spot, thanks to its flat dynamics, overly conventional song structure, and canned vocal quality.

"It’s kind of an AI schlock bot, if you will," McDonald continued. "It has nothing to do with the Blaze you know, that whole posting has the authenticity of an algorithm."

McDonald told 404 he was shocked that AI slop could pop up on Blaze Foley's Spotify page without his permission. Apparently, the song bore the copyright mark "Syntax Error," which appears on similar AI tracks infesting the pages of other artists like Guy Clark, who died in 2016.

"It's kind of surprising that Spotify doesn't have a security fix for this type of action, and I think the responsibility is all on Spotify," McDonald said. "They could fix this problem. One of their talented software engineers could stop this fraudulent practice in its tracks, if they had the will to do so. And I think they should take that responsibility and do something quickly."

For its part, Spotify issued a statement saying that the "content in question violates Spotify’s deceptive content policies, which prohibit impersonation intended to mislead, such as replicating another creator’s name, image, or description, or posing as a person, brand, or organization in a deceptive manner. This is not allowed. We take action against licensors and distributors who fail to police for this kind of fraud and those who commit repeated or egregious violations can and have been permanently removed from Spotify."

The news comes as living musicians are being boxed out of streaming platforms like YouTube and Spotify by AI slop farms, leading many indie and developing artists to leave the craft altogether.

Months ago, the French music streaming platform Deezer launched an AI music detection tool to help combat the flood of AI spam. After six months of detection, the company revealed a startling statistic: roughly one-fifth of all music uploaded to the platform each day, or over 20,000 songs, was AI-generated.

