
Spotify Publishes AI-Generated Songs on Deceased Artists’ Pages

Albert

Published 22 July 2025


Spotify, the world’s largest music streaming platform, is under fire once again, but this time for something deeply unsettling. In recent weeks, it has been revealed that AI-generated songs were uploaded under the official profiles of deceased musicians without the consent or knowledge of their estates or rights holders. The incident has triggered a global outcry from artists, fans, and industry watchdogs, raising urgent questions about artistic legacy, consent, and the unchecked rise of AI in the creative arts.

AI Songs from the Afterlife?

Investigations by 404 Media, AXS TV, and others revealed that multiple AI-generated tracks, complete with artificial vocals, lyrics, and artwork, were mysteriously published on the verified Spotify pages of long-deceased musicians. Among the most controversial:

"Together" was published under the profile of Blaze Foley, a revered country singer who was murdered in 1989.

"Happened To You" appeared on the official page of Guy Clark, who passed away in 2016.

These tracks were neither recorded nor authorized by the artists’ estates. According to Craig McDonald, who manages Foley’s catalog, the track was “clearly the work of an AI schlock bot” and grossly misrepresented Foley’s distinct musical voice. Both tracks were taken down only after intense public backlash and media exposure.

“[The song] sounded like a bad parody of Blaze’s work... We never approved this. It’s deeply disrespectful.”
- Craig McDonald, Label Manager


How Did This Happen?

The incident traces back to unauthorized third-party distributors exploiting Spotify’s relatively open upload system. A shadowy entity called Syntax Error has been linked to several of these AI uploads.

These distributors appear to have bypassed verification protocols and published AI content directly under official artist profiles, blending fiction with fact in a way that misled listeners and distorted legacies.

The Outrage: Artists, Estates, and Fans Respond

The reaction from the music community has been swift and fiery:

  1. Estates & Labels: Furious over the unauthorized use of their artists' names and legacies. Many are calling for stricter upload verification and legal action.
  2. Industry Groups: Bodies like the British Phonographic Industry (BPI) have warned of a rising tide of “AI models trained on copyrighted material without compensation or permission.”
  3. Fans: Outrage spread across Reddit, Twitter, and music forums, with users calling this a “new low” for digital exploitation.

Many described it as digital necromancy.

“Spotify just let an AI write a fake Blaze Foley song... What’s next, AI Prince albums?”
- Reddit user folkrootsfan87

Spotify’s Response: A Patch, Not a Solution

Spotify did remove the AI-generated tracks after reports began to circulate, citing a violation of its Deceptive Content policy, which prohibits impersonation and misleading artist representation. However, critics say the response was reactive rather than proactive: Spotify’s existing systems failed to catch the uploads before they went live. Furthermore, the company has not yet implemented AI content labeling, unlike rival platform Deezer, which now flags synthetic songs.

This isn’t Spotify’s first AI-related scandal. In 2023, it purged tens of thousands of AI-generated songs, many created with Boomy and similar platforms, amid allegations of fraudulent streaming manipulation. But the Blaze Foley and Guy Clark incidents mark a disturbing escalation: AI music is now being published under the names of real artists who are no longer alive to object.

The Bigger Picture: Who Owns the Voice of the Dead?

The controversy raises larger, more complex ethical and legal questions:

  1. Authenticity & Consent: Can an AI ever ethically represent a deceased artist?
  2. Ownership: Who controls a musician’s posthumous identity? Their estate? Their label? Spotify?
  3. Transparency: Should AI-generated music be clearly labeled to protect consumer trust?
  4. Cultural Memory: Are we okay with machines rewriting musical legacies?

As generative AI grows in power and accessibility, these questions demand urgent answers.

What the Industry Is Calling For

In the wake of the controversy, calls are mounting for:

  1. Stronger Upload Verification: To prevent AI or malicious actors from publishing under verified artists' names without review.
  2. Mandatory AI Labeling: Platforms should clearly mark AI-generated tracks, especially those using deepfake vocals or legacy names.
  3. Estate Control & Legal Oversight: Deceased artists' estates must have explicit control over how (or if) their voice, name, and likeness are used.
  4. Regulatory Intervention: Many in the industry believe governments must intervene to regulate the use of AI in media, especially for deepfakes and voice cloning.

Final Thoughts: The Cost of Innovation

What started as a tool for creativity is now being used as a vehicle for identity theft at scale. Generative AI is not inherently evil, but when misused, it can desecrate legacies, deceive audiences, and devalue real art.

Spotify, as a gatekeeper of global music, must evolve faster than the tools that threaten its foundation. Otherwise, we risk waking up in a world where your favorite artist never actually sang their last song, and where machines speak louder than the dead ever could.
