AI-Generated Music Is Flooding Spotify
And Real Artists Are Paying the Price
A New Era of Music Fraud
The age of AI-generated music has arrived. Unfortunately, it’s also being used to scam music fans and siphon earnings away from real artists. A report from Slate exposes a disturbing trend: scammers are flooding Spotify with fake bands that use AI-generated covers to rack up streams and revenue.
The Discovery by Country Music Fans
A group of observant country music fans on Reddit uncovered what they call a “stream-stealing scam.” Fake artists with names like Highway Outlaws and Waterfront Wranglers had racked up hundreds of thousands of streams despite having no original songs and no social media presence, and their bios read like they were written by ChatGPT.
One Reddit moderator noticed a suspicious band and, on checking its “similar to” section, discovered dozens of AI-generated artists sharing the same characteristics. All of them appeared on mainstream playlists like “summer country vibes,” amassing huge listen counts through seemingly fraudulent means.
Digging Into the Source
Slate reached out to 11A, the so-called label representing these fake artists. The company claimed that real artists were behind the music, but its credibility was questionable: its Facebook page had only 117 followers and hadn’t been updated in three years, and its website’s domain had expired.
When asked for concrete proof, the spokesperson went silent.
Spotify’s Passive Stance
Surprisingly, Spotify took little action. According to a spokesperson, “Spotify does not have a policy against artists using autotune or AI tools as long as they don’t violate our deceptive content policy.” In other words, AI-generated music isn’t banned unless it impersonates someone else.
Spotify also said that it hadn’t removed the AI content itself; the content providers did. This response raises concerns about Spotify’s ability, or willingness, to control what gets uploaded to its platform.
Not Just Country Music
This issue isn’t limited to country. Redditor calibuildr pointed out that AI scams have plagued jazz, electronic, and ambient playlists for years, and the blog MetalSucks uncovered fake AI-generated metalcore tracks mimicking real bands.
The Growing Threat to Authenticity
AI tools have lowered the barrier to music production. Used unethically, however, they create a minefield for listeners and artists alike: real musicians face unfair competition, their work buried under a flood of cheap, computer-generated imitations.
Moreover, AI-generated bands bypass the hard work of building an audience, engaging with fans, and creating original content. Instead, they exploit Spotify’s algorithms and playlist culture to earn revenue dishonestly.
Who’s Responsible?
Spotify seems reluctant to take a firm stance, leaving the burden on legitimate labels and artists to monitor AI misuse. Unfortunately, by the time the fraud is discovered, the damage is often done.
Platforms like Spotify must implement stricter verification for artist uploads and invest in AI-detection tools. Otherwise, the trend will only grow, hurting creativity and trust in digital music.
As AI-generated music infiltrates major platforms like Spotify, the responsibility to protect authenticity becomes more urgent. Without better oversight, real artists will continue to suffer, and listeners will find it harder to separate genuine art from algorithmic fraud.