It's about music industry control, not a fake band (Score 3, Insightful)
The Guardian article on the Velvet Sundown "hoax" frames AI-assisted music as a threat to authenticity, but the moral panic it describes is a smokescreen. The industry insiders calling for regulation aren't scared that AI made music. They're scared that people listened, liked it, and that it all happened without their involvement and outside their revenue models.
The fundamental issue isn't that AI invented a fictitious group and generated catchy songs for it. It's that the music industry is terrified of any creation it cannot gatekeep.
Let's be clear about the roles here. Spotify, Deezer, and the other streaming platforms are not the music industry; they are just the latest winners in its relentless cycle of disruption. They are algorithmic landlords who rose from the chaos that MP3s and the internet inflicted on the record labels. Now that they are the gatekeepers, their calls for transparency are a predictable attempt to protect their turf. It's not about integrity; it's about protecting their toll booth from people who are simply walking around it, like Monty Python's Black Knight insisting that none shall pass. This panic is about protecting platform metrics, managing licensing liabilities, and ensuring their role as indispensable revenue collectors isn't disrupted by the emergence of AI.
There are real ethical problems here, but the call for labeling isn't addressing them. The single greatest issue is the scraping of human-made music to train generative models without consent or compensation. If your work was ingested by a generative model that now earns profit for a tech giant, you are owed a cut. That requires a robust legal framework for licensing training data, not killing the tool.
Instead of focusing on this legitimate issue, the industry is demanding a purity test through its Spotify and Deezer conduits: labeling AI-generated music as if it were counterfeit. This isn't about disclosure; it's about delegitimizing a new creative tool and the independent artists who might use it.
The hypocrisy is stunning. We don't demand labels for chart-toppers ghostwritten by committee or for tracks algorithmically engineered to bait playlists. The reaction to The Velvet Sundown proves the point. An AI-assisted project racks up a million streams while its origin is unknown, and only when its AI origin is surfaced is it declared a moral crisis. Success isn't a symptom of the problem; success is *the* problem.
Bob Dylan wrote "All Along the Watchtower," but Jimi Hendrix made it eternal. By the logic of the "labeling" camp, Hendrix's version would need a disclaimer: "Warning: Generated using another artist's source material." It's absurd. Artistic transformation has always been about building on what came before. When I first heard The Velvet Sundown, my reaction was, "Nice. Somebody loaded a lot of CSN into the training set." I like CSN. And I liked hearing echoes of "Southern Cross" and "Suite: Judy Blue Eyes" in something new.
AI isn't replacing human creativity. It's extending it. Like the synthesizer, the sampler, Auto-Tune, and DAWs like Ableton Live and Logic Pro, generative AI is simply the next instrument in the toolkit. For anyone who remembers the outrage when Dylan went electric, this panic is depressingly predictable. The delicious irony is watching streaming platforms, which blew up the old industry model, now defend the very system they disrupted.
If you want to support human artists, don't slap a scarlet letter on songs made with new instruments. Instead, fight for policies that give creators legal ownership and control over their work as training data. Demand fairer compensation from the platforms and AI companies that profit from their creativity.
If we get that right, AI isn't a threat. It's just an instrument we are still learning to play.