- AI-generated songs attributed to deceased artists, like Blaze Foley, were fraudulently uploaded to Spotify
- The streaming service removes them once they are spotted
- The tracks slipped through Spotify's content verification processes via platforms like SoundOn
Last week, a new country song titled “Together” appeared on Spotify on the official artist page of Blaze Foley, a country singer who was shot and killed in 1989. The ballad sounded different from his other work, but everything else was there: the cover art, the credits, and the copyright information, just like any other new single. Except it wasn't an unearthed recording from before his death; it was an AI-generated fake.
After being flagged by fans and Foley's label, Lost Art Records, and reported on by 404 Media, the track was removed. Another fake song attributed to the late country icon Guy Clark, who died in 2016, was also taken down.
The report revealed that the AI-generated tracks carried copyright tags listing a company called Syntax Error as the owner, though little is known about it. Stumbling across AI-made songs on Spotify is not unusual. There are entire playlists of machine-generated lo-fi beats and ambient chillcore that already rack up millions of plays. However, those tracks are generally released under fictional artist names and usually have their origin disclosed.
Attribution is what makes the Foley case unusual. An AI-generated song uploaded to the wrong place and falsely linked to real, deceased human beings goes many steps beyond simply sharing AI-created sounds.
Synthetic music inserted directly into the legacies of long-dead musicians, without the authorization of their families or labels, is an escalation of the long-running debate over AI-generated content. The fact that this happened on a platform as large as Spotify, and wasn't caught by the streamer's own tools, is understandably troubling.
And unlike some cases where AI-generated music is shared as a tribute or an experiment, these were treated as official releases. They appeared in the artists' discographies. This latest controversy adds the disturbing wrinkle of real artists being misrepresented by counterfeits.
Posthumous artists
As for how the songs ended up on Spotify, the company attributed the upload to SoundOn, a music distributor owned by TikTok.
“The content in question violates Spotify's deceptive content policies, which prohibit impersonation intended to mislead, such as replicating another creator's name, image, or description, or posing as a person, brand, or organization in a deceptive manner,” Spotify said in a statement to 404 Media.
“This is not allowed. We take action against licensors and distributors who fail to police for this kind of fraud, and those who commit repeated or egregious violations can be, and have been, permanently removed from Spotify.”
It's good that the track was removed, but the fact that it appeared at all points to a gap in catching these problems earlier. Since Spotify processes tens of thousands of new tracks daily, the need for automation is obvious. But that means a track's origins may go unverified as long as the technical requirements are met.
This matters not only for artistic reasons, but as a question of ethics and economics. When generative AI can be used to make fake songs in the names of dead musicians, and there is no immediate or foolproof mechanism to stop it, you have to wonder how artists can prove who they are and collect the credit and royalties that they, or their estates, have earned.
Apple Music and YouTube have also had trouble filtering deepfake content. And as AI tools like Suno and Udio make it easier than ever to generate songs in seconds, with lyrics and vocals to match, the problem will only grow.
There are verification processes that could be used, along with building labels and watermarks into AI-generated content. However, platforms that prioritize streamlined uploads may not be fans of the additional time and effort involved.
AI can be an excellent tool for producing and enhancing music, but that's using AI as a tool, not as a mask. If an AI generates a track and it's labeled as such, that's fine. But if someone intentionally passes off AI work as part of an artist's legacy, especially an artist who can no longer defend it, that's fraud. This may seem like a minor corner of the AI debates, but people care about music, and what happens in this industry could have repercussions for every other aspect of how AI is used.