Spotify is reportedly flooded with covers from AI bands



Spotify, the music streaming platform, is reportedly flooded with covers recorded by AI “bands” that have been sneaked onto the platform’s playlists. The AI covers are typically tucked in among popular covers by real artists, which lets them attract millions of listeners and collect royalties.

Unlike legitimate artists, the acts behind these covers have no digital footprint on social media, which has raised red flags about the originality of their work. Their bios also read as if they were generated by AI tools such as OpenAI’s ChatGPT.

Vigilant Spotify listeners raised the alarm

According to a Slate report, a group of Reddit users spotted the suspicious trend and brought it to light. At first, the “bands” in question were covering country classics, but a closer look revealed covers spanning many genres and decades. According to the Redditors, none of these bands has any original material, which deepened the suspicion.

“Apparently this has been going on for several years, with ambient music and with electronic music and jazz,” said Culibuildr, a Redditor who posted the original thread and asked to be identified by their handle.

“I think the new thing here is that with AI being this consumer product, anybody can make a thing with vocals now.”


Culibuildr said that, without being alerted, a listener would find it difficult to tell that a track was an AI cover. A lawyer for 11A, a label that claims to represent some of the “artists” behind this music, said he had documents showing that human musicians were involved in producing the covers.

The lawyer, however, did not respond to further questions from Slate and offered no contact details for the label. The only traces Slate could find were an expired domain and a Facebook page with just 117 followers, last updated in 2021. Such a thin digital footprint, Slate notes, is another red flag for a label supposedly operating an online business.

Exploitative tendencies increasing in the streaming business

A Spotify spokesperson suggested that content providers were responsible both for the uploads and for the covers’ removal from the platform after they were flagged. Content providers can be anyone on the artists’ side, such as the “bands” themselves, their management, or their label. In a statement, the spokesperson said:

“Spotify does not have a policy against artists creating content using autotune or AI tools, as long as the content does not violate our other policies, including our deceptive content policy, which prohibits impersonation.”

Entertainment and music lawyer Cole Henderson said this could be the work of the third-party intermediaries that many artists use to upload and manage their music on streaming platforms.

“I think whoever actually distributes this might be nervous about the reports on it and they might have taken it down,” he said.

“People are finding better ways to exploit the streaming system, because technically, this isn’t streaming fraud. If they’re paying somebody to perform cover songs and then using covers to pull streams, that’s not illegal, it’s just exploitative.”


According to the lawyer, it is mainly the original songwriter who is exploited: they still receive royalties, but far less than they would like, and are therefore unhappy with the trend of AI performing their songs.

According to Henderson, a company like Spotify pays a share of streaming royalties to the record label, which then divides it among the parties involved in making the song.

For a cover song, that split also includes a cut for whoever wrote the original. Streaming services, however, usually pay more to the new performer and their label than to the original artist.
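To make that split concrete, here is a minimal sketch with purely hypothetical percentages (the 80/20 figures are illustrative assumptions, not rates reported by Slate, Spotify, or Henderson):

```python
# Illustrative sketch of how a cover song's streaming payout might be divided.
# All percentages are hypothetical assumptions, not actual Spotify rates.

def split_cover_royalties(total_payout: float) -> dict:
    """Divide a payout between the cover performer's side and the original songwriter."""
    recording_share = 0.80    # cover performer and their label (assumed share)
    composition_share = 0.20  # original songwriter/publisher (assumed share)
    return {
        "cover_performer_and_label": round(total_payout * recording_share, 2),
        "original_songwriter": round(total_payout * composition_share, 2),
    }

if __name__ == "__main__":
    # Example: $1,000 in streaming royalties earned by an AI-generated cover.
    print(split_cover_royalties(1000.0))
    # -> {'cover_performer_and_label': 800.0, 'original_songwriter': 200.0}
```

Whatever the exact numbers, the point Henderson makes is that the recording side, which the cover performer controls, takes the larger share of each payout.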



