
- AI may be coming for royalties, too.
- The Velvet Sundown, a band on Spotify, was recently accused of being generated by AI.
- The situation raises the question of how users can know when a track they’re listening to was created by AI.
- While there are some ways to detect whether a song was generated by AI, they’re not foolproof.
- Artists shouldn’t worry about being replaced by AI; it does, however, create difficulties for up-and-coming musicians and composers.
Stumbling upon content generated by artificial intelligence (AI) online shouldn’t be surprising, and neither should questioning its origin.
Recently, a craze started around a band on Spotify called The Velvet Sundown. But not the kind of craze where listeners rave about the music. Rather, many were concerned that the band that now has more than 750,000 monthly listeners and almost 21,000 followers on Spotify was actually created by AI.
In a world where the share of AI-generated content keeps growing, the lack of transparency frustrates users. While music streaming platforms such as Apple Music and Spotify offer artist verification, they don’t flag AI-generated music. The exception is Deezer, which includes an AI detection filter.
Considering this – and The Velvet Sundown situation, in which the band offered little proof of its existence and claimed it had been impersonated by someone else – is there a way for us to know whether what we listen to was generated by robots?
AI-generated track detection is still in the works
Some of us might think we have a pretty good grasp of authenticating content: hard-to-believe videos circulating online or repetitive wording in an article can hint that it’s a product of AI. But how can AI be detected in music, if that’s possible at all?
“Right now, there’s no foolproof tool that can instantly tell you if a song was made by AI, but there are some patterns (like unnatural phrasing, repetitive melodies, or inconsistent production choices) that can be signs,” explains theRave, an artist from the band Spells and Curses who is also a software engineer at the tech accelerator OS Labs.
The expert shared that while some companies are creating “detection tools using audio fingerprinting or watermarking techniques, they're not widely available or standardized yet.”
One way users can check whether a track was made by AI is with an AI audio splitter, which separates a song into its individual tracks.
“A lot of times, if you hear artifacts or any other kind of noise in a solo track that is inconsistent with what that track is supposed to contain, it is because the AI audio splitter was confused as to what was supposed to be focused in on, and that's an issue that only happens on AI-generated tracks since they are trained/created on MP3 tracks that don't have the best audio resolution,” explained the artist.
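The “audio resolution” point can be illustrated with a simple spectral check: low-bitrate MP3 encoding typically discards content above roughly 16 kHz, so a track with almost no energy above that cutoff may have been sourced from (or trained on) such files. This is a hedged heuristic sketch using NumPy and synthetic signals, not one of the detection tools the artist describes, and it is nowhere near proof on its own.

```python
import numpy as np

def high_freq_energy_ratio(samples, sample_rate, cutoff_hz=16_000):
    """Return the fraction of spectral energy above cutoff_hz.

    A near-zero ratio across a whole track can hint that the audio passed
    through low-bitrate MP3 encoding, which discards high frequencies.
    This is only a heuristic signal, not a verdict.
    """
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    total = spectrum.sum()
    if total == 0:
        return 0.0
    return float(spectrum[freqs >= cutoff_hz].sum() / total)

# Synthetic demo: a signal with content at 18 kHz vs. one with none
# above the cutoff (standing in for a band-limited, MP3-like track).
rate = 44_100
t = np.arange(rate) / rate
full_band = np.sin(2 * np.pi * 5_000 * t) + 0.5 * np.sin(2 * np.pi * 18_000 * t)
band_limited = np.sin(2 * np.pi * 5_000 * t)  # no energy above 16 kHz

print(high_freq_energy_ratio(full_band, rate))     # noticeably above zero
print(high_freq_energy_ratio(band_limited, rate))  # close to zero
```

In practice you would run this on decoded audio from a real file; real lossless recordings can also be band-limited for legitimate reasons, which is why no single signal like this is conclusive.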
Considering that there’s no one certain way to figure out whether a track was created by AI, humans often follow their gut, says film score composer Nicole Russin-McFarland.
She also draws attention to how we react to the use of AI, which can now be detected in books, films, music, and more. “The ‘he’s using AI!’ witch hunts done on every art form these days is beyond ridiculous because people have no idea how much good AI costs,” she said, referring to computer-generated imagery (CGI) that film studios use to recreate images and animation.
“AI is not all good, or all evil. We’re really just in a weird time in history right now,” concludes the expert.
Is it the end for human artists?
At the end of the day, as listeners, we might not care that much about who created a track, as long as it sounds good. Nevertheless, the use of AI in the music industry may impact musicians and composers. The artist from Spells and Curses shares that “it does very much so make it harder for emerging human artists to be discovered or fairly compensated.”
He explained that AI decreases the visibility of real artists, making them harder to discover, and it also hurts their finances. Spotify, for example, uses a prorated payment model, which pays artists based on “how much of the [...] ecosystem they can lay claim to.”
Adhering to such a payment system and fighting AI for your spot aren’t ideal working conditions for artists.
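The prorated model described above can be sketched in a few lines: a fixed royalty pool is split in proportion to each artist’s share of total streams, so every AI-generated stream dilutes the human artists’ cut. The stream counts and pool size below are invented purely for illustration.

```python
def prorated_payouts(stream_counts, royalty_pool):
    """Split a fixed royalty pool in proportion to each artist's streams.

    stream_counts: dict mapping artist name -> number of streams
    royalty_pool: total money distributed for the period
    """
    total_streams = sum(stream_counts.values())
    return {
        artist: royalty_pool * n / total_streams
        for artist, n in stream_counts.items()
    }

# Hypothetical numbers: the same artist earns less once AI uploads
# claim a slice of the same fixed pool.
before = prorated_payouts({"human_artist": 1_000, "others": 9_000}, 100_000)
after = prorated_payouts(
    {"human_artist": 1_000, "others": 9_000, "ai_uploads": 2_500}, 100_000
)
print(before["human_artist"])  # 10000.0
print(after["human_artist"])   # 8000.0
```

The artist’s own streams didn’t change between the two scenarios; only the size of the pie slice did, which is the mechanism behind the “fairly compensated” concern.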
Nevertheless, Russin-McFarland offers another perspective: that musicians won’t be replaced by AI – a fear shared by people from many other industries.
“AI might fill the voids of easy background music for YouTube videos. It is never going to stop people from seeing a glamorous woman sing live like Taylor Swift, or build a fanbase around you like Frank Ocean or Lorde,” the composer said.
So, while people may still enjoy well-composed pieces and live performances, AI does put a spoke in artists' wheels, especially for those who are just starting their careers.
Who’s behind AI-generated music?
The situation with The Velvet Sundown also raises the question of who is behind these artificial music pieces and what they gain from this.
The experts say anyone could be behind these tracks, with different goals in mind.
“The people uploading AI-generated tracks could be developers experimenting with models, marketers trying to ride search trends, or opportunists gaming music streaming royalties,” shared a member of Spells and Curses.
As for what these people get from composing such music, Russin-McFarland stressed that musicians and composers shouldn’t be afraid of AI coming for them, as revenue from Spotify isn’t their only source of income.
Artists' songs performing well on TikTok, other social media platforms, or TV are also a way to gain recognition.
“So much of the AI scare is overshadowing important things like the value of being your own business as a composer with royalty collections, being your own music publisher when you can, getting a formal music deal, and a lot that should be more relevant,” Russin-McFarland said.