- 10 Aug 23
The talks come amidst worries about the impact of AI on the music industry and beyond, as some people raise concerns that artists aren't able to consent to their voices and melodies being used.
Google and Universal Music are in talks to develop a platform that would let users create AI-generated music while ensuring the original artists can consent to, and be paid for, the use of their work.
One of the most popular trends on social media platform TikTok right now involves accounts that replace a song's original vocals with the voice of another popular artist: for example, The Weeknd singing a Taylor Swift song, or Drake singing an Ice Spice song.
In fact, an AI-generated song that appeared to feature vocals from an artificially generated Drake and The Weeknd went viral, gathering over 15 million views on TikTok and 600,000 Spotify streams before it was eventually pulled from streaming services by Universal Music Group. UMG claimed the song was "infringing content with generative AI".
In these cases, the artists were not able to consent to their musical likeness being used, as is true of the vast majority of AI-generated songs. This has caused concern within the industry, with people worried both about consent and about jobs being replaced by artificial intelligence.
In response, UMG and Google are looking to build a platform on which AI-generated music can be made, while making sure the artist gives consent and is suitably compensated. The talks are at an early stage, and no actual platform has yet been announced.
Artists have expressed concerns over the use of AI in the music industry, as well as across other creative fields such as film. One of the main goals of the current WGA writers' strike and the SAG-AFTRA actors' strike is to secure job protections against AI, as major studios and labels could use the technology to cut costs.