Music

AI-generated Drake song up for Grammy nomination

A viral hit built on an AI-generated vocal imitation of the rapper Drake has been submitted to the Recording Academy for Grammy nomination in two categories.

A viral track from the anonymous producer “Ghostwriter” using an artificial intelligence (AI)-generated vocal track of the rapper Drake has been submitted for consideration for a Grammy award, according to a Sept. 5 New York Times report.

The track “Heart on My Sleeve” has been submitted by the Ghostwriter team to the Recording Academy, the organization behind the Grammys, for nomination in the Best Rap Song and Song of the Year categories, a representative told the NYT.

In both of those categories, the award goes to the songwriter, whom the Ghostwriter representative confirmed to be human.

Earlier this year, the Grammys updated their policy for the upcoming award season, saying that music with AI-generated components is eligible for an award. The catch is that the category in which the track is nominated must recognize a human-created portion of the song.

Recording Academy CEO Harvey Mason Jr. confirmed this, saying music with AI elements is “absolutely eligible” for Grammy nomination. He reiterated the point to the NYT regarding the Ghostwriter AI-Drake track, saying:

“As far as the creative side, it’s absolutely eligible because it was written by a human.”

He also pointed out that the Academy looks at whether the song was commercially available: Grammy rules require a track to have “general distribution” to be eligible, which includes availability on streaming platforms.

Related: AI can be a ‘creative amplifier’ — Grammy chief exec Harvey Mason Jr.

However, “Heart on My Sleeve” was removed from all major streaming services, despite industry experts saying that its use of AI fell into a “legal gray area.”

Cointelegraph reached out to the Recording Academy for further comment on the situation. 

Back in April, Universal Music Group, one of the industry’s most prominent record labels, sent out a mass email to major streaming services, including Spotify and Apple Music, requesting they block AI services from harvesting melodies and lyrics from copyrighted songs and remove songs in violation of copyright. 

Shortly after, Spotify was reported to have ramped up policing of copyright-infringing tracks on its platform, as well as blocking artificial streaming of songs meant to inflate listen counts.

Magazine: BitCulture: Fine art on Solana, AI music, podcast + book reviews

DAO Maker hack victims still await reimbursement 3 years later

Justin Bieber hit track becomes NFT for royalty-sharing

Justin Bieber’s 2015 hit song “Company” has been turned into an NFT by anotherblock, allowing fans to earn royalties from future streams of the track.

The global pop icon Justin Bieber will have one of his hit songs turned into a nonfungible token (NFT), giving fans the chance to invest in a stake of its royalties.

On Sept. 7, the blockchain-centric music tech platform anotherblock and the track’s co-producer, Andreas Schuller, also known as Axident, will drop “Company” as an NFT, with only 2,000 available.

While Axident will remain the rights holder, fans who own an NFT of the track will be able to participate in a 1% royalty share of the song’s streams. Axident spoke to the significance of fans in the music industry and how blockchain and music rights open up an entirely new world of interactions.

“The ownership of a song bears far greater weight than its monetary value; it forges a profound connection to the art that, until now, has been beyond the reach of fans.”

Upon the song’s initial release as a single, it made Australia’s top 40 and landed at No. 53 on the U.S. charts. To date, the track has amassed over 500 million streams.

Related: YouTube releases ‘principles’ for working with music industry on AI tech

Cointelegraph spoke with Michel D. Traore, the co-founder and CEO of anotherblock, who said that delivering universally beloved songs is a powerful way of “acquainting” the masses with Web3.

“‘Company’ serves as a prime example of a song with immense potential in this regard. Through the Web3 tools, we can offer music enthusiasts something that, until recently, hasn’t been accessible to the general public.”

Traore highlighted that the royalty share is being divested by Axident. “Producers often do not receive the recognition they truly deserve,” he said, “and one significant aspect for us is to shine a light on the creators behind the songs.”

“This not only creates a new revenue stream for producers but also provides fans with the opportunity to discover and connect with these talented individuals and their incredible work.”

Anotherblock has already worked on Web3 ventures with some of the largest names in the music industry, including pop sensations The Weeknd, Rihanna and Martin Garrix.

Magazine: Tokenizing music royalties as NFTs could help the next Taylor Swift

How can the synergy of AI and blockchain disrupt the music industry?

While AI-generated songs are flooding the music market, blockchain technology can ensure transparency and ethical standards are maintained within the industry.

The synergy of AI and blockchain can revolutionize the music industry by enhancing artists' creative capabilities while preserving transparency and fair revenue distribution among creators. 

AI is increasingly being used to create new songs or imitate existing music, which is making it ever harder to distinguish human-made music from AI-generated content.

Moreover, AI models are often trained on pre-existing content, mostly without the original creators’ knowledge or consent. This creates a number of ethical and legal issues for which, given the technology’s rapid evolution, there are currently no clear-cut solutions.

“Because of how the generative AI works, you can't really tell what has gone into it and how that results in what comes out,” Hanna Kahlert, analyst at MIDia Research, told Cointelegraph.

“The problems that this poses is misinformation and deep-fakes, just a lack of trust in what you can see on the Internet,” she noted.

According to Kahlert, blockchain technology can provide a solution to these issues by tracking the provenance of specific music content, making sure copyrights are respected and artists receive their fair compensation.

Moreover, blockchain technology can make the relationship between artists and fans more direct, bypassing middlemen like labels and centralized streaming services such as Spotify.

To find out more about the possible synergies of AI and blockchain technology in the music industry, don’t miss the latest Cointelegraph Report on our YouTube channel and don’t forget to subscribe.

Magazine: Tokenizing music royalties as NFTs could help the next Taylor Swift

YouTube releases ‘principles’ for working with music industry on AI tech

The streaming platform revealed three key principles it will abide by as it works with major figures in the music industry to better grasp AI technologies.

YouTube released its “principles” for working with players in the music industry on artificial intelligence (AI) technology on Aug. 21.

The CEO of YouTube, Neal Mohan, outlined in a blog post that now is a critical time to “boldly embrace this technology with a continued commitment to responsibility.”

Mohan said he has been in close contact with partners in the music industry, including Universal Music, to create a framework with three major principles surrounding AI to work toward “common goals.”

“These three fundamental AI principles serve to enhance music’s unique creative expression while also protecting music artists and the integrity of their work.”

The principles are: embracing AI responsibly alongside partners in the music industry; including appropriate protections for artists that “unlock opportunities” for music partners; and scaling content policies to meet the “challenges of AI.”

Included in its first principle, YouTube announced it will be introducing a new “Music AI Incubator” to inform its approach to working with AI. The incubator will work with “some of music’s most innovative artists, songwriters, and producers across the industry.”

This includes the composer Max Richter and the singer-songwriter Rosanne Cash, among others.

Mohan closed by saying he is “incredibly excited” about how AI presents opportunities to “supercharge creativity” around the world. 

Related: Meta launches suite of generative AI music tools rivaling Google’s MusicLM

Although the YouTube CEO expressed excitement about and praise for the emerging technology, he also said he believes AI won’t be a replacement for human creativity.

“AI will never replace human creativity because it will always lack the essential spark that drives the most talented artists to do their best work, which is intention."

In an interview with Cointelegraph, Recording Academy CEO Harvey Mason Jr. echoed this sentiment, saying there is “something about the human experience and the emotion and the heart that comes from what we do and what we contribute to the music that I don’t think can be replicated yet.”

In July, the Grammys clarified their rules, saying that music with AI components is eligible for an award depending on how the technology is used and the category in which the track is nominated.

Earlier this month, on Aug. 9, Universal Music Group announced that it is in talks with Google over a deal to collaborate on combating AI deep fakes.

Magazine: BitCulture: Fine art on Solana, AI music, podcast + book reviews

Universal Music and Google in talks over deal to combat AI deep fakes: Report

Universal Music and Google are reportedly in negotiations over a tool that would allow for the creation of AI tracks using artists’ likenesses in a legal way.

Universal Music Group — one of the world’s leading music companies — and Google are in negotiations to license melodies and vocal tracks of artists to be used in songs generated by artificial intelligence (AI), according to a report from the Financial Times. 

The talks have been confirmed by what the FT reports are “four people familiar with the matter.” The companies are reportedly aiming to create a partnership between the music industry and Big Tech in order to manage the rampant emergence of AI-generated deep fakes.

Mainstream AI usage has sparked concern among major music industry leaders due to the amount of “deep fakes” using musicians’ likenesses. Clips of AI-generated Drake and Kanye West began to go viral around April. Many have since been taken down.

Reportedly, the discussions between the two industry giants are still in the early stages, with no imminent product launch or guidelines. However, the FT’s sources say the goal is to develop a tool for creating tracks legally, with copyright properly attributed.

The sources said that artists would have the right to opt in for their voices and music to be used. Another source claimed that Warner Music Group (WMG) has also been in conversation with Google regarding a similar product.

Cointelegraph reached out to WMG for further information but has not received a response.

Related: Music with AI elements can win a Grammy, Recording Academy CEO says

In April, Universal Music Group asked streaming services like Spotify to remove all AI-generated content due to copyright infringement.

A few weeks later, Spotify said it was ramping up policing of the platform and began actively taking down content in violation.

However, some artists are fully on board with their voices being used in AI-generated music. Grimes said she’s eager to be a “guinea pig” for this type of content and will split royalties 50/50 with the creators. 

Alongside a team of developers, she also created Elf Tech, her own voice-simulation program, which is available for public use.

Google and Meta have recently launched their own tools, MusicLM and AudioCraft, respectively, to create music and audio using generative AI.

Many in creative industries are worried about the implications of AI being used to create artistic and creative products. However, in an interview between Cointelegraph and the CEO of the Recording Academy, he said AI can be used as a “creative amplifier.”

Magazine: BitCulture: Fine art on Solana, AI music, podcast + book reviews

Meta launches suite of generative AI music tools rivaling Google’s MusicLM

Meta’s latest suite of generative AI tools allows users to input text to create musical and audio compositions, rivaling a similar tool released this year by Google.

Meta, the parent company of Facebook and Instagram, launched a suite of generative AI models it calls AudioCraft on Aug. 2 for creating music from various inputs, according to a blog post.

Included in the suite of generative AI tools are MusicGen and AudioGen, which operate off of text-based inputs to create new audio, along with another called EnCodec that “allows for higher quality music generation with fewer artifacts.”

In the announcement, Meta mentioned that its MusicGen model was trained with music that it owns or “specifically licensed.”

This comes amid major controversy surrounding the training of AI on copyrighted work across many artistic fields, including a lawsuit against Meta alleging copyright infringement during AI training.

For now, Meta has made MusicGen and AudioGen available in several sizes to the “research community” and developers. It said that as it develops more advanced controls, it envisions the models becoming useful to both amateurs and professionals in the music industry.

“With even more controls, we think MusicGen can turn into a new type of instrument — just like synthesizers when they first appeared.”

In a recent interview between Cointelegraph and the CEO of the Recording Academy, Harvey Mason Jr., he also likened the emergence of AI-generated music to the early days of synthesizers coming onto the music scene.

Related: Spotify reportedly deletes thousands of AI-generated songs

Meta’s release of its generative AI music tools comes shortly after Google launched MusicLM, a similar tool that turns text into music.

In May, the company announced that it was accepting “early testers” of the products via its AI Test Kitchen platform.

Meta has been actively releasing new AI tools, alongside many other tech giants including Google and Microsoft, in a race to develop and deploy the most powerful models.

On Aug. 1, Meta announced the launch of new AI chatbots with personalities, which users on its platforms will be able to use as search helpers and as a “fun product to play with.”

Magazine: BitCulture: Fine art on Solana, AI music, podcast + book reviews

Music with AI elements can win a Grammy, Recording Academy CEO says in report

The CEO of the Recording Academy, known for its yearly Grammy awards, reportedly clarified that music with AI-created elements is eligible for award nominations.

The music industry is coming to terms with the rise of artificial intelligence (AI) seeping into productions, as the Grammys recently clarified that music with AI-generated elements will be eligible for awards in the 2024 award cycle.

On July 4, the CEO and president of the Recording Academy, Harvey Mason Jr., clarified to the AP in an interview:

“AI, or music that contains AI-created elements is absolutely eligible for entry and for consideration for Grammy nomination. Period.”

He stressed that the Academy will not give a Grammy, or a nomination, to the AI portion of a track. For example, if an AI voice model performs lead vocals, the track could be eligible in the songwriting category but not in the performance category, and vice versa, the AP reported.

This is because “what is performing is not a human creation,” Mason explained, according to the AP. He said that as long as a human contributes more, and in a meaningful way, the work will “always” be considered. For now, the Academy says it will not be giving AI itself any awards.

“We don’t want to see technology replace human creativity. We want to make sure technology is enhancing, embellishing, or additive to human creativity.”

These clarifications from the Academy’s CEO come after the organization updated its rules and eligibility criteria on June 28, stating that “a work that contains no human authorship is not eligible in any categories.”

Related: Google updates its privacy policy to allow data scraping for AI training

AI prevalence has skyrocketed since the emergence and widespread use of the chatbot ChatGPT in November 2022.

Some artists, such as Grimes, have openly welcomed the technology; she has even said she’s willing to split royalties 50/50 with any creator who uses her voice track for a successful song.

On the other hand, the rapper Ice Cube, famous for his work with N.W.A. in the late ’80s and early ’90s, called AI “demonic” and said he would sue anyone mimicking his voice on AI tracks, along with the platforms that host the songs.

According to Mason, the Grammys have been heavily considering how to deal with the inclusion or exclusion of AI and even held a summit with industry leaders on the future of AI in music.

He said he’s “imagining” that the technology will be involved in a lot of records and songs this year. To that point, two days after the new Grammy rules were announced, Paul McCartney revealed that “the last Beatles record” was produced with AI extractions of John Lennon’s voice.

Cointelegraph reached out to the Recording Academy for further comment but didn't get an immediate response. 

Magazine: BitCulture: Fine art on Solana, AI music, podcast + book reviews

Robot guest-conducts the Korean national symphony orchestra

The android EveR 6 and conductor Choi Soo-yeoul co-conducted a performance of six pieces with the Korean National Symphony Orchestra.

A robot graced the stage of the National Theatre of Korea to conduct the country’s national symphony orchestra on June 30, marking South Korea’s first robo-conductor public appearance. 

EveR 6, the android that co-conducted the performance, titled ‘Absence,’ was designed by the Korea Institute of Industrial Technology (KITECH). The robot has a humanoid face and a human-like form, with a torso, two arms, a neck and a head.

KITECH trained EveR 6 through the use of “motion capture” technology via sensor attachments that digitally record a conductor’s baton trajectory. The robot is also trained to keep track of the speed of the baton’s movements.

Prior to the performance, the National Theatre of Korea released a teaser video on its YouTube channel, showing glimpses into the rehearsal and training process. 

The robot was joined on stage by conductor Choi Soo-yeoul, who also co-conducted the performance. Choi is reported to have said that one of the most challenging aspects for robots is “real-time interaction and communication,” particularly in the musical context. 

He said EveR 6’s “critical weakness” is that it cannot listen. However, Choi also said that "the robot was able to present such detailed moves much better than I had imagined."

Choi and EveR 6 took turns conducting, with the robot guiding three of five pieces, before the two conducted one piece side by side. After the concert, Choi said:

"It was a recital that showed that (robots and humans) can co-exist and complement each other, rather than one replacing the other."

Cointelegraph reached out to the Korean National Symphony Orchestra for comments from the musicians who were conducted by the android, but didn't immediately get a response.

Related: Spotify reportedly deletes thousands of AI-generated songs

Audience reactions were mixed. One concertgoer, Lee Young-ji, commented on the robot’s ability to keep rhythm but said it lacked “breath” and that:

"It seemed there was some work to be done for the robot to do the job.”

Another audience member, Song In-ho, said the robot performed at a very basic level and would be able to do more if equipped with an artificial intelligence (AI) system that could help it understand and analyze the music.

Nonetheless, the performance was the first of its kind in South Korea. Previously, a Honda-built robot called Asimo guest-conducted a performance with the Detroit Symphony Orchestra in 2008.

In 2017, a robot named YuMi guest-conducted a performance in Switzerland. And in 2020, a Japanese-designed robot named Alter 3 conducted a seven-minute piece of music billed as an ‘android opera,’ titled Scary Beauty.

Magazine: AI Eye: AI travel booking hilariously bad, 3 weird uses for ChatGPT, crypto plugins

Web3 community-building meets music technology at Wavelengths Summit 2023

Crypto natives and newbies alike came together to build community and share alpha on the latest in music technology and decentralization.

Web3 has become one of the hottest buzzwords in the music industry, with everyone from independent musicians to major label artists dropping nonfungible token (NFT) collections and throwing concerts in the metaverse. But for many, the actual use cases and potential of these technologies remain shrouded in mystery and confusion.

On May 6, Water & Music held its inaugural Wavelengths Summit, a one-day event bringing together musicians, industry executives, artist managers, researchers and technologists to explore the bleeding edge of music technology and democratize access to information. On the agenda were talks about blockchain-based communities, the growing influence of artificial intelligence on the music industry and the future of artist revenue streams.

Water & Music is a collaborative music technology research network founded in 2016 by writer Cherie Hu as a free newsletter. It has since evolved to encompass a paid membership structure, an extensive online collaboration network and in-person events. Its research often touches upon Web3 and how blockchain impacts the music industry.

“I think the music industry, in particular, has suffered from information silos,” Hu told Cointelegraph. “If you’re trying to figure out how fans interact with your music in a holistic way, it’s actually a huge challenge.” Enter Water & Music, which seeks to empower its community with the knowledge needed to thrive in the digital era.

Community

A central focus of both Water & Music as an organization and its Wavelengths Summit was building a sense of community. The event’s emphasis on the importance of community-building in music and Web3 was ever-present, from the topics chosen for discussion — including sessions titled “Music Community Building and Decentralization: Lessons from History” and “URL to IRL: Uniting Music Communities Online and Offline” — to the way the event itself was hosted and organized.

For instance, Hu opened the summit by laying out four ground rules for positive community-building: “Be kind and respectful,” “Stay critical,” “No shilling,” and “Have fun!” She also announced that there would be no panels; instead, experts would facilitate conversations, with audience members encouraged to jump in at any point. Talks on the main stage were accompanied by a large screen displaying live comments and questions from audience members via an app called Slido.

“I think what we were really aiming for is recreating the magic of our Water & Music Discord,” Diana Gremore, Water & Music’s events director, told Cointelegraph. “We have such a thoughtful, articulate, critical, passionate, curious community, so we wanted to do our best to facilitate how that URL community translates into an IRL experience.”

Web3 community building for musicians

Throughout the day, many of the conversations touched on how Web3 and blockchain technologies are being explored in the world of music. During the “Music Community Building and Decentralization” session, participants discussed how online communities such as decentralized autonomous organizations (DAOs) are the next step in a long history of decentralization.

As pointed out by Austin Robey, co-founder of Metalabel — which is building a blockchain-based platform for collaborative artist releases — on-chain voting and governance are digital versions of what real-world communities have always done. Social spaces are always governed, and communities are always decision-making. And while DAOs may be subject to “code,” real-world communities have always been subject to social “codes.”

The discussion was moderated by Kaitlyn Davies, membership lead at Friends With Benefits — a social DAO for creatives — and head of curatorial partnerships at Refraction — a DAO for artists and creators with a particular focus on live music events. Davies told Cointelegraph that the preexisting decentralization in music communities helps explain why so many in the music world gravitate toward Web3.

“You see a lot of people who have always been interested in decentralized ways of organizing or sort of left-of-center means of organizing look to this technology to keep doing their work — not even to get bigger or to cast a further net but just to enable what they were already doing,” she said, adding:

“Cultivating a scene or a community, that’s really important, and that’s what drives culture. [...] My hope still is that decentralized tech helps us do that better and helps us do that in more equitable ways.”

During the “Web3: Balancing Niche and Mainstream on the Road to Adoption” session, participants discussed the importance of first understanding one’s community before launching crypto music projects. Melanie McClain, a Web3 consultant and founder of Blurred Lines — a community of Web3 tastemakers supporting left-of-center Black music — said that if fans want free shows, artists can experiment with NFTs that give collectors free access to concerts. And if the artist blows up, that free-performance NFT will suddenly become much more valuable.

Related: Music NFTs are helping independent creators monetize and build a fanbase

Speaking to Cointelegraph, McClain said that crypto-native and crypto-newbie artists alike could use blockchain tech to build stronger communities, but each approach must be tailored. “They have to be self-aware,” she said. If a musician’s community is not native to Web3, “they might not say words like NFTs or social tokens. They can lead the conversation in other ways while still using the tools in the back end.”

Many facilitators and other attendees expressed that Web3 solutions offer particularly unique advantages for musicians, with Gremore telling Cointelegraph that “one of the biggest strengths of [Web3] is the ability to build community and sustain community.”

Perhaps part of the reason for this is that blockchains are generally designed for efficiency. According to Hu, this helps artists and their teams make the most of “smart money” — the situation where a musician doesn’t have much money to spend and therefore must use their funds as efficiently as possible.

“In music and Web3, I’m noticing instead of just random artists dropping NFT projects that happen to gain a lot of money, there’s more focus on ‘what’s the actual use case?’” Hu told Cointelegraph. “What is blockchain actually adding to music in a way that makes things easier and not harder from a technical standpoint?”

URL meets IRL

One thing that stood out at the Wavelengths Summit was how many online friends were meeting IRL — in real life — for the first time. Having many internet friends is not unique to crypto, but it is particularly pronounced in the space, given its inherently decentralized nature. For most people, meeting an online friend in person is special, and the summit was designed to facilitate those connections.

The internet allows for a level of community building previously impossible, especially between musicians and their fans. But as Gremore told Cointelegraph, “There’s a magic in IRL that just can’t be replaced.” She added, “URL is where so many of the conversations start happening, and then IRL — it’s a chance to deepen those bonds.”

Summit attendees connect and network during the “Web3 Happy Hour.” Source: Jonathan DeYoung

For Hu, building in-person relationships is critical for the long-term success of Web3 communities. “IRL events make or break trust in a community,” she said. When internet-based communities meet in person, that community’s carefully curated online image disappears, and people see it for what it really is — whether good or bad.

“Events are so important for online communities because if the name of the game is long-term sustainability, that will make or break trust. If it succeeds, it could be a huge kickstarter to a whole new stage or a whole new level for the community or for the brand. But I’ve definitely seen it go the other way around also.”

For those unable to participate in IRL experiences, online ones still offer opportunities, such as allowing fans to connect virtually with their favorite music artists. “I think using virtual things, not necessarily the metaverse but using live-streaming platforms, things like that — I think you can simulate the same thing,” McClain said. “Everybody can participate no matter where they are.”

“I think online spaces are safe havens for a lot of people, and I think that that should never be discounted,” believes Davies. “But I think the power of meeting somebody in person and being like, oh, you’re like a real human being, and we have similar thoughts about this, and maybe a block on a chain helped us find each other — but really what it’s about is us hanging out in person.”

Magazine: AI Eye: ‘Biggest ever’ leap in AI, cool new tools, AIs are the real DAOs

Ultimately, the main takeaway of the Wavelengths Summit was that community-building is a critical component for success in both music and Web3, and Water & Music intentionally designed its inaugural summit to set an example of how it believes community-building should look.

To close out the day, Gremore shared with the audience that Water & Music wanted attendees to leave empowered — that even though it may seem like the music industry is broken, there is still light at the end of the tunnel. And as the summit revealed, some of that hope may come in the form of DAOs, NFTs or other blockchain-based tools that help artists build community directly with their fans. Or, as Gremore told the audience:

“We’re fucked — but maybe we can do something about it.”

Spotify reportedly deletes thousands of AI-generated songs

The music streaming platform has removed tens of thousands of AI-generated songs created on the AI music-making platform Boomy.

The battle between the music industry and artificial intelligence (AI) continues as reports claim Spotify is taking down AI-generated music.

A Financial Times (FT) report revealed that the music streaming platform had removed 7% of songs created by the AI music startup Boomy, amounting to “tens of thousands” of songs.

Spotify is also said to be ramping up its policing of the platform in light of the situation.

This comes after Spotify and other streaming services began receiving complaints of fraud and clutter on the platform. Music industry giant Universal Music Group (UMG) alerted streaming service providers of “suspicious streaming activity” on Boomy tracks, according to FT sources.

Ultimately, the Boomy songs were removed due to suspected “artificial streaming” by bots posing as listeners. Spotify said:

“Artificial streaming is a longstanding, industry-wide issue that Spotify is working to stamp out across our service.”

Representatives from Boomy said the platform is “categorically against” all manipulation or artificial streaming of any kind. 

Related: Hollywood studios reject banning AI from writer’s rooms

Lucian Grainge, CEO of Universal Music Group, commented to investors:

“The recent explosive development in generative AI will, if left unchecked, both increase the flood of unwanted content on platforms and create rights issues with respect to existing copyright law.”

Last month, UMG emailed streaming services, including Spotify, to block AI services from accessing music catalogs for training purposes. UMG has also sent requests “left and right” to remove AI-generated songs from platforms.

While music industry giants are fighting to control AI, other artists, like Grimes, are championing the technology. The musician permitted creators to use her voice, offering to be a “guinea pig” for AI music creation as long as a small set of rules is followed and royalties are split.

Magazine: How to control the AIs and incentivize the humans with crypto
