Global Governance of AI Songwriting
Patrycja Rozbicka, Simon Barber, Nicholas Gebhardt and Craig Hamilton explore the implications of, and possible responses to, the rise of AI in the music industry. This is the sixth post in an EGG commentary series exploring how AI’s development is affecting economic, social and political decision-making around the world.
Over the last few decades, artificial intelligence has begun to profoundly reshape our musical experiences (Miranda, 2021; Behr, Negus and Street, 2018). Technologies designed to assist musicians in creating songs have proliferated, while streaming services have transformed our listening habits through their use of algorithms (Hamilton, 2019). While AI has been used within music-making since the 1960s, we still lack an adequate understanding of its social and cultural impact. This is particularly the case when it comes to the governance structures involved in AI song production, from frameworks for regulating data to copyright issues. These are crucial questions given the size and influence of the global music industry, which generated US$26.2 billion in 2022. Below, we examine a number of illustrative examples and identify the need for more robust governance mechanisms to address the challenges accompanying AI-assisted songwriting.
How AI is transforming songwriting
One change accompanying the rise of AI in the music industry relates to the intent of AI-assisted, as opposed to human-only, songwriting. Consider the following three examples. Incubation activity within Abbey Road Red, the innovation department of Abbey Road Studios, has led to a high level of investment in the growth of AI music companies like Humtap, Lickd, Vochlea and Lifescore. Projects like DeepMind’s WaveNet (2016), MuseGAN (2017) and MuseNet (2019) have generated music in various genres and styles using subsets of AI activity such as natural language processing, neural networks and machine learning. OpenAI’s Jukebox, a neural net that generates vocal performances, has produced an alternative genre devoted to uncanny ‘deep fakes’ of famous performers like Frank Sinatra (Robertson, 2020). Despite a strong sense of collaboration between artists and AI, the ultimate aim of these technologies is to create ‘a hyper-realistic and expressive voice that is not distinguishable from real humans’ (Stassen, 2021). This raises questions about the integrity of existing bodies of musical work, which AI uses as data to improve its own songwriting abilities and generate new content.
These questions are becoming more pressing as major companies such as IBM, Sony, Google and Spotify have established labs to experiment with AI music creation. François Pachet, who leads the AI research arm at Spotify, oversaw the development of Sony’s ‘Flow Machines’, using them to generate material in the style of The Beatles. He invited songwriter Benoît Carré (aka SKYGGE) to complete the songs, culminating in the 2018 release of Hello World, which was ‘the first multi-artist commercial album created using artificial intelligence’ (Nazim, 2018).
Societal and governance implications
While changes to songwriting processes and creativity or the technological investment decisions of media companies may seem far removed from social and governance processes, these transformations are in fact raising several important issues related to societal and governance challenges. For instance, the examples above open a Pandora’s box when thinking about copyright law. Copyright law is one of the cornerstones of business activity in the music industries (Drott, 2020) and rests on the right of individual(s) to be recognised as creators. It is not currently clear how the music generated with AI systems will impact the legal framework within which creators currently operate, and especially the organisations which control and administer copyright. Who will own the work in question? How will creators establish copyright? And what happens when the relationships between different parties break down?
Further, as major companies get more involved in AI-facilitated songwriting, critics worry that these decisions may reinforce existing social biases while doing little to enhance accountability for socio-cultural outcomes. Debates around Google’s treatment of AI researchers (Simonite, 2021), as well as reports of inherent biases in its Vision AI architecture, for example, have highlighted many of the tensions within the AI community, prompting the question ‘Who Is Making Sure the A.I. Machines Aren’t Racist?’ (Metz, 2021). Representative bodies for marginalised groups, such as Queer in AI, Black in AI and Widening NLP, have taken steps to publicly denounce funding opportunities from Google. In 2022, Capitol Music Group severed ties with the popular AI rapper FN Meka after complaints from the Black community that the character represented ‘gross stereotypes’ and used offensive language on recordings (Cain, 2022).
These implications are all the more concerning for their potential to exaggerate, or at a minimum perpetuate, pre-existing social biases in AI research and technologies. A 2019 Nesta report on Gender Diversity in AI Research noted that only 14 per cent of AI paper authors are women, and in the UK just 27 per cent of AI papers have at least one female co-author. A lack of racial diversity is similarly apparent. This has implications for how we understand the usefulness of AI technologies as well as how they are applied. A key challenge for AI developers and governance, therefore, is understanding how to address historical biases and inequities which human-technological collaboration – intentionally or unintentionally – preserves.
In this context, examining how governance can contribute to better outcomes for society and the music industry seems sensible. Given the global character of the creative industries and the high levels of investment from audio streaming services and companies (e.g. Spotify, Google), we recommend a two-tier approach. First, we need a global governance solution that provides unified standards for AI songwriting (including, for example, copyright regulation). Second, we should look into self-governance by companies and large conglomerates, based on voluntary agreements and the creation of ‘watchdogs’ and initiatives to monitor corporate behaviour in the field.
Within academia, these issues are being explored through initiatives like Just AI, a network of researchers at the Ada Lovelace Institute, who are developing ethical policies and best practices in AI, particularly regarding concerns about privacy, algorithmic bias, fairness, trust and transparency. In the music industries, despite the backdrop of the recent DCMS enquiry into the economics of music streaming, there are continuing challenges to the fair remuneration of music creators and rights holders. Multinational music corporations like Universal Music Group devote much energy to blocking ‘infringing’ content, including working with Spotify and Apple to police the emergence of AI-generated songs cloning the sounds of popular artists (Nicolaou, 2023). However, the recent controversy caused by the release, and subsequent takedown, of the song ‘Heart on My Sleeve’ by Ghostwriter, which offered an AI-generated illusion of a new collaboration between Drake and The Weeknd, demonstrated that important legal and creative questions may remain unanswered for some time to come.
Dr. Patrycja Rozbicka is a Senior Lecturer in Politics and International Relations at Aston University, Birmingham, UK. Her research focuses on regulation of the live music industry, interest groups and lobbying. She is coordinator of the Live Music Project (LMP) and head of the Birmingham Live Music Project (BLMP).
Dr. Simon Barber is a Senior Research Fellow at Birmingham City University. He leads the Songwriting Studies Research Network and is the co-producer and co-presenter of the popular Sodajerker songwriting podcast.
Nicholas Gebhardt is Professor of Jazz and Popular Music Studies at Birmingham City University and the author of Vaudeville Melodies: Popular Musicians and Mass Entertainment in American Culture, 1870-1929 (University of Chicago Press, 2017).
Dr. Craig Hamilton was Research Fellow in the Birmingham Centre for Media and Cultural Research (BCMCR) at Birmingham City University, UK. His research explores the role of digital, data and internet technologies in the business and cultural environments of popular music.
Photo by TStudio