As you might expect, the protests have been creative, even quirky. More than 1,000 artists, including Annie Lennox and Kate Bush, supported the release this week of a silent album containing nothing more than background studio noise. The 47-minute album called Is This What We Want? contains 12 tracks entitled: The. British. Government. Must. Not. Legalise. Music. Theft. To. Benefit. AI. Companies.
As a musical experience, the album — available on Spotify — is not highly recommended. Personally, I prefer John Cage’s 4’33”, a three-movement composition in which the performers do not play a note, mainly because it’s shorter.
But this mute protest is part of a worldwide revolt by creative artists and content companies against the unauthorised use of their work by big technology firms. In the US, the Authors Guild and 17 individual authors, including Jodi Picoult and Jonathan Franzen, are pursuing a more traditional American form of protest by suing OpenAI and Microsoft for copyright infringement, alleging “systematic theft on a mass scale”. Japan’s Newspaper Publishers and Editors Association has also protested against AI companies “freeriding on the labour of news outlets”.
These disputes are a classic example of what happens when new technologies outpace laws written for an earlier era. When intellectual property laws were enacted, no one could have imagined a day when massive companies would scrape the entire internet as training data for their generative AI models and then spew out convincing simulacra of poems, images, music and videos. But the principle that no one should profit from another’s intellectual property without consent should remain inviolable.
As in many other countries, the British government is currently struggling to realign principle and practice and to update its intellectual property laws for the AI age. As the protests show, this is not easy. The creative industries are of critical importance to the British economy. By the government’s numbers, they contributed £124bn in gross value added to the economy in 2023, about 5 per cent of the total. On the other hand, the government is desperate to position the UK as an AI-friendly powerhouse, behind only the US and China.
The UK government appears fearful of stepping out of line with the Trump administration over tech policy and also wants to distance itself from intrusive EU regulations. Last month, the government published an AI Opportunities Action Plan saying the current uncertainty around intellectual property needed to be urgently resolved. It has been consulting widely but is toying with “fair use” exemptions, which would be welcomed by AI companies.
What is partly overlooked in this debate is how desperate AI companies are to obtain fresh human-generated content to develop their models — and how much they would pay if they were able to do so easily and legally. “We need to find new economic models where creators can have new revenue streams,” Sam Altman, OpenAI’s chief executive, admitted in December.
As it happens, several start-ups are experimenting with such economic models, including ProRata.ai, TollBit and Human Native.ai. ProRata is developing an answer engine that would pay content creators a share of an AI company’s revenue whenever their work appeared in its responses. TollBit enables AI bots and data scrapers to pay websites directly for their content, thereby reducing legal uncertainty. And Human Native is creating a two-sided marketplace that allows AI developers to license data from content creators.
Just as hackers pirated music from the record companies in the early 2000s — before the industry evolved and enabled consumers to pay to stream music online — so the creative industries are experiencing their own “Napster era”, argues James Smith, Human Native’s co-founder. Some of these creative businesses are already striking individual content licensing deals with AI companies: Axel Springer, News Corp and the FT have signed agreements with OpenAI, while Agence France-Presse has partnered with Mistral. Human Native is aiming to automate that process on a mass scale. “We want to be the infrastructure for enabling data commerce on the internet,” Smith tells me.
The biggest of many differences between the Napster era and today, however, is that the pirates are no longer small groups of hackers but giant corporations with lobbying muscle. Revised legislation may be essential to force their hand. But nascent market mechanisms are developing that could enable mutually beneficial solutions. If AI companies do not bite harder on that carrot, they deserve to be hit with a big stick.
john.thornhill@ft.com