The UK’s AI Copyright Report: More Questions Than Answers

In a previous blog, ‘Can we tell AI music apart – and who owns it?’, I mentioned that the government would publish the consultation results by 18 March. The Report and Impact Assessment on Copyright and Artificial Intelligence is now out, as promised. Beyond a few concrete statements, the government’s approach is ‘wait and see’ rather than decisive; the report repeatedly emphasises the lack of consensus and the uncertainty of the evidence.

What does the report say?

The consultation provided four options for copyright and AI policy:

Option 0: Do nothing (status quo): copyright and related laws remain as they are

  • Received support from only 10% of respondents 1

Option 1: Strengthen copyright, requiring licensing in all cases (no new exception)

  • Received support from 81% of respondents 2

Option 2: A broad data mining exception

  • Received support from less than 3% of respondents 3

Option 3: A data mining exception with opt-out and transparency measures

  • Received support from only 3% of respondents 4

Prior to the consultation, the government preferred Option 3. However, it was rejected by most respondents, who were concerned that it would ‘undermine the value of [creative industry players’] work’, that ‘opt-out would be impractical’, and that ‘this option would be more restrictive than the approach taken in other countries’ 5.

After the consultation, Option 3 is no longer the preferred approach 6. The government decided not to introduce reforms to copyright law until it is confident that they can achieve the intended balance 7.

Considering that the UK’s creative industries contributed around £146 billion to the economy in 2024 8, and the AI sector is expected to contribute between £20 billion and £90 billion to the UK economy by 2030 9, it is indeed difficult to balance protecting the UK’s position as a creative powerhouse with allowing AI to grow the economy.

One of the notable concrete proposals concerns the rule on computer-generated works, which I also discussed in my previous blog. In line with most consultation responses, the government proposed that the protection for works generated solely by AI without a human author – under Section 9(3) of the Copyright, Designs and Patents Act 1988 (CDPA) – should be removed, while copyright should ‘continue to protect works created with AI assistance’ 10.

The report also signalled possible new legislation on digital replicas: the government considered ‘if it would be beneficial to introduce a new digital replica or personality right’ 11.

The government also clearly stated that it will not intervene in the licensing market at this stage 12.

What do we say?

On the evening of 18 March, when the report was published, I happened to attend a seminar organised by PCAM and PRS about music administration. PCAM (The Society for Producers and Composers of Applied Music) is a professional body representing composers and music creators. Chris Green (composer and founder of Blurred Edge), giving the opening introduction to the event, welcomed the government’s decision to drop the opt-out approach.

The ‘wait and see’ approach makes sense given how the market is still developing. I had a chat with Chris Cooke at the Music, Copyright, and Contracts in the Age of AI event at UCL two weeks before the release of the government report. When I asked him about his views on the upcoming report, he said that he doubted there would be anything concrete, especially given recent licensing and settlement deals (for example between Udio and Universal Music Group, and between Suno and Warner Music Group). He suggested that the government would likely wait and see how the market develops. As the government states in the report, it will ‘continue to monitor developments in technology, litigation, international approaches, and the licensing market’ 13.

The report also acknowledged that law is local, while the AI market is international. I think approaches will ultimately more or less converge globally. Because the market is global, if a country adopts particularly strict rules on AI, companies may simply move to more lenient jurisdictions. Over time, this may create pressure for rules to converge.

Practical challenges of transparency

The government acknowledged the importance of transparency to right holders and proposes to develop best practice on input transparency, which may lead to future legislation, as well as on output transparency 14. Over 90% of respondents agreed that AI developers should disclose the sources of their training materials 15. The difficulty is how.

On transparency, I recall a discussion I heard at the event Regulation and AI Autonomy at Queen Mary University of London on 23 March. One member of the audience raised an interesting point: for him, there is no transparency issue in AI, because as a software developer, he believes he understands how the system works and, if there is a mistake, he can identify what happened.

This raised an interesting distinction between internal technical transparency and external transparency. External transparency concerns whether creators, regulators, and users can know what data has been used and how it has been used.

However, from my limited understanding and from what I have heard from other software developer friends, internal transparency may not always be easy to establish. Before modern machine learning models such as LLMs existed, developers wrote code that directly determined how a system responds, so they could often identify which line of code caused a problem. But now, it is not always clear where an error originates.
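The contrast can be sketched in a few lines of Python. This is my own illustration, not anything from the report: the ‘learned’ weights below are hypothetical stand-ins for the millions of parameters a real model fits from data.

```python
# Rule-based system: behaviour maps directly to lines of code.
# If this spam check misfires, you can point at the exact rule responsible.
def is_spam_rules(text: str) -> bool:
    if "free money" in text.lower():  # rule 1: traceable
        return True
    if text.isupper():                # rule 2: traceable
        return True
    return False

# 'Learned' system: behaviour is encoded in numeric weights.
# These values are invented for illustration; a real model would have
# millions of them, fitted from training data rather than written by hand.
WEIGHTS = {"free": 2.1, "money": 1.7, "hello": -0.4}
BIAS = -1.0

def is_spam_learned(text: str) -> bool:
    score = BIAS + sum(WEIGHTS.get(w, 0.0) for w in text.lower().split())
    # Why 2.1 for "free"? No single line of code explains it,
    # which is where internal transparency becomes hard.
    return score > 0

print(is_spam_rules("FREE MONEY NOW"))  # True, via rule 1
print(is_spam_learned("free money"))    # True, via opaque weights
```

In the first function a wrong answer is a bug in an identifiable rule; in the second, it is a property of the weights as a whole, which is the difficulty the developer in the audience was glossing over.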

So even internal transparency is not that straightforward, and the measures and strategies that AI companies are willing to adopt to achieve external transparency add another layer of complexity.

Overall, we are in an interesting moment to witness how law, technology and the market interact, with the path forward still far from settled.

  1. Section B of the Report, Para 123
  2. Section C of the Report, Para 71
  3. Section C of the Report, Para 118
  4. Section C of the Report, Para 16
  5. Section A of the Report, Para 17
  6. Section A of the Report, Para 27
  7. Section A of the Report, Para 18
  8. Para 20 of the Impact Assessment
  9. Para 29 of the Impact Assessment
  10. Section A of the Report, Paras 46 and 47
  11. Section A of the Report, Para 49
  12. Section A of the Report, Para 40
  13. Section A of the Report, Para 27
  14. Section A of the Report, Paras 30 and 33
  15. Section D of the Report, Para 29