The UK Government Spent Five Years Thinking About AI and Copyright. Then Decided to Keep Thinking

The UK government published its Report on Copyright and Artificial Intelligence on 18 March 2026 under the Data (Use and Access) Act 2025. The report assessed the use of copyrighted works in AI training but introduced no legislative changes. Commercial AI training in the UK continues to require a licence. Businesses using or developing AI should audit their tools, review contracts for IP indemnity, and prepare for transparency obligations aligned with the EU AI Act.

Written by Rory O'Keeffe, SCL-accredited Leading IT Lawyer and AI Committee member, Society for Computers and Law.

On 18 March 2026, after five years of consultation, debate, two parliamentary committees, over 11,500 responses, and what can only be described as a heroic feat of institutional procrastination, the UK government published its Report on Copyright and Artificial Intelligence.

The result? No new legislation. No preferred policy option. No timetable for reform. Just a 125-page report, a 52-page economic impact assessment, and a firm commitment to gather more evidence before doing anything at all.

If you run a business that uses AI, generates content with AI tools, or holds valuable intellectual property, you might reasonably ask: so what am I supposed to do now?

That is the right question. And unlike the government, I am going to give you a direct answer.

What the Report Actually Says

The government consulted on four options for how copyright law should apply to AI training. Option 0 was to do nothing. Option 1 was to strengthen copyright so that licensing is always required. Option 2 was a broad text and data mining exception that would let AI developers train freely on copyrighted works. Option 3 was the government's original preferred approach: a data mining exception with an opt-out for rights holders.

After receiving 11,520 responses, the government has dropped Option 3, ruled out Option 2, and declined to choose between Option 0 and Option 1. The creative industries overwhelmingly rejected the opt-out model. Only 3% of respondents supported it. AI developers were not enthusiastic either, warning that high opt-out rates would make the UK a less attractive location for AI training.

So the status quo holds. Using copyrighted works for commercial AI training in the UK still requires a licence, unless one of the narrow existing exceptions applies. But here is the problem: the government has also declined to introduce the transparency rules that would make it possible for rights holders to actually enforce that position.

In other words: the law says you need permission. But nobody has given you the tools to find out whether permission was sought.

What the Report Quietly Signals

Read between the lines, and the direction becomes clearer than the government would like to admit. The Report makes several forward-looking commitments that tell you where this is heading.

First, transparency is coming. The government has signalled that it expects UK transparency rules to be at least equivalent to the training data disclosure requirements under Article 53 of the EU AI Act. That is not a small commitment. It means AI developers marketing models in the UK will need to disclose information about training data sources and the jurisdictions in which training occurred. The only question is when, not whether.

Second, the government has expressed support for abolishing copyright protection for purely computer-generated works under Section 9(3) of the Copyright, Designs and Patents Act 1988. This is a significant signal. If you are relying on AI-generated content to hold value in your business, the legal basis for protecting it as intellectual property is about to weaken.

Third, a new digital personality right is on the table. The government has acknowledged a growing gap in UK law when it comes to deepfakes and unauthorised digital replicas. A summer 2026 consultation on a new digital replica or personality right has been announced. For businesses in media, entertainment, or any sector using AI-generated likenesses, this is one to watch closely.

The International Context Makes This Worse

The UK is not deliberating in a vacuum. While Westminster keeps thinking, the rest of the world is (kind of) acting.

The EU AI Act's high-risk AI obligations take full effect on 2 August 2026. Article 53 already requires transparency about training data for general-purpose AI models. If your business sells into EU markets, the compliance clock is already ticking, regardless of what the UK decides to do about copyright.

In the United States, there are now over 70 AI-related copyright cases working through the federal courts. In January 2026, a US court ordered OpenAI to produce all 20 million of its output logs in one case alone. The Thomson Reuters v Ross Intelligence decision has confirmed that using copyrighted material to train AI tools is not automatically fair use. And a landmark $1.5 billion settlement in the Bartz v Anthropic litigation in 2025 sent a powerful message about the financial stakes.

In Germany, the Munich Regional Court held that reproducing copyrighted song lyrics via ChatGPT could amount to infringement, in a case brought by GEMA, the German collecting society. That ruling sets the EU on a potentially diverging path from the UK on these issues.

And in the UK itself, Getty Images has been granted permission to appeal its secondary copyright infringement claim against Stability AI. The appeal will test whether an AI model can constitute an "infringing copy" under UK law. The Court of Appeal has not yet scheduled the hearing, but when it does, the outcome could reshape the landscape far more decisively than anything in the government's report.


What This Means for Your Business

The government's inaction does not mean you should do nothing. Quite the opposite. Regulatory silence is not the same as regulatory safety. When the rules do arrive, and they will, businesses that have been operating without internal frameworks will be the most exposed.

If you are a business that uses AI tools, here is what you should be doing now:

  • Audit your AI tools. Know what you are using, what data those tools were trained on, and whether your provider offers warranties about the legality of their training data. If your contract with an AI vendor does not address IP indemnity, it is incomplete.

  • Review your contracts. Check whether your commercial agreements address AI-generated content, copyright ownership, and liability for infringement. Many contracts written before 2024 do not. That gap is a risk.

  • Write an AI use policy. If your staff are using generative AI in their work, whether for drafting, design, research, or content production, you need a written policy governing how and when AI-generated material can be used in commercial outputs. The EU AI Act will require AI literacy training for all staff in organisations deploying AI systems. Start now.

  • Protect your own IP. If you hold valuable content, images, data, or creative works, consider whether web crawlers are accessing your material. Tools such as robots.txt, the TDMRep Protocol, metadata watermarking, and direct licensing agreements are all available. Do not wait for the government to mandate their use.

  • Prepare for transparency. If you develop or deploy AI models, data provenance is no longer optional. Whether or not the UK introduces mandatory disclosure before the end of 2026, the EU already requires it and international best practice is heading firmly in that direction. Build your records now.

  • Label AI-generated content. The government has indicated support for labelling AI outputs, and a working group has been established. Even in the absence of a statutory requirement, labelling is becoming a baseline expectation for responsible AI deployment. Getting ahead of that curve is cheaper than falling behind it.
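
For the crawler-control tools mentioned above, here is a minimal robots.txt sketch. The user-agent strings (GPTBot, CCBot, Google-Extended) are the names published by OpenAI, Common Crawl, and Google at the time of writing; operators add and rename crawlers, so verify the current list before relying on it. Bear in mind that robots.txt is a voluntary convention (RFC 9309), not a legal mechanism: it signals a reservation but does not enforce one.

```
# Block common AI training crawlers (names as published by their
# operators at the time of writing - verify before relying on these)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Ordinary search indexing continues as normal
User-agent: *
Allow: /
```

For a machine-readable rights reservation with firmer standing under EU text and data mining rules, the TDMRep Protocol allows a site to publish its reservation (for example, via a policy file or HTTP header). Direct licensing remains the only mechanism that actually grants or withholds permission.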


The Bigger Picture

There is a temptation to read this report as a win for the creative industries, because the broad data mining exception they feared has been shelved. And in a narrow sense, that is true. But a win that comes with continued uncertainty and no enforcement tools is not much of a win at all.

The creative industries contribute £146 billion in gross value added to the UK economy. The AI sector contributes approximately £12 billion and is growing rapidly. Both are identified as strategic priorities in the government's Industrial Strategy. Trying to please both without committing to either is a strategy that pleases neither.

One in three UK AI startups is reportedly considering relocating, citing the regulatory environment among their reasons. Rights holders still cannot reliably discover whether their works have been used to train AI models. And the government's preferred mechanism for resolving licensing disputes is a pilot marketplace called the Creative Content Exchange, which is not expected to launch until summer 2026.

The UK has positioned itself as an AI-friendly jurisdiction. That is a credible ambition. But ambition without legal clarity is just optimism. And optimism, as a regulatory strategy, has a limited shelf life.


What Comes Next

The government has said it will not legislate until it is confident reforms will meet its economic objectives. It has committed to monitoring international developments, watching the Getty v Stability AI appeal, tracking the EU's transparency regime, and continuing to engage with stakeholders.

Translation: expect further consultation, probably in the second half of 2026, possibly attached to the anticipated cross-economy AI Bill that several parliamentarians have been calling for.

In the meantime, the law has not changed. Copyright still applies. Licences are still required. Enforcement is still difficult. And the question of whether AI models trained on copyrighted material constitute infringing copies under UK law is heading to the Court of Appeal.

Governments legislate slowly. Courts move at their own pace. Risk does not wait for either.

If you want to understand how this affects your business, or if you need to review your AI contracts, policies, or IP exposure, get in touch. RMOK Legal helps businesses build AI governance frameworks that work now, not whenever Parliament gets round to it.

Frequently Asked Questions

  • Did the report change UK copyright law for AI? No. The report introduces no new legislation and proposes no immediate reforms. The current legal position remains: using copyrighted works for commercial AI training in the UK requires a licence unless a narrow statutory exception applies. The government has committed to further evidence gathering before making any changes.

  • Can AI developers train on copyrighted works in the UK? Only with a licence from the copyright holder, or under one of the limited existing statutory exceptions such as non-commercial research. The government's report confirms that the status quo remains and that no broad data mining exception will be introduced.

  • What should businesses that use AI do now? Audit your AI tools and their training data provenance. Review commercial contracts for IP warranties and indemnities. Write an AI use policy for staff. If you hold valuable IP, use technical tools to monitor web crawler access. Prepare for transparency requirements aligned with the EU AI Act's Article 53 disclosure obligations.

  • How does the UK position compare with the EU? The EU has moved ahead with binding transparency obligations for AI developers under Article 53 of the EU AI Act. The UK has indicated its transparency requirements will be at least equivalent, but has not yet legislated. UK businesses selling into EU markets must comply with the EU framework regardless of UK domestic reform timelines.

  • What is happening in Getty Images v Stability AI? Getty Images has been granted permission to appeal the High Court's dismissal of its secondary copyright infringement claim against Stability AI. The appeal will test whether an AI model can constitute an "infringing copy" under UK copyright law. The outcome could have significant implications for how copyright applies to AI models trained on protected works.

  • Will the UK introduce a digital replica or personality right? The government has acknowledged a gap in UK law regarding unauthorised digital replicas and has committed to launching a consultation in summer 2026 on whether to introduce a new digital replica or personality right. Businesses using AI-generated likenesses should monitor this closely.

  • What happens to copyright in computer-generated works? The government has expressed support for abolishing the existing copyright protection for purely computer-generated works under Section 9(3) of the Copyright, Designs and Patents Act 1988. Works created with AI assistance, where human creativity remains central, would continue to be protected. No timetable for repeal has been set.
