
UK Can Lead AI and Copyright Debate, TBI Says


The Tony Blair Institute (TBI) has released a new report urging the UK to take global leadership in shaping the future of AI and copyright. Titled Rebooting Copyright: How the UK Can Be a Global Leader in the Arts and AI, the report argues that the country is well-placed to define the intersection of creativity and artificial intelligence.

According to TBI, we’re entering a new era of media transformation. AI is changing how content—text, images, and sound—is created, distributed, and experienced. Much like the printing press or the camera reshaped earlier generations, today’s AI tools are driving a shift in how we understand originality and authorship.

Rather than replacing human creativity, the report says AI will unlock new ways to be original. It forecasts a future filled with interactive, bespoke works—and also a revival of artistic movements that push back against what AI can’t replicate.

The report makes it clear that AI’s impact isn’t confined to the creative industries. It’s already being used to assist scientific research, speed up medical diagnostics, and support emergency services during natural disasters. These advances, TBI argues, are only the beginning. As computing power grows and AI systems improve, the ripple effects will spread across every industry.

The UK government has already signaled its intention to lead. Prime Minister Keir Starmer’s AI Opportunities Action Plan, announced in January 2025, outlines national ambitions in the space. TBI welcomes this move, saying that AI—if well-designed—can make life safer, healthier, and more prosperous.

Still, rapid growth raises tough questions. A central issue is how UK copyright law applies to AI model training. The current debate is often framed as a zero-sum game: rights holders versus developers. TBI argues that this framing oversimplifies what’s at stake and risks missing the broader opportunity.

The report calls for bold policy reform that gives legal clarity to all stakeholders. Without it, TBI warns, the UK risks losing momentum in AI innovation, job creation, and investment. The paper draws parallels to past technology shifts—like the rise of the internet—which sparked initial resistance but eventually led to new standards, markets, and opportunities for creators.

To respond to AI’s impact on creative industries, the UK government has proposed a text and data mining exception with an opt-out option for rights holders. TBI sees this as a step forward but warns of serious challenges in how it would work. These range from legal ambiguity and technical limitations to broader geopolitical implications.

The Institute outlines a framework for how to implement this opt-out scheme effectively. It explores how to make opt-outs visible and enforceable, and how to deal with the spread of AI-generated summaries that might misrepresent a creator’s original voice or style. Licensing systems, digital watermarking, and defensive AI tools are among the mechanisms discussed.

To further guide policy, the report proposes the creation of a Centre for AI and the Creative Industries. This centre would set standards, support research, and offer recommendations on copyright in an AI-driven world. Funding would come from a targeted levy on internet service providers (ISPs), a proposal that has already sparked debate.

Not everyone agrees with TBI’s direction. Ed Newton-Rex, CEO of Fairly Trained, posted strong criticisms on Bluesky. He disputes the report’s claim that UK copyright law is uncertain. According to him, the law already requires licensing for AI training, and an opt-out model would reduce creators’ control—since many won’t opt out in time.

He also rejects the idea that machine learning is similar to human learning, calling that comparison flawed. Newton-Rex highlights the profits made by companies like OpenAI to counter the claim that developers won’t benefit long-term from training on copyrighted works. He accuses the report of using strawman arguments and ignoring data on how generative AI reduces demand for human creative labour.

Further criticism centers on the proposed academic centre. Newton-Rex argues that it wasn’t requested by creators and would be funded by the public rather than AI companies. The revenue, he says, wouldn’t even go to artists directly.

Author Jonathan Coe added to the backlash, pointing out that the report’s five co-authors come from science and tech backgrounds. None are practicing artists or creatives.

Despite the pushback, TBI insists the report aims to spark constructive debate. It believes that UK copyright law must evolve to reflect the reality of generative AI. And it argues that seizing leadership in this space isn’t just about law—it’s about defining a future where creators and AI can thrive together.

As the government moves forward with its AI policy agenda, the conversation around copyright, training data, and artistic integrity is set to become even more urgent. The UK now faces a choice: adapt and lead, or fall behind.
