
AI copyright rules: How the push for productivity poses awkward questions

Proposals outlined by the Government could enable AI companies to use creative works to train algorithms, bypassing existing copyright laws.


Author: Markel UK


The Government’s plan to allow artificial intelligence (AI) companies to train algorithms on the works of the creative industries is designed to accelerate the development of AI as the UK turns to technology in its quest for economic growth.

The move is not without controversy. The Government’s consultation, which closed in February, set out four options, ranging from relying on existing copyright laws to effectively granting tech firms a text and data mining (TDM) exception to UK copyright law unless creative professionals or firms opt out. A government response is expected in due course.

TechUK, the UK’s technology sector trade association, is a strong supporter of the move. “The Government’s consultation paper is clear that the current AI copyright status quo benefits neither the UK tech companies nor the creative sector,” says Antony Walker, deputy CEO. “We believe that the option of a broader TDM exception with an opt-out mechanism could provide a constructive way forward, bringing the UK into line with the EU and other major economies around the world.   

“Failure to resolve this issue will simply mean that innovation in generative AI takes place outside the UK with no additional benefit for creative companies. The Government is right to seek a better outcome for the UK and should be commended for bringing forward a consultation on this complex and contested issue.”

Article highlights      

  • Government's AI plan aims for economic growth.
  • Consultation includes controversial TDM exception.
  • TechUK supports broader TDM exception.
  • Creators' Rights Alliance pushes for opt-in system.

Protecting creators

Not surprisingly, those in the creative sector take a different view. The Creators’ Rights Alliance is pushing for an opt-in system, and stricter enforcement of existing copyright laws. “It is important that policymakers and developers ensure that any implementation of AI and the use of machine learning acknowledges the huge contribution our creators make to our creative and financial economy, as well as our cultural wellbeing, and in doing so provide them with robust protections,” it said in a statement.

Those in the technology sector, however, believe this could be hugely beneficial to the UK’s quest to become an AI leader. Roman Eloshvili is founder of ComplyControl, an AI-powered compliance and fraud detection start-up for financial institutions. He believes the UK could become a global AI hub, should the proposals be adopted.

“It’s unarguable that competitiveness in AI development will be increased,” he says. “The creation of an innovation-friendly environment will ensure the inflow of new investments which may potentially turn the sector into a global one.  

“It will also positively impact other spheres where AI is widely used, such as healthcare and finance. That means it will also boost particular sectors’ growth, which will have a positive effect on the country’s overall incomes.”


Competing globally

Matthew Oldham is founder of BipBapBop, a children’s educational site that aims to become the ‘Wiki for kids’. He believes the move would remove “legislative shackles”, enabling firms to compete in a global marketplace against foreign players with more liberal copyright laws.

“While governments can and should continue to enforce copyright on unique work, having an AI model train on a text or an image should be viewed no differently to having a human read or view the art,” he claims. “In both cases, the art could inspire the creation of a different work. This has opened numerous possibilities for creative businesses if they are able to adopt the new technology and use it to produce great customer experiences.”

Fergal Glynn, chief marketing officer and AI security advocate at AI testing firm Mindgard, agrees the move would be transformational. “This is being considered a gamechanger for tech firms, as they will not be required to negotiate multiple licences but will have legal clarity and easy access to training data,” he says. “This will boost AI development and attract more investors.”

Others want the current proposals to go further still. “To my mind the opt-out that they are offering content creators – and that’s much broader than the creative sector we tend to focus on – is too much,” says Amanda Brock, CEO of OpenUK, a UK-based not-for-profit company which supports open-source collaboration and open technologies.

“It’s an unworkable attempt to please everyone. Currently the UK position is different from that in many other countries and certainly in the ones that we are competing with on the AI front. They have fair use or other exceptions for text and data mining, enabling AI training, and we need the same type of simple solution here.

“The right solution will enable the UK’s digital economy and evolution of AI technologies. The wrong solution will slow things down in the UK and not be good for anyone in the long run.”