Author

Stephen Lynch, journalist

The government has launched a consultation on a new copyright framework for anyone who creates content and intellectual property (IP).

Under the proposals, artificial intelligence (AI) developers will be free to use creators’ content on the internet to help develop their AI models, unless rights holders elect to opt out. The main objective is to give creators better control over their content (including the ability to be remunerated if AI developers want to use it to train their models).

The scope of the consultation is focused on the creative sector, but it has important implications for anyone who creates content – for example, copywriting for website blog posts or practitioners’ client newsletters.

Knowledge gap

The implications are wide; here, we focus on the support practitioners can offer their content- and IP-creating clients; how practitioners can mitigate the impact AI will have on their own work; and how they can limit risks for clients who use AI tools more generally.

Many SMEs will be unaware of the proposed changes, and do not necessarily consider that they are creators who can benefit from a level playing field.

‘Under the current proposals, AI firms could use others’ content to train their models unless businesses actively opt out,’ says Nick Robinson, founder of Yorkshire Accountancy. ‘Most small business owners won’t even know this is happening, let alone understand the process to stop it.’

Understand intent

Complying with a new regime from day one is usually difficult. The first few court judgments covering the new legislation tend to be informative; without these, content creators and their advisers won’t have much to go on, beyond the black letter of the law.

But understanding the intention of any new legislation is key to being able to comply with it. If the government moves its position to being more protective of content creators, then, says William Miles, partner at specialist IP law firm Briffa, ‘Advisers will need to understand the length and breadth of those protections while helping their clients understand when and how to enforce their rights.

‘This might mean creators taking a selective approach with infringers and working out which fights are actually worth having.’

What is ownership of content?

UK law currently follows the EU test of ‘author’s own intellectual creation’. Under this test, to constitute an original work, the content must reflect ‘human personality’, result from ‘free and creative choices’, bear the ‘author’s personal touch’, and not be dictated by technical considerations, rules or other constraints. Nevertheless, the bar for originality remains low: it has been held that even an 11-word snippet from a newspaper article could constitute a protectable copyright work.

Source: A&O Shearman

Advisers need to ensure that clients manage their IP in a compliant way. They can also help clients understand and negotiate any licensing agreements that AI developers propose in order to use their content. This support includes accounting for client revenues from the formal licensing of their IP, or from retrospective settlements for AI-generated outputs that incorporate their content, so that clients are paid fairly for their work.

Heal thyself

Accountancy practitioners themselves also need to consider how to mitigate the impacts on their own work.

‘If AI can replicate your writing style or insights, will clients still pay for your expertise in the same way?’ asks Robinson. ‘Small business owners should be thinking about how their unique content remains profitable and protected.’

For anyone producing valuable content, Robinson advises the following: 

  • Find out whether your work is at risk. If you produce written, visual or other creative content, you should understand what’s being used and how. This means tracking whether AI models have already used your work for training or for their own content generation. It also involves checking published AI datasets and using AI-detection tools.
  • Use AI developers’ opt-out mechanisms to keep your valuable content out of training datasets. Alternatively, embedding digital watermarks as unique identifiers lets you detect and trace wider use of your work. Technological safeguards such as robots.txt rules and server-side protections can prevent AI web crawlers from scraping content, while some creators use paywalls to control access.
  • Consider monetisation as a viable revenue stream. If AI firms do use your work, could you negotiate licensing deals? This could involve industry-led agreements (eg where developers pay a subscription to access datasets) or collective licensing models where rights holders pool their content under a central body.
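The robots.txt rules mentioned above are the most common way of signalling an opt-out to web crawlers. As a sketch (the user-agent names below are ones vendors have published, but the list changes over time and compliance is voluntary on the crawler’s part), a site blocking some widely known AI training crawlers while still allowing search engines might look like:

```
# robots.txt — block known AI training crawlers
# (honouring these rules is voluntary; check vendors' current documentation)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Ordinary search engine crawlers may continue indexing
User-agent: *
Allow: /
```

For content that must not be scraped under any circumstances, server-side measures or paywalls remain necessary, since robots.txt is a request rather than an enforcement mechanism.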

Mitigating risk

There will also be risks arising from a new copyright regime as AI becomes more integrated into creative work in publishing, music, film and television. Accountants working with clients who use AI tools have a role in due diligence, evaluating whether those tools rely ethically on approved, licensed datasets.

This due diligence involves checking whether the developer has faced legal disputes over copyright infringement; whether it discloses its training material and the relevant permissions; and whether its terms of use indemnify the business against legal liability for copyright infringement by the AI model.

This also means developing risk-management strategies to protect financial interests as AI and copyright regulations evolve in the UK and overseas.

Advisers can help clients budget accurately for complying with retroactive licensing fees or legal settlements, for example. Other strategies may involve insuring against IP-related legal risks, diversifying AI tool providers to reduce exposure, and building financial reserves for adapting to likely regulatory changes.

Fine balance

Legislators are in the unenviable position of having to navigate international relationships with the US and EU, and keep pace with advances in the technology. The UK government could strike the wrong balance between AI innovation and IP protection, with bad regulation undermining one of the strongest economic sectors at a time when it needs supporting.

‘Too much legislation could protect the current creators but stifle future competition; too little would devalue creative work to the point that it will no longer be made,’ says Miles. ‘Finding the balance is the key to growing the economy and protecting what the UK does best.’

More information

See ACCA’s AI monitor series of reports
