FAST Legal Update Member Bulletin – September 2024

King’s Speech 2024 Update

The King’s Speech[1] on 17 July addressed, among other things, developments in AI, cyber security and data protection, and introduced two new Bills: (1) the Cyber Security and Resilience Bill, and (2) the Digital Information and Smart Data Bill. Neither Bill has been published yet.

Although an ‘AI Bill’ was mentioned in the draft King’s Speech, it was not referenced on 17 July. There was, however, a general statement that the Government will seek to establish appropriate legislation to regulate the most powerful AI models. This marks a significant change from the approach adopted by the previous administration, which was principles-based and sought to rely upon regulators to regulate on a sector-by-sector basis.

Cyber Security and Resilience Bill

The Cyber Security and Resilience Bill was announced to strengthen the UK’s defences and ensure that the most essential digital services are secure and protected. The announcement follows an increase in attacks on the UK’s digital economy (see our note on the NHS attack in this update, which continues a trend set by the ransomware attacks on the British Library and Royal Mail).

Key points to note

The Bill seeks to expand the remit of existing UK regulations and the powers given to UK regulators. A key focus is the increased protection of digital services and supply chains, which have been the target of attacks. Regulators will have more resources to investigate potential vulnerabilities, funded in part by cost recovery mechanisms.

The Bill also mandates increased incident reporting, with the aim of providing the Government with better data on cyberattacks. Regulated entities will therefore be required to report a wider range of incidents.

The Government has made these changes with the rate of regulatory change in the EU in mind, in particular the EU’s ongoing move to the Network and Information Security 2 Directive (‘NIS2’). The current UK Network and Information Security (NIS) Regulations 2018 were inherited from the EU and have since evolved separately. As such, the UK looks to be updating its regulatory framework to keep pace with the EU. Despite this aim, the UK Government’s response to a consultation on reforms to the NIS Regulations was published well over a year before the Bill was announced. It remains to be seen whether the new administration will pick up the pace of change.

Key takeaways

Cyberattacks pose an obvious risk to public services and infrastructure, and can take many forms, including attacks by cyber criminals and state actors. The cost of these attacks to the UK economy is significant, and is a key driver for the Government to establish effective measures. The Bill is a welcome step in the UK’s efforts to match the pace of change in the cybersecurity space. Given the continued advancement of technology, the widening threat landscape, and the growing capabilities of China and Russia, the UK will need to continue to navigate the cyber sector effectively in order to provide the best protection for its individuals, businesses and public sector bodies.

However, in creating further standards and raising expectations on businesses through demanding information-sharing requirements, the Bill will require businesses to invest in compliance and absorb the additional administrative burden placed upon them. This is recognised by the anticipated provision of resources covering how to improve cybersecurity practices, but the changes are likely to result in a net rise in the costs arising from cyber incidents, and in the increased uptake of cyber insurance and risk management services.

Digital Information and Smart Data Bill

The Digital Information and Smart Data Bill (DISD Bill) focuses on harnessing the power of data to drive economic growth.

Key points to note

The Bill establishes digital verification services which, by way of illustration, will help with pre-employment checks, purchasing age-restricted goods and services, and the process of moving house. These services are currently used for credit checks in retail banking and are viewed as having potential for growth into other areas. Additionally, a National Underground Asset Register will be created, acting as a digital map of underground pipes and cables. This is another example of ‘smart data’, where a richer pool of data will provide efficiencies for planners and excavators.

On the theme of sharing data, the Bill proposes to make data sharing easier in two ways. Firstly, it provides for the creation of Smart Data schemes, which allow customers’ data to be shared with authorised third-party providers on request. The Government sees Open Banking as a comparable regime, under which customers share their data with authorised third parties (such as financial management apps) that help to manage all of their accounts in one place. Secondly, changes to the Digital Economy Act have been proposed, which will help the Government share data relating to businesses that use public services.

The Bill also allows scientists to seek broad consent for scientific research, enabling them to use more data in their research.

Importantly, the Bill also proposes to modernise and strengthen the role of the Information Commissioner’s Office (ICO) by giving the regulator greater powers. The reforms aim to create a modern regulatory structure, with a CEO, board and chair to be appointed to the ICO. The changes to the ICO will sit alongside reforms to data laws which seek to clarify areas that may impede the safe development of new technologies.

Key takeaways

Sound familiar? Many of these proposals and principles are not entirely new; they are largely carried over from the Data Protection and Digital Information Bill (DPDI Bill), which failed to pass amid the change of Government in July. However, some points proposed in the DPDI Bill have not made it into the DISD Bill; for example, the use of legitimate interests has not been expanded and the definition of personal data is not being changed by this Bill. While some may therefore see the DISD Bill as less ambitious, the narrower scope of changes does allay concerns that reform and a weakening of data protection rights could threaten the renewal of the EU/UK adequacy decision. The adequacy decision allows for simpler transfers of data between the EU and the UK and is up for renewal in June 2025.

The general focus on opportunities and growth around digital information will be welcomed. Alongside the focus on Open Banking as an area of regulatory investment within the DISD Bill, many businesses in financial services will benefit from the changes to digital verification services, which will allow for modernised and streamlined due diligence checks and therefore a better experience for customers.

Make sure to keep an eye on both the Cyber Security and Resilience Bill and the Digital Information and Smart Data Bill once more detail emerges.

Case Law Update

Comptroller-General of Patents, Designs and Trade Marks v Emotional Perception AI Ltd[2]

There has been an interesting development to the High Court ruling described in our January Legal Update. The Court of Appeal has held that an AI invention which uses an artificial neural network to suggest music files to users is unpatentable.

Brief background

In a judgment delivered in November 2023, the High Court ruled for the first time on whether an AI invention which used an artificial neural network (ANN) could qualify for a patent. In summary, Sir Anthony Mann’s ruling that the invention did not trigger the statutory exclusion (and could therefore be patented) has been overturned by the Court of Appeal. By reversing the judgment, the Court of Appeal has upheld the UK Intellectual Property Office’s (UKIPO) original decision to reject Emotional Perception AI Ltd’s application for a patent.

As a brief reminder of the facts we outlined in the January newsletter, Emotional Perception AI Ltd developed an AI application that recommends music to end users through use of a trained ANN. The ANN chooses music for users based on human perceptions and descriptions, and uses machine learning to develop suggestions without human input. Emotional Perception AI Ltd’s initial application to the UKIPO for a patent for the invention was refused on the grounds that the invention was deemed to be a “program for a computer”, which is not patentable under the Patents Act 1977. The High Court disagreed with the UKIPO’s interpretation and ruled the invention to be patentable. The Court of Appeal has now upheld the UKIPO’s original decision for the reasons set out below.

Key considerations

Firstly, the Court considered whether the invention was a “computer” or a “program for a computer” and recognised that if it was the latter, the statutory exclusion would apply and the invention would be unpatentable (unless deemed to have made a technical contribution). Birss LJ commented that a computer is a “machine which processes information”, and a program for a computer is a “set of instructions which cause the machine to process the information in that particular way”. Birss LJ went on to say that an ANN is “clearly a computer” as it is a “machine for processing information” and that the “weights and biases” of an ANN are a “set of instructions for a computer to do something”; he therefore categorised the invention as a program for a computer.
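
By way of illustration only (this example is ours, not drawn from the judgment or from Emotional Perception’s system), the short Python sketch below shows how the weights and biases of a toy neural network fully determine how an input is processed: changing those numbers changes the processing for every input, which is the sense in which the Court treated them as a “set of instructions”.

import math

# Hypothetical learned parameters: in a real ANN these are produced by training.
WEIGHTS = [0.8, -0.5]
BIAS = 0.1

def predict(features):
    """Process an input vector; the output is dictated entirely by WEIGHTS and BIAS."""
    total = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return 1 / (1 + math.exp(-total))  # sigmoid activation

# Example: a score that could be used to rank one music file against another.
print(predict([0.6, 0.3]))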

The Court then considered whether the invention made a technical contribution. If so, the application could not be excluded from being awarded a patent merely for being classed as a program for a computer. Birss LJ held that there was no technical contribution, as the process of training the ANN was “part of the creation of the program” and the “similarity of the files which gives rise to their recommendation is … semantic in nature and not technical.”

Key takeaways

Firstly, there was a considerable amount of discussion in the case around the definitions of “computer” and “program for a computer”. This highlights the fact that there is not a clear understanding of these concepts and how they are to be interpreted under the Patents Act 1977. With the evolution of computers since 1977, the Courts are finding it difficult to apply the wording of the statute to evolving technologies.

Secondly, the Court’s view that training the ANN formed part of the creation of the computer program, and was therefore not a technical contribution, poses questions about the patentability of generative AI. Training forms a key part of the development of AI technologies and of the content they produce. Developers will therefore need to be aware of this judgment when applying for patents for similar machine learning-based inventions, to avoid falling within the “program for a computer” exclusion and losing the invention’s patentability.

Following the judgment, Emotional Perception AI Ltd has suggested that it intends to appeal the Court of Appeal’s decision to the Supreme Court. We will keep a close eye on further developments.

Global Cybersecurity Malfunction and Continued Cyberattacks

Cybersecurity has continued to dominate headlines as a global cybersecurity provider caused a mass IT outage, and a major cyberattack on the NHS left thousands of GP practices and hospitals without access to patient records. 

CrowdStrike Global IT Outage

On 19 July 2024, a coding error in an update to CrowdStrike’s Falcon software caused a global IT outage, leaving many businesses, including major banks, airlines and supermarkets, without access to their computer systems.

CrowdStrike is an American cybersecurity company which helps businesses detect and prevent security breaches. At the time of the system update failure, the company had roughly 30,000 customers.

The IT crash was caused by a faulty update to the Falcon software, which resulted in malfunctions in other software that interacted with Falcon, most notably Microsoft Windows products, causing affected machines to crash.

The incident is a significant reminder for businesses and cybersecurity providers to review their internal measures and consider how best to reduce the impact of such outages, which will inevitably occur in the future.

NHS cyberattack

In June, a ransomware attack carried out by a Russian cyber-criminal group caused major problems for the NHS, as over 3,000 GP surgeries and hospitals were left unable to access patient records.

Synnovis, a public-private joint venture between Guy’s and St Thomas’ NHS Foundation Trust, King’s College Hospital NHS Foundation Trust and diagnostics business SYNLAB, was the victim of the attack, which took place on 3 June 2024 and saw the hackers install malware into Synnovis’ software. The malware locked the computers until a ransom fee was paid. A large volume of personal information was also downloaded by the cyber-criminal group, and around 400GB of personal data was subsequently published online.

The incident was shortly followed by ministers proposing the Cyber Security and Resilience Bill, which seeks to protect and support public bodies like the NHS. As mentioned in the ‘King’s Speech 2024 Update’ above, the Bill will require private companies which supply digital services to public bodies to strengthen their cyber safeguards. With the healthcare sector being reported as a “big target”[3] for cyber-criminals, Sir Keir Starmer is keen to focus on toughening cybersecurity in this sector as a priority.

DSIT call for evidence: cyber security of AI

On 15 May 2024, the Department for Science, Innovation and Technology called for views on its ‘Code of Practice for Software Vendors’ and ‘Code of Practice for AI Cybersecurity’. The call for views has now closed.

The code of practice for software vendors targets all organisations that develop or sell software to other businesses (B2B). It sets out fundamental security and resilience measures that can reasonably be expected of these organisations, focusing on voluntary compliance with various principles.

The call for views on the cybersecurity of AI considers a proposed voluntary code of practice covering the secure design, development, deployment and maintenance of AI systems. It focuses on risks to AI models and technology, not the risks arising from the use of AI. The code is based on the National Cyber Security Centre (NCSC) ‘Guidelines for Secure AI System Development’, setting out the potential for voluntary compliance with various principles and the practical steps that can be taken. The voluntary code is then intended to develop into a global standard for AI models, with DSIT stating that the UK Government will continue to promote a global approach to cybersecurity.

Cybersecurity is outlined as a focus for the UK Government in ensuring that the use of AI is safe, and the Government has therefore stated that it will endeavour to support developers of AI systems in addressing cybersecurity risks to those systems. There is also a focus on deployers of AI systems, given that DSIT found in its initial assessment that nearly half of organisations that use AI do not have specific AI cybersecurity processes or practices in place.[4]

We will keep an eye on any further material published by DSIT, particularly as the department is due to publish its next steps and key themes. A global standard relating to the cybersecurity of AI may then be pursued, which would be a key step towards global consistency in technical standards and requirements.

EU AI Act enters into force

AI Act into force

The EU AI Act was published in the Official Journal of the European Union on 12 July 2024, and entered into force on 1 August 2024. This means that organisations within the scope of the AI Act are now mandated to prepare for compliance, depending on their defined role under the EU AI Act (e.g. providers of general-purpose AI (GPAI) systems).

As you may be aware, the EU AI Act establishes a comprehensive regulatory framework for AI development and deployment across the EU, emphasising risk-based classification and protecting fundamental human rights. It mandates clear requirements for AI systems based on their risk levels and bans AI practices like social scoring.

The EU AI Act has extraterritorial effect, so UK organisations may fall within the scope of the Act in certain circumstances (e.g. where a business places an AI system on the EU market).

Code of Practice for GPAI Models

The European AI Office has invited AI industry organisations, academics, independent experts and others to express their interest in participating in the drawing-up of the Code of Practice, with expressions of interest due by the now-passed deadline of 25 August.

A consultation on a Code of Practice for providers of general-purpose AI models will feed into the Commission’s upcoming draft of the Code of Practice on GPAI models. This may lead to a shift in focus for organisations, with best practice signalling the areas on which they should concentrate in order to demonstrate compliance with their EU AI Act obligations.

Next steps

As outlined in previous issues, organisations should determine whether they fall within the scope of the EU AI Act and, if so, which compliance deadlines apply to them. Completing this assessment will give organisations timelines against which to consider the appropriate steps to take regarding governance, training and risk assessments. In particular, the use of any AI systems likely to be deemed ‘high risk’ should be checked against the EU AI Act to see whether that use is subject to obligations with a shorter compliance period.

We expect to see further activity from the European Commission over the course of the next year, with both codes of practice being released and guidance published on other aspects of EU AI Act compliance. The Commission also has the power to issue delegated acts that impose additional requirements, so despite the Act entering into force, there may still be more change to come in the medium term.

It should be noted that UK organisations may see similar obligations placed upon them by the UK Government, with regulation for the most powerful AI models being recently referenced by the Labour Government in the King’s Speech in July.


[1] The King’s Speech 2024 – GOV.UK (www.gov.uk)

[2] Comptroller General of Patents, Designs and Trade Marks v Emotional Perception AI Ltd [2024] EWCA Civ 825 (19 July 2024) (bailii.org)

[3] NHS hack prompts tougher UK cyber security rules for private providers (ft.com)

[4] Call for views on the Cyber Security of AI – GOV.UK (www.gov.uk)