Case Law Update
Parker-Grennan v Camelot UK Lotteries Limited[1]
The Court of Appeal has handed down its judgment, upholding the High Court’s decision at first instance and bringing this National Lottery prize claim to a close.
Brief background
Ms Parker-Grennan purchased a £5 ticket to play an interactive instant win game on the National Lottery website, which gives players the chance to win a cash prize ranging from £5 to £1 million. When creating her online account, Ms Parker-Grennan accepted Camelot’s terms and conditions, which were available through various hyperlinks and drop-down menus, via the “click-wrap” procedure; where any changes were made to the terms, she would have to accept these on a subsequent log-in to her account. After she purchased her ticket and initiated an online game, the animations suggested that she had won the £1 million jackpot. Those animations were later revealed to have been the result of a rather unfortunate Java coding error, which brought little comfort to Ms Parker-Grennan when Camelot refused to pay. She brought a claim against Camelot, arguing that the terms on which it sought to rely had not been incorporated into the contract and were unfair under the Unfair Terms in Consumer Contracts Regulations 1999 (‘the UTCCR’), and applied for summary judgment. The Judge dismissed the application and Parker-Grennan appealed to the Court of Appeal.
The appeal raised three issues:
- Were Camelot’s terms properly incorporated into the contract?
- Were the terms unenforceable by virtue of the UTCCR?
- As a matter of construction of the contract, did the appellant win £10 or £1 million?
The judgment in favour of Camelot was unanimous. The terms had been effectively incorporated as Camelot had taken reasonable steps to sufficiently bring the terms to Parker-Grennan’s attention, with the court commenting that a trader “cannot force someone to read the terms and conditions if they cannot be troubled to do so”. On the enforceability issue, the court found that none of the terms were onerous or unusual and consequently there was no need to specifically signpost them in order to incorporate them. Finally, turning to the issue of construction, the court concluded that on a proper interpretation, Parker-Grennan had won £10, not £1 million: the prize amount was predetermined via Camelot’s random number generator at the point of ticket purchase, and the terms of the game were clear on what would constitute a win (and would have been obvious to any reasonable player, whether or not they had read the specific game procedure).
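For readers interested in the mechanics the court relied on, the short sketch below is a purely hypothetical illustration (in Python) of how an instant-win game can fix the prize via a random number generator at the point of purchase and treat the on-screen animation as a separate display layer, so that a rendering bug can mis-display, but cannot change, the result. It is not a description of Camelot’s actual system, and all of the names, odds and prize tiers are invented.

```python
import random

# Purely illustrative sketch: the outcome is fixed at purchase, the animation
# only displays it. This is NOT Camelot's actual system; all names, odds and
# prize tiers are hypothetical.

PRIZE_TIERS = [10, 100, 1_000_000]   # hypothetical prize amounts in GBP
WEIGHTS = [0.95, 0.049, 0.001]       # hypothetical odds

def purchase_ticket(rng: random.Random) -> dict:
    """The prize is determined by the RNG at the point of purchase and stored."""
    prize = rng.choices(PRIZE_TIERS, weights=WEIGHTS, k=1)[0]
    return {"ticket_id": rng.randrange(10**9), "prize": prize}

def render_animation(ticket: dict) -> str:
    """The front end only displays the stored result; a display bug here
    (like the coding error in the case) cannot alter what was actually won."""
    return f"Congratulations! You won £{ticket['prize']:,}"

if __name__ == "__main__":
    ticket = purchase_ticket(random.Random())
    print(render_animation(ticket))
```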
Key takeaways
Where does this leave consumers and traders alike? For now at least, the status quo remains unchanged, but the decision serves as a reminder of the importance of properly drafted terms and conditions, and of taking care to ensure that these are shared with customers in a way that is (1) clearly drafted and easily understood, (2) well signposted and (3) sensibly presented. It is something of a balancing act: terms must be brought sufficiently to the attention of customers without impacting the user experience so much that they simply quit the relevant website. There is no one-size-fits-all approach, as there is no single prescribed way of creating an online contract; what is appropriate will very much depend on the circumstances, and whether reasonable steps have been taken to bring terms sufficiently to a customer’s attention is a question of fact. Indeed, the court remarked that there may be circumstances where a drop-down list or a hyperlink may not be sufficient, although in this scenario they were.
Generative AI in litigation
Earlier this year, a Canadian lawyer landed themselves in hot water after it became apparent that an intended notice of application in a family law case included citations to two fictitious cases cooked up by ChatGPT.[2] Perhaps more surprisingly, this is not the first time a lawyer has been admonished by the courts as the result of an ‘AI hallucination’. In a similar and not-so-isolated incident, Michael Cohen, former attorney and ‘fixer’ for Donald Trump, passed a number of fake cases to his lawyer to include in support of a motion to end Mr Cohen’s supervised release early; those AI-generated fictions were then submitted to the court.[3]
Perhaps the most well-known incident in this context, however, is the now-infamous case of Mata v. Avianca. In June 2023, two New York attorneys were sanctioned by a US federal judge for submitting filings that contained numerous fictitious cases, complete with “bogus judicial decisions, with bogus quotes and bogus internal citations”,[4] in a client’s personal injury claim against the Colombian airline Avianca. Lawyers at Levidow, Levidow & Oberman used ChatGPT to conduct legal research for the filed briefing, taking the chatbot’s assurances that the cases were genuine at face value and without engaging in any further due diligence. The blunder came to light when Avianca’s lawyers queried the authenticity of the cases and wrote to the presiding judge, who then raised his own inquiries. To add to the professional embarrassment, the sanctions order required the lawyers to contact every real judge to whom the bogus quotations and rulings were attributed, to inform them of the sanction.[5]
Generative AI tools have some way to go before they can replace the expertise of practising legal professionals. That said, it is notable that their outputs, at least at first blush, have been convincing enough to fool a number of unwitting lawyers. In response to the rise in the use of AI tools, the US District Court for the Eastern District of Texas has been one of the first to adopt a rule requiring lawyers to certify the accuracy of any AI-generated content, with the federal US Court of Appeals for the Fifth Circuit considering similar action.[6] It remains to be seen whether other jurisdictions will follow suit; given that the prevalence of AI shows no signs of slowing, it seems reasonably likely. In delivering his judgment on the special costs order sought by the opposing side in the aforementioned family law case, Justice Masuhara summarised AI’s position in the current landscape: “…generative AI is still no substitute for the professional expertise that the justice system requires of lawyers. Competence in the selection and use of any technology tools, including those powered by AI, is critical. The integrity of the justice system requires no less.”
There is no doubt that the use of AI tools in all walks of professional life is becoming pervasive, but these cases serve as a stark reminder that, at least for now, they are merely tools, and fallible ones at that: they are no substitute for human judgment and oversight, and the work to integrate AI into the legal industry continues in earnest.
EU-UK adequacy and the DPDI Bill
Unsurprisingly, disentangling the UK from the EU has proved challenging, and discussions around what the UK’s post-Brexit legislation would look like have dominated much of the last few years. In the context of data protection at least, the European Commission approved the UK’s data protection regime by granting the UK an ‘adequacy’ decision, allowing data to flow freely between the EU/EEA and the UK without further safeguards, at least until 27 June 2025, when that decision is due to expire unless it is renewed.
Enter the UK’s Data Protection and Digital Information (No. 2) Bill (‘DPDI Bill’ or ‘Bill’), which is set to change (albeit not radically) the UK’s existing data protection regime. The intention is to simplify the existing regime, provide additional clarifications and introduce a risk-based approach aimed at reducing some of the regulatory burdens faced by businesses and scientific researchers. Some of the more significant changes include increasing the maximum fines under the Privacy and Electronic Communications Regulations 2003 (‘PECR’) to bring them into line with UK GDPR levels, removing the requirement for organisations to maintain Records of Processing Activities (‘ROPA’) except for high-risk processing, replacing Data Protection Officers with Senior Responsible Individuals, and introducing a list of “recognised” legitimate interests which are exempt from the balancing test. At the time of writing, the Bill is at report stage and is expected to become law some time in 2024. However, the Bill is not without its detractors, particularly in the EU, who are calling for a reassessment of the UK’s adequacy should the DPDI Bill pass in its current form.[7] Although in no way definitive, this serves as a useful litmus test for the general feeling within EU circles, which does not bode well for the UK’s future adequacy standing.
The Information Commissioner’s Office (ICO) recently published its own views on the Bill and, although supportive of its overall objectives and welcoming it as a “positive package of reforms”[8], noted that refinements were still needed in a number of key areas and that a large number of its comments remained unaddressed, particularly with respect to the definition of high-risk processing. The ICO also highlighted that the Government’s introduction of a number of new clauses “amount to substantive new policy that has not been the subject of wider public consultation”.[9]
The House of Lords European Affairs Committee has launched a formal inquiry to examine the data adequacy agreement between the EU and UK and its implications for the UK-EU relationship with a particular focus on the current arrangements, potential challenges and the implications of a possible disruption to the UK-EU adequacy regime. Responses were submitted by Friday 3rd May 2024, with the Committee’s report expected to be issued in July 2024.[10]
Whilst it is quite far along in the parliamentary process, the Bill’s passing is not yet a certainty, and its fate may yet rest on the outcome of the General Election. If it does not pass before the election, it is quite possible that a new administration will shelve the Bill altogether. As for the UK’s deemed adequacy, that remains to be seen, although no doubt the recent comments from the ICO will encourage EU officials to question more closely whether the level of data protection afforded under the UK Bill is ‘essentially equivalent’ to that of the current EU regime.
Software Resilience and Security: Government response
The Government has now responded to a call for views on software risks for businesses and organisations, identifying key proposals and areas of focus for the future.
Background
On 6 February 2023, the Government published a call for views (‘the Call’) on the cyber security risks of software and the nature of these risks for UK organisations, with a view to mitigating these risks through legislative action.[11] The Call focused upon three key areas:
- Software risks;
- Existing industry measures; and
- Future government action.
The Call concentrated on software used in an enterprise setting, focusing on software produced in complex supply chains in which multiple businesses and organisations play significant roles, and on risks across the breadth of the software lifecycle, with particular attention on accountability in the software supply chain and the protection of high-risk users.
The Government’s response acknowledged the impact that AI is having, and will continue to have, on the development of software. However, the Government stated that it would continue to gather evidence to understand the link between ‘traditional’ software and AI systems, as well as how a ‘secure by design’ approach can be implemented when AI is being developed. As such, the proposals outlined in the Government’s response do not tackle AI software directly.
Future government action
In its response[12], rather than introducing regulation, the Government will publish a voluntary code of practice for software vendors, which will align with the Secure Software Development Framework in the US. The focus of this code of practice will be to achieve consistency in software security and high-quality software development. The code will include sections on (a) managing risk effectively through communications and (b) good practice in developing software, including the adequate testing of third-party components that vendors use before selling. The code will act as a foundation for future Government codes and white papers, building upon existing text around the security of software.
The Government is also developing other materials to strengthen software security. These include the possible introduction of an accreditation scheme, giving software vendors the opportunity to demonstrate their compliance with the code of practice in procurement exercises, or to customers generally; ongoing stakeholder engagement will take place to ensure that any such scheme does not encourage bad practice. Standardised procurement clauses will also be created for organisations across all sectors, helping customers to make consistent requests and giving vendors greater certainty. Additionally, the Government is considering minimum security standards for software supplied directly to the Government.
The response also signposted the role of the National Cyber Security Centre (‘the NCSC’) in providing guidelines which will help to inform the voluntary code of practice. The Government will be assisting with the work of the NCSC, including through the provision of cybersecurity training for procurement professionals and the publication of content on the use of Software Bills of Materials (‘SBOMs’).
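By way of illustration only (and not taken from the Government’s response or NCSC guidance), the short Python sketch below assembles a minimal, CycloneDX-style SBOM record to show the kind of component-level inventory an SBOM captures. The application, component names and versions are invented, and in practice SBOMs are generated by dedicated tooling as part of the build pipeline rather than written by hand.

```python
import json

# Illustrative only: a minimal, CycloneDX-style Software Bill of Materials,
# built by hand to show the kind of information an SBOM records. The component
# names and versions below are invented.

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "metadata": {
        "component": {"type": "application", "name": "example-app", "version": "2.3.0"}
    },
    "components": [
        # each third-party dependency is listed with enough detail to trace it
        {"type": "library", "name": "requests", "version": "2.31.0",
         "purl": "pkg:pypi/requests@2.31.0"},
        {"type": "library", "name": "openssl", "version": "3.0.13",
         "purl": "pkg:generic/openssl@3.0.13"},
    ],
}

if __name__ == "__main__":
    print(json.dumps(sbom, indent=2))
```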
Next steps
Depending on uptake, the voluntary code of practice may eventually receive legislative backing. For now, no date has been given for the code’s publication. In the meantime, organisations should ensure that they have adequate access to information about vendors’ security practices and that vendors are held accountable throughout the software development lifecycle.
LockBit: have ransomware groups been sufficiently disrupted?
The National Crime Agency (‘the NCA’) has shared a key update on an international campaign targeting LockBit, a ransomware enterprise regarded as one of the most disruptive ransomware-as-a-service (‘RaaS’) providers in the world.
Background
The NCA, in conjunction with an international taskforce including the FBI and agencies from nine other countries, has been investigating LockBit because it provides individuals with access to a ransomware platform which can be used to carry out attacks around the world, with versions designed specifically to attack Windows and Mac environments.
LockBit ransomware attacks a victim’s network, stealing their data and encrypting their systems. A ransom is then demanded in cryptocurrency in return for decryption of the files and an assurance that the victim’s data will not be published on the dark web. LockBit has claimed that its ransomware platform was involved in attacks against Boeing and Royal Mail.
‘Affiliates’ (individuals who used LockBit ransomware to carry out attacks) were known to publish data stolen from victims on the dark web. The NCA also determined that even where a ransom was paid, data obtained via LockBit had not been deleted.
Action
In a major blow to LockBit and its affiliates, the NCA announced that it has taken control of (1) LockBit’s primary administration environment, used by affiliates to carry out attacks, and (2) LockBit’s leak site on the dark web, which was used to publish victims’ data.[13] The NCA was also able to obtain the LockBit platform’s source code and other information from its systems (including the details of nearly 200 affiliates registered with LockBit), which will inform the international taskforce’s action against LockBit and similar RaaS providers.
In particular, LockBit has operated a data exfiltration tool called Stealbit since 2021, used to extract files from a victim’s network and transfer them directly to the attacker. The infrastructure behind Stealbit was seized by the taskforce, along with 28 servers linked to LockBit, and all of Stealbit’s proxy servers were located and destroyed by the NCA.
Other action being taken by the taskforce against LockBit is ongoing. This includes:
- The US Department of Justice charging two individuals who used LockBit to carry out ransomware attacks;
- The US unsealing indictments against two other Russian individuals (thereby making the US government’s charges against them public) for conspiring to use LockBit to carry out ransomware attacks;
- Europol arresting two LockBit actors and freezing over 200 cryptocurrency accounts linked to LockBit; and
- The NCA obtaining over 1,000 decryption keys to assist UK-based victims.
It is hoped that, given the exposure of the LockBit infrastructure and the publicity around this action, affiliates will no longer use LockBit. However, the group has stated its intention to continue operating its systems and to increase attacks.
The action marks a continued focus by law enforcement agencies on ransomware enterprises, following the seizure of ALPHV’s site (also known as BlackCat) in December 2023. However, it should be a cause for concern (and a spur to further law enforcement action) that ransomware activity recorded in the first quarter of 2024 increased compared with Q1 2023.[14] This is a continued reminder for organisations to mitigate or prevent weaknesses that cyber criminals may seek to exploit via ransomware attacks.
[1] Parker-Grennan v Camelot Judgment.pdf.docx (judiciary.uk)
[2] B.C. lawyer who used fake, AI-generated cases faces law society probe, possible costs | Globalnews.ca
[3] Michael Cohen Used Fake Cases Cited by A.I. to Seek an End to Court Supervision – The New York Times (nytimes.com)
[4] A Man Sued Avianca Airline. His Lawyer Used ChatGPT. – The New York Times (nytimes.com)
[5] AI: Judge sanctions lawyers over ChatGPT legal brief (cnbc.com)
[6] Lawyers Must Certify AI Review Under Fifth Circuit Proposal (bloomberglaw.com)
[7] MEPs Say UK’s DPDI Bill Could Jeopardize UK-EU Data Transfers (forbes.com)
[8] Information Commissioner’s view on the Data Protection and Digital Information Bill (DPDI Bill) – Lords Committee stage | ICO
[9] UK: ICO publishes its view on the Data Protection and Digital Information Bill | News post | DataGuidance
[10] Data adequacy and its implications for the UK-EU relationship examined – Committees – UK Parliament
[11] Call for views on software resilience and security for businesses and organisations – GOV.UK (www.gov.uk)
[12] Government response on software resilience and security – GOV.UK (www.gov.uk)
[13] International investigation disrupts the world’s most harmful cyber crime group – National Crime Agency
[14] Q1 2024 Ransomware Report: 21% Increase in Q1 2023 Ransomware Activity (corvusinsurance.com)