Legal & Government Affairs Update January 2021
General Counsel at FAST
Covered in this update
We start the new year with hope, as the vaccine is rolled out and on the back of a trade deal with the European Union. In the spirit of looking ahead, we take a look at the themes that are likely to dominate the news in the months ahead. We have articles focusing on the future relationship with the EU in relation to digital trade and data protection arrangements; on some of the latest legal challenges being faced by the large digital platforms; on the latest moves to regulate AI; and on recent initiatives on global privacy and cybersecurity.
If there is anything you would like me to focus on in the coming months, please let me know.
Brexit: The impact of the trade deal on data protection arrangements
The UK and the EU have finally agreed a Brexit trade deal, formally known as the draft EU-UK Trade and Cooperation Agreement (the “Agreement”). A key area of interest over the past year has been the impact of any trade deal on data protection arrangements between the UK and the EU, and whilst the full details of the Agreement are still being unpicked, we now have a clear indication of what this looks like.
On the basis of article 45 of Regulation (EU) 2016/679, the European Commission has the power to determine whether a country outside the EU offers an adequate level of data protection. The effect of such a decision is that personal data can flow from the EU (and Norway, Liechtenstein and Iceland) to that third country without any further safeguard being necessary. In other words, transfers to the country in question will be assimilated to intra-EU transmissions of data.
Whilst the EU and UK have both committed to upholding high standards of data protection, there is no news yet on whether the Commission will deem the UK’s data protection regime “adequate”. This decision is a separate process from the trade deal, and we expect the Commission’s decision in the coming months.
Whilst there is hope that an adequacy decision will be reached, it should be noted that this is not guaranteed and there are clearly parts of the UK’s regime that the EU are concerned about.
For example, in the case of Privacy International v Secretary of State for Foreign and Commonwealth Affairs and Others, concerns were raised over the incompatibility of UK and EU law in relation to the retention and transmission of bulk data for national security purposes.
It is worth noting that, in the absence of an adequacy decision, UK organisations will be required to continue to comply with the EU (not UK) GDPR in relation to the personal data of non-UK data subjects. In reality, this is unlikely to make a material difference. However, there may be minor differences depending on which legislation applies. For example, UK courts would still need to pay "due regard" to any CJEU decisions made after 31st December 2020 in relation to the personal data of non-UK data subjects: (1) already processed under EU law before 31st December 2020 or (2) processed from 31st December 2020 on the basis of the Withdrawal Agreement (for example, pursuant to a provision of EU law that applies in the UK by virtue of the Withdrawal Agreement, such as its provisions on citizens' rights) (“Legacy Data”).
If an adequacy decision is granted then UK organisations may simply apply the UK GDPR to all of their data, including Legacy Data.
Whilst we await the Commission’s adequacy decision, the Agreement provides for a transition period of up to six months (beginning 1st January 2021), during which the transmission of personal data from the EEA to the UK shall not be considered as a transfer to a third country under EU law. This means that data can continue to flow on an interim basis without the need for additional protective measures, saving many organisations the time, money and uncertainty that would have been incurred in trying to make last-minute arrangements.
As an indirect consequence, it would also appear that the requirements of Schrems II (as reported in our last newsletter) will not apply during the transition period, since the UK would not be classed as a “third country”. We would expect the UK Information Commissioner’s Office (ICO) to publish a statement confirming this, as it is an area of uncertainty that many organisations may be thinking about.
As a concession to the interim period being granted, the UK has agreed that it will not amend its data protection laws from the form they take as at 31 December 2020, or exercise any of its "designated powers" in relation to international transfers, without the EU's agreement. Should the UK change any of its data protection laws (other than to align with EU data protection law), or exercise any of these designated powers without consent, the interim period will automatically end.
The Agreement also provides for cooperation between the EU and UK on DNA, fingerprint and vehicle registration data and the sharing of Passenger Name Records and criminal record information. Both parties have reserved the right to revoke these provisions in the event of “serious and systemic deficiencies” in the other side's data protection requirements, including (but not limited to) where an adequacy decision has been revoked by either side.
Global Privacy Expectations of Video Teleconference Providers: an update
In our last update, we reported on the use of Video Teleconferencing (VTC) services and the open letter published by six Authorities (including the UK ICO).
The letter recognised the value of video teleconferencing in keeping people connected, but set out concerns about whether privacy safeguards are keeping pace with increased risks from the sharp uptake of these services during the current pandemic. The joint signatories provided video teleconferencing companies with principles to guide them in addressing some key privacy risks.
The joint signatories sent the letter directly to Microsoft, Cisco, Zoom, Houseparty and Google. These companies were invited to respond and demonstrate how they take the principles into account in the development and operation of their video teleconferencing offerings.
Microsoft, Cisco, Zoom and Google replied to the open letter, highlighting various privacy and security best practices, measures, and tools that they advise are implemented or built-in to their video teleconferencing services. The ICO have stated that the responses provided are encouraging and serve as a constructive foundation for further discussion on elements of the responses that the joint signatories feel would benefit from more clarity and additional supporting information.
Moving forward, the joint signatories will undertake further engagement with these companies. They will invite more detail on the privacy and security safeguards built-in to the video teleconferencing services, and give the companies the opportunity to demonstrate how they achieve, monitor, and validate the measures set out in their responses.
TikTok faces legal challenge over children’s data
Social media platform TikTok, which enjoyed a meteoric rise in 2020, is facing a potential legal challenge from a 12-year-old girl for "loss of control of personal data". She is also seeking deletion of her personal data. The action is supported by the Children's Commissioner for England, Anne Longfield, who believes that TikTok has broken UK and EU data protection laws. Her hope is that the High Court will issue an order forcing the firm to delete the child's data, setting a precedent and creating greater protective measures for under-16s who use TikTok in England and possibly beyond.
The app, which has reportedly been downloaded 2 billion times online, allows users to upload short clips of themselves, often lip-syncing or dancing to music. The focus of the preliminary hearing was to establish whether the case could proceed with the 12-year-old claimant remaining anonymous. Mr Justice Warby granted this, judging that the girl risked “hostile reactions from social media influencers who might feel their status or earnings were under threat”.
There are concerns that the company have unlawfully processed children’s personal data “in order to garner advertising revenue from corporate clients”.
Charles Ciumei QC, acting for the claimant, stated that “the personal data at issue is used in an algorithm which analyses the user's preferences in order to tailor the content presented to them to capture and keep their attention. This in turn encourages use of the app and, although it is stated in the app's terms of service that it is not for use by those under 13 years old, it is clear that a large number of users are under that age”.
He argues that the app “is targeted specifically at children with some of its most prominent stars aged 13 or thereabouts, having joined the app at a younger age”, and that the personal data collected by TikTok is “extensive”. This includes users' names, dates of birth and location, as well as photographs and videos they have made and "device information, IP address, information from connected accounts such as Facebook, browsing history, cookies and metadata".
Ms Longfield is awaiting the ruling of an outstanding data protection case against Google before deciding whether to pursue a claim against TikTok. In 2019, TikTok was fined $5.7m (£4.2m) by the US Federal Trade Commission for its handling of children's data, with South Korea also issuing a fine for similar reasons.
In a statement, TikTok said: "Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to protect all users, and our younger users in particular. As this application was made without notice, we first became aware of the application and the High Court's judgment and are currently considering its implications".
This case is likely to attract a great deal of attention and provides yet another timely reminder of the challenges that lie ahead in ensuring that companies collect and use personal data ethically. We are already witnessing a rise in data subjects demanding ownership and greater transparency over how their personal data is used, and we expect this trend to continue in the coming year.
Microsoft and McAfee headline newly-formed 'Ransomware Task Force'
Nineteen security firms, technology companies (including Microsoft and McAfee) and non-profit organisations have announced the launch of the Ransomware Task Force (RTF), aimed at tackling the growing threat of ransomware.
The group’s focus will be the assessment of existing technical solutions that provide protection during a ransomware attack. It will commission expert papers on the topic, engage stakeholders across industries and identify gaps in current solutions, with a view to establishing a standardised framework for dealing with ransomware attacks, based on an industry consensus rather than individual advice received from lone contractors.
Business Email Compromise (BEC) and Email Account Compromise (EAC) continue to represent the largest cyber security threat to individuals and organisations, however the prevalence of ransomware is also increasing.
The Institute for Security and Technology issued a statement alongside the press release announcing the RTF’s launch, stating that ransomware attacks “transcend sectors and requires bringing all affected stakeholders to the table to synthesize a clear framework of actionable solutions, which is why IST and our coalition of partners are launching this Task Force for a two-to-three month sprint”. The Ransomware Task Force website, including full membership details and leadership roles, is expected to launch this month. With this in mind, we would expect to see some form of standardised guidance from the RTF by April this year.
International Trade Committee launches inquiry into digital trade and data
The House of Commons International Trade Committee have launched a new inquiry into digital trade and data. It defines digital trade as "digitally enabled, or digitally delivered, trade in goods and services", and notes that such trade involves the movement of data. The purpose of the inquiry is to explore a range of issues including digital trade and data provisions in free trade agreements (FTAs), concerns around the security and privacy of data and the environmental impact of digital trade, and relevant legal frameworks.
The call for evidence welcomes submissions by 5.00 pm on 12 February 2021, addressing some or all of the following questions:
- What are the main barriers faced by UK businesses engaging in digital trade?
- What opportunities does digital trade present for UK businesses?
- How does the regulation of digital trade impact consumers?
- What approach should the UK take to negotiating digital and data provisions in its future FTAs?
- What does the UK-Japan trade agreement indicate about the UK's approach to digital trade and data provisions in future trade negotiations?
- What approach should the UK take towards renewing the WTO's moratorium on customs duties on electronic transmissions?
- What objectives should the UK have when negotiating digital and data provisions during its accession to the CPTPP?
- Will the global increase in digital trade affect the environment in a positive or negative way? What steps can be taken to mitigate any negative environmental impacts of increased digital trade?
- What domestic and international law is relevant to the government's approach to digital trade?
Over the course of 2021, the UK is expected to continue or commence FTA negotiations with Australia, New Zealand, the US, Mexico, Canada, and the non-EU members of the EEA (Norway, Iceland and Liechtenstein), as well as seek accession to the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP) and commence negotiations for a digital economy agreement with Singapore. The findings of the inquiry are likely to be of particular relevance to these negotiations, given that they are expected to include commitments on digital trade and data.
House of Lords Liaison Committee report on artificial intelligence
In December, the House of Lords Liaison Committee published a report entitled “AI in the UK: No Room for Complacency”. The report stems from the recognition that the government needs to better coordinate its artificial intelligence (AI) policy and the use of data and technology by national and local government. The report examines the progress made by the government in implementing the recommendations made by the Select Committee on Artificial Intelligence in its report published in 2018.
The main conclusions and recommendations of the report are that:
- The government needs to improve the co-ordination of its artificial intelligence (AI) policy and the use of data and technology by national and local government.
- There is a clear consensus that ethical AI is the only sustainable way forward.
- The government must move from deciding what the ethics are to how to instil them in the development and deployment of AI systems.
- The increase in reliance on technology caused by the COVID-19 pandemic has highlighted the opportunities and risks associated with the use of technology and data. Active steps must be taken by the government to explain to the general public the use of their personal data by AI.
- The government must take immediate steps to appoint a chief data officer, whose responsibilities should include acting as a champion for the opportunities presented by AI in the public service, and ensuring that understanding and use of AI, and the safe and principled use of public data, are embedded across the public service.
A full copy of the Report can be accessed here.
Google v Oracle: the API copyright battle
It would be remiss not to mention the ongoing case of Google v Oracle, which represents something of a blockbuster in the world of intellectual property law.
The U.S. Supreme Court case caps a decade of battle between the two Silicon Valley behemoths over Google’s alleged unauthorised use of 11,500 lines of declaring code from Oracle’s Java APIs (Application Programming Interfaces) in its Android operating system.
Last year, the Supreme Court agreed to hear Google’s petition to address two questions:
- Whether copyright protection extends to a software interface; and
- Whether Google’s use of a software interface in the context of creating a new computer program constitutes fair use.
Google have sought to show that APIs fall within a subset of computer code that is so functional that it cannot be protected by copyright. According to Google, because declaring code allows one program to efficiently communicate with another, it constitutes an essential building block of software development that allows different computer programs to seamlessly communicate with each other.
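To make the terminology concrete, the distinction at the heart of the case can be sketched in a few lines of Java. The class and method below are hypothetical illustrations for this newsletter, not the actual Java SE code at issue:

```java
public class MiniMath {
    // "Declaring code": the method's name, parameters and return type.
    // This is the part other programs rely on to communicate with it,
    // and the part Google reused so Java developers' existing knowledge
    // would carry over to Android.
    public static int max(int a, int b) {
        // "Implementing code": the instructions that actually do the work.
        // Google wrote its own implementations rather than copying Oracle's.
        return (a >= b) ? a : b;
    }

    public static void main(String[] args) {
        // Any caller only needs to know the declaration, not the body
        System.out.println(MiniMath.max(3, 7));
    }
}
```

In short, Google copied the declarations (the signatures developers already knew) while supplying its own method bodies; the questions before the Court are whether those declarations attract copyright at all and, if so, whether reusing them is fair use.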
In March 2018, the Federal Circuit ruled in favour of Oracle, finding that Google’s use of Java APIs was not fair as a matter of law, reversing the district court’s earlier decision. The case was appealed to the Supreme Court in January 2019, after the Federal Circuit denied rehearing in August 2018.
On 7 October 2020, the U.S. Supreme Court heard oral arguments from both sides, with some critics believing that the line of questioning from the eight Supreme Court Justices indicates that the Court will likely affirm the ruling that Oracle’s software interface is protectable by copyright.
In the software industry, it is common practice for technology companies to repurpose APIs in order to make their products interoperable with the most popular platforms. Would a ruling against Google set a significant precedent giving technology companies the control to limit how other companies use their APIs? Could we see an industry shift towards companies monetising their APIs and charging licence fees for them, similar to Software-as-a-Service (SaaS) products? This remains to be seen.
However, from a legal perspective, it is worth pointing out that any judgment handed down in the United States would have no applicability to the UK’s intellectual property regime, where APIs are not protected by copyright. This principle is derived from the European Software Directive (2009/24/EC), which continues to apply as retained EU law post-Brexit. Article 1(2) of the directive explicitly states that “ideas and principles which underlie any element of a computer program, including those which underlie its interfaces, are not protected by copyright”.
Whilst the CJEU has never ruled directly on the question of whether APIs are protected by copyright, it has considered closely related questions about software functionality. In SAS Institute, Inc. v World Programming Limited, it was held that “neither the functionality of a computer program nor the programming language and the format of data files used in a computer program in order to exploit certain of its functions constitute a form of expression of that program".
Although the final ruling will only have direct applicability to the U.S., many of us within the industry await the final decision with keen interest. The significance of a ruling in favour of either party cannot be overstated, and whilst COVID-19 has heavily disrupted the case, we would expect a ruling in the coming months.
Book Review - Don't Be Evil: The Case against Big Tech – Rana Foroohar
Rana Foroohar, an award-winning Financial Times columnist, provides a challenging critique of Big Tech and how it should be regulated. According to Foroohar, Google and Facebook receive 90% of the world's news ad-spending. Amazon takes half of all e-commerce in the US. Google and Apple operating systems run on all but 1% of mobile phones globally.
She shows the steps that have been taken by the FAANGs (Facebook, Apple, Amazon, Netflix and Google) to absorb competitors, monetise personal data and minimise their tax liabilities. Foroohar lays out the case for a regulatory framework that fosters innovation while also offering greater consumer protection measures. Her book is worth reading as it clearly reflects the mood within the political class, as we can see measures now being proposed for the regulation of Big Tech on both sides of the Atlantic. Her contribution is a timely one in that it brings together in one place all the arguments being advanced for such regulation.