May Data Wrap: A snapshot of key regulatory developments

On 22 May 2023, following the adoption of a binding decision by the EDPB, the Irish Data Protection Commissioner (“DPC“) concluded its own-volition inquiry against Meta regarding the legality of international data transfers from Meta Ireland to the US.

The DPC concluded that such transfers infringed the GDPR and directed Meta to suspend its transfers to the US within six months of the decision and to bring its transfers into compliance with the GDPR. It also issued a fine of €1.2 billion which, whilst undoubtedly extremely significant, sits at the lower end of the range of fines the EDPB recommended, the DPC having originally decided not to impose a fine at all.

The decision highlights that not even the 2021 SCCs can be relied upon in all circumstances, and it remains unclear what organisations that wish to send personal data to companies in the US are supposed to do.

See our blog post on the decision and its implications here.

Following the European Data Protection Board’s (“EDPB“) opinion expressing doubts over the European Commission’s draft EU-US Data Privacy Framework (“DPF“) adequacy decision, the European Parliament adopted a resolution on 11 May 2023 which further endorses the EDPB’s concerns.

Members of the European Parliament have challenged the DPF for the following reasons:

  • Bulk data collection: The DPF allows US entities to collect data in bulk without being subject to independent prior authorisation in certain circumstances and provides no clear rules on data retention;
  • The Data Protection Review Court: The DPF creates the Data Protection Review Court and empowers it to make decisions in secret, which members of the European Parliament consider would violate EU citizens’ rights of access and rectification; and
  • Right to redress: The DPF does not provide data subjects in the EU with rights to redress and access to information equivalent to those afforded to US citizens.

Whilst the resolution is not legally binding, it is expected to influence the European Commission's upcoming decision on whether the DPF will be granted adequacy.

For further background information please refer to “Privacy Shield 2.0” here.

Following a complaint from a data subject that they had suffered harm which was 'insulting' and 'shameful', causing upset, a loss of confidence and unnecessary public exposure, the CJEU has ruled that infringement of the EU GDPR alone is insufficient to give rise to a right to damages. It further ruled that the requirement to show damage would be 'superfluous' if a breach on its own were enough to warrant compensation.

In its judgment in the Austrian Post case (available here), the court confirmed that there are three cumulative requirements to the right to compensation contained under the GDPR: (i) an infringement of the legislation; (ii) material or non-material damage to the individual; and (iii) a causal link between limbs (i) and (ii).

However, the court stopped short of imposing a seriousness threshold for non-material damage, and did not address whether 'loss of control' constitutes non-material damage conferring a right to compensation.

The judgment therefore appears to be one of two halves: it confirms that infringement alone is not enough for compensation, but at the same time refuses to impose any threshold of seriousness and stops short of commenting on the assessment of potential damages. We will therefore have to wait and see how each Member State interprets the rules to assess damages in any particular case.

In another CJEU ruling this month, the court considered the extent of an organisation’s obligations in response to a DSAR.

The case concerned CRIF, a consulting business that provides third-party credit scores. The claimant asked CRIF to provide him with a copy of the documents containing his personal data. CRIF sent a summary table of his personal data rather than the underlying documents. The data subject complained to the Austrian Data Protection Authority, which found that CRIF had not infringed his access rights; certain questions were subsequently referred to the CJEU.

The CJEU ruled that the data subject must be given a faithful and intelligible reproduction of all of their personal data. 'Copy' was interpreted as relating to the personal data itself rather than the underlying document, but a descriptive summary of the data will not suffice. The obligation might be fulfilled by providing an extract of the document; in some circumstances, however, the entire document may be required.

The CJEU ruling is available here.

On 26 April 2023, the CJEU ruled on complaints from data subjects that the Single Resolution Board (an EU agency which acts as the resolution authority for a subset of banks in the euro area) shared their pseudonymised data without informing them.

The Single Resolution Board used electronic forms to gather responses from interested parties which the agency then shared with a consulting firm. Each respondent was assigned a code in an attempt to pseudonymise the information.

The CJEU highlighted that it was important to consider the data recipient’s perspective and found that transmitted data can be considered anonymised rather than pseudonymised if the data recipient has no additional information which would allow them to re-identify the data subject, or legal means to access such information.

The CJEU ruling is available here.

On 28 April 2023 the European Commission proposed amending the EU Cybersecurity Act to: (i) increase the remit of the European Cloud Services scheme; and (ii) introduce a new subcategory of cybersecurity certification for cloud service providers (“CSPs“).

Under the proposal, cybersecurity certification would become mandatory for managed security services, which include essential or important entities belonging to a sector of high criticality under the Networks and Information Security Directive.

The new subcategory of cybersecurity certification would add ‘high+’ to the established assurance subcategories. To achieve certification of the ‘high+’ subcategory, CSPs would be required to fulfil sovereignty criteria in order “to provide some guarantees about the [entity’s] independence from non-EU law.” These criteria include requirements:

  • for entities with effective control over CSPs to be based in the EU;
  • for technical and organisational measures to ensure the primacy of EU law;
  • to conduct all data processing activities in the EU except for some limited circumstances; and
  • to employ specific internal controls to govern employee access to customer data.

Following the release of the US's blueprint for an AI Bill of Rights in October 2022, the White House announced in May 2023 that it plans to dedicate more funding and policy guidance to developing responsible AI. This comes ahead of the long-anticipated public evaluation, during this year's DefCon, of top AI industry models from companies including Google, Microsoft, Nvidia and OpenAI.

The Office of Management and Budget also announced that it will publish draft rules this summer dictating how the federal government should use AI technology. These actions form part of the US's wider goal of ensuring that the private sector fulfils its ethical, moral and legal responsibilities and that its products are safe.

For further information around other international approaches to regulating AI please refer to “Spotlight on AI Regulation” here.

To subscribe to our HSF Data Blog please click here.

Miriam Everett
Partner
+44 20 7466 2378
Claire Wiseman
Professional Support Lawyer
+44 20 7466 2267
Duc Tran
Of Counsel
+44 20 7466 2954
Angela Chow
Senior Associate
+44 20 7466 2853
Lauren Wilkinson
Associate
+44 20 7466 2483
Katie Reid
Paralegal, London
+44 20 7466 2962

FOLLOWING META, WHAT NEXT FOR INTERNATIONAL DATA TRANSFERS?

On 22 May 2023, following the adoption of a binding decision by the European Data Protection Board (the “EDPB“), the Irish Data Protection Commissioner (“DPC“) concluded its own-volition inquiry against Meta regarding the legality of international data transfers from Meta Ireland to the US. The DPC concluded that such transfers infringed the GDPR and directed Meta to suspend its transfers to the US within six months of the decision. As directed by the EDPB, it further issued a fine of €1.2 billion and ordered Meta to bring its transfers into compliance with the GDPR.

This decision comes shortly after the DPC’s separate ruling (which also involved a prior binding decision from the EDPB) on the lawful bases relied upon by Meta in the provision of its Facebook and Instagram services.

Background and Decision

In July 2020, the CJEU's Schrems II decision invalidated the EU-US Privacy Shield that was used as a mechanism to transfer personal data from the EU to the US. In light of the Schrems II decision, the Irish DPC commenced an 'own-volition inquiry' on 28 August 2020 into (i) the lawfulness of Meta's international data transfers in respect of EU/EEA individuals to the US pursuant to standard contractual clauses and (ii) whether corrective powers should be exercised by the Irish DPC pursuant to Article 58(2) of the GDPR.

Given Meta’s European headquarters in Ireland and the scope of its European operations, the Irish DPC assumed the role of lead supervisory authority under the GDPR and followed the decision-making process set out in Article 60 of the GDPR, under which the lead supervisory authority is required to cooperate with and consult other concerned supervisory authorities (“CSAs”). In July 2022, the DPC issued its draft decision for consultation to the CSAs to provide “relevant and reasoned objections” pursuant to Article 60(4) of the GDPR. A number of CSAs disagreed with the DPC’s approach and, after failing to reach a consensus, the matter was ultimately referred to the EDPB for a binding decision pursuant to Article 65 of the GDPR.

Pursuant to the EDPB’s direction, the DPC ruled that Meta’s use of the EU standard contractual clauses (both the 2010 and 2021 versions), even when combined with extensive supplementary measures (organisational, technical and legal), did not address the risks posed by US surveillance laws and therefore failed to provide an adequate level of protection for transfers of personal data of EEA/EU data subjects; such data transfers were accordingly unlawful. Further, Meta could not rely on any derogation under Article 49(1) of the GDPR to justify such data transfers in the usual course.

Analysis and Implications

The Fine

The €1.2 billion fine is the largest fine imposed under the GDPR by any supervisory authority to date. Although the level of the fine falls short of the maximum 4% of annual worldwide turnover available under the GDPR, it highlights the disagreement and lack of consensus between the Irish DPC, the CSAs and the EDPB. The Irish DPC did not originally propose any fine, considering the corrective measures it had proposed (i.e. an order to suspend transfers) to be sufficient. However, four of the CSAs disagreed and, in its own decision, the EDPB directed that a fine should be imposed and should be set at a level representing 20% to 100% of the applicable legal maximum. An (admittedly rough) calculation suggests that the €1.2 billion fine issued by the DPC was towards the lower end of this scale, perhaps demonstrating a desire on the part of the DPC to comply with the EDPB’s direction only to the minimum extent permissible.

Can we rely on Standard Contractual Clauses now?

The decision is fundamental in that it casts doubt on the adequacy of the European Commission-approved standard contractual clauses that are widely used by organisations, large and small, for cross-border data transfers.

The standard contractual clauses are a legal mechanism provided for under the GDPR to give protection to personal data being transferred outside of the EEA. The 2020 Schrems II decision cast doubt upon this mechanism but a ‘new’ set of standard contractual clauses was published by the European Commission in 2021 (the “2021 SCCs“) and contained provisions intended to address some of the shortcomings identified in the Schrems II case.

However, the Meta decision seems to put everyone back to square one by confirming that not even the 2021 SCCs can be relied upon in all circumstances. And if the standard contractual clauses are not an option, the Irish DPC’s decision also makes clear that the derogations set out in the GDPR (e.g. where the transfer is necessary for a contract) may only be relied upon on an exceptional basis. As of now, it remains unclear what organisations that wish to send personal data to companies in the US (particularly tech companies subject to FISA) are supposed to do.

Can we put in place supplementary measures?

It is clear from the DPC decision that it considered that Meta had put in place a variety of “supplementary measures” over and above reliance on the standard contractual clauses in order to try and ensure adequate protection for data subjects. These included both technical and organisational measures. However, although these measures could be said to mitigate risks, the DPC was of the view that they could not compensate for the deficiencies in US law highlighted in the Schrems II case. Put bluntly, it appears that there are no supplementary measures that could be put in place where the data importer is subject to FISA legislation and has access in the clear to the personal data transferred.

Does this mean all EU-US transfers need to stop?

For Meta, in the short term it seems clear from its response to the decision that it will be appealing and seeking a stay of the order to suspend data transfers. For everyone else, this will hopefully provide some much needed breathing space to (once again) map their data transfers; understand the extent to which personal data is being transferred to the US (both directly and through their vendor supply chain); and consider the ways in which it may be possible to reduce or mitigate risks associated with any such US transfers.

Is there going to be an EU-US adequacy decision to solve this?

In the medium term, the solution for international data transfers to the US seems likely to be the adoption of a new US adequacy decision. The DPC commented in its decision on EO 14086, the executive order adopted by US President Joe Biden that introduces the framework for additional controls and protections under US law. However, the protections under EO 14086 are not yet operational and the jury is out on whether, once available, these protections could provide further comfort for data transfers to the US. Even if there is a new EU-US adequacy decision, it seems likely that this would be challenged and may well end up being invalidated like the Safe Harbor and Privacy Shield before it. However, any such challenge takes time and so, in theory at least, a new adequacy decision would provide organisations with a few years of greater certainty.

What else?

Crucially, US data transfers are only a small subset of data transfers taking place every day and the decision highlights the urgent need for a stable and long-term solution to the issue of international data transfers. The DPC’s decision will not affect Meta’s operations in the UK, but the challenge of transferring data lawfully from the UK to the US (and elsewhere) remains and it will be interesting to see how the ICO responds.

Finally, the decision provides yet another example of the functioning and the efficacy of the one-stop shop mechanism. Recent examples have shown that it is challenging for supervisory authorities to reach consensus on the application of the GDPR, requiring referrals to the EDPB. Perhaps it is timely that the EDPB published a statement ahead of the fifth anniversary of the GDPR calling for legislation to harmonise the procedures for cooperation between data protection authorities in cross-border data protection cases.

 

Miriam Everett
Partner
+44 20 7466 2378
Asmita Singhvi
Associate
+44 20 7466 3697

 

Pseudonymised data is personal data – but in whose hands? ICO calls for views on third chapter of draft anonymisation guidance

On 7 February 2022, the Information Commissioner’s Office (“ICO”) announced the publication of the third chapter of its draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies (the “Draft Guidance”). Following on from the first and second chapters published on 28 May 2021 and 8 October 2021, respectively, which focus on anonymisation, the new third chapter aims to clarify the much debated concept of pseudonymisation.

HAPPY INTERNATIONAL DATA PRIVACY DAY: OUR PREDICTIONS FOR 2022

Happy International Data Privacy Day! And what better day than today, to explore what 2022 is likely to have in store for data and privacy?

One year on from the introduction of the UK GDPR in a post-Brexit Britain. Two years on from the start of a global pandemic which forced a discussion around the tension between public health and data privacy. And over three years on from the GDPR coming into force across Europe, and by extension the world. But the passing of time does not appear to have diminished the worldwide focus on data and privacy issues.

In this post, we set out some predictions for data protection and privacy UK and EU developments in the year to come.

UK Data Protection Reform

2021 was the year that the UK Government hinted that it might think outside of the box in terms of data protection regulation. In September 2021, the UK Department for Digital, Culture, Media and Sport (“DCMS“) published its wide-ranging consultation on data protection reform. The DCMS Consultation is the first step in the Government’s plan to deliver on ‘Mission 2’ of the National Data Strategy, underpinned by a desire to boost innovation and economic growth for UK businesses while strengthening public trust in the use of data. The proposals were expansive, seeking to create an adaptable and dynamic set of data protection rules that underpin the trustworthy use of data. They mark a move away from a rigid set of rules, towards a more outcome focussed regime, in order to reduce burdens on business. The consultation closed in November 2021 and the results are expected in Spring 2022. For further detail about the reform proposals, please see our blog post, available here.

A new regulator for the UK

On 4 January 2022, John Edwards began his new role as UK Information Commissioner, on a five-year term. The new Commissioner spent the past eight years as New Zealand Privacy Commissioner, and before that worked as a barrister. He succeeded Elizabeth Denham CBE, whose term as UK Information Commissioner ended last year. The new Information Commissioner’s agenda and priorities will become clearer during his first full year in the role, but it seems likely that one of his top priorities for 2022 will be the introduction of the Age Appropriate Design Code to better protect children online, together with the Online Safety Bill.

The fallout from enforcement – privacy notices and cookies

2021 saw some significant enforcement action, including fines of EUR 746 million, EUR 225 million and EUR 150 million. Interestingly, these significant fines did not result from big data security breaches; rather, we have seen a regulatory focus on data protection principles (particularly transparency) and cookies. Whilst in the UK at least it is possible that the current rules around cookie consents may be ‘relaxed’ as a result of the data reform proposals described above, it seems likely that this kind of significant enforcement could result in widespread updates to privacy notices and cookie practices in 2022. For further details regarding the likely impact on privacy notices in particular, please see our summary, available here[1].

Testing the EU cooperation mechanism

Although 2021 has seen significant EU GDPR enforcement action as described above, it has also shone a spotlight on the apparent differences of opinion between Member State regulators when it comes to enforcement. In the 2021 WhatsApp enforcement action, objections raised by the EU regulators to the Irish Commissioner’s proposed enforcement resulted in a referral to the EDPB for resolution. In December 2021, concerned MEPs also sent a letter to EU Commissioner Reynders to raise concerns about how the Irish Commissioner enforces the GDPR and applies the GDPR’s cooperation mechanism. The MEPs reportedly asked Commissioner Reynders to initiate infringement proceedings against the Irish Commissioner. What is clear is that there is a significant discrepancy between EU supervisory authorities regarding enforcement and the appropriate approach to the same. Could 2022 be the year that the GDPR’s cooperation mechanism is tested to its limits? Or could we see individual Member State regulators forging their own path?

International data transfers – Volume 1 (EU SCC re-papering)

On 27 September 2021, the new EU standard contractual clauses (“New EU SCCs”) came into force for the transfer of personal data from the EEA to third countries under the EU GDPR. From that date, the New EU SCCs have been used for any new agreements entered into that rely on model EU data transfer clauses to legitimise the transfer of personal data from the EEA to third countries under the EU GDPR. Existing agreements incorporating the old EU SCCs remain valid and provide appropriate safeguards until 27 December 2022, meaning that for many organisations 2022 is likely to involve the not insignificant task of “re-papering” agreements relying on the old EU SCCs and replacing them with the New EU SCCs. For further details regarding the New EU SCCs, please see our blog posts, available here and here.

International data transfers – Volume 2 (the UK position)

In August 2021, the UK Information Commissioner published a consultation on international data transfers. The regulator published a draft international data transfer agreement to address transfers of personal data outside of the UK; a draft international transfer risk assessment guidance note and tool; and a draft UK addendum for inclusion to the European Commission’s standard contractual clauses. The consultation closed on 7 October 2021 and we expect to see legislative proposals in 2022, which will finally give organisations certainty on the approach that the UK is taking to international data transfers, although it is unlikely to be the end of the data transfer saga depending upon the results of the DCMS data protection reform consultation described above. For further details regarding the ICO’s international data transfer proposals, please see our blog post, available here.

International data transfers – Volume 3 (Safe Harbor 3.0?)

Shortly after the Schrems II judgment, the US Department of Commerce and the European Commission initiated discussions to evaluate the potential for an enhanced EU-US Privacy Shield framework to comply with the ruling. However, discussions do not seem to have obviously progressed much during 2021 and, without root and branch reform of US surveillance law, it remains unclear how any such framework would avoid the fate of its predecessors the Privacy Shield and US Safe Harbor. Could 2022 be the year that governments in multiple jurisdictions manage to find a way through the legal complexities raised by the Schrems II judgment in order to allow the international transfer of data on a practical level?

ePrivacy and cookies

We have covered the proposed ePrivacy Regulation in our previous data protection predictions and yet the question remains as to whether 2022 is going to be the year that this legislation makes it through the process. Even without the proposed new EU Regulation, some EU regulators have made their focus on cookies very clear – the CNIL has recently taken significant enforcement action against both Google and Facebook for breaches of the cookie rules. The recent DCMS data protection reform consultation also focussed in part on cookies and questioned the appropriateness of the current rules relating to cookie consents. As a result, whether via legislative reform or regulator action, it seems clear that cookies will be a special dish in 2022.

Tech vs data regulation – the race continues

2021 has seen a continued focus from organisations and regulators alike on innovative technologies and, in particular, AI. Uptake of AI by organisations appears to have increased alongside attempts by data protection regulators to keep pace, protect the privacy of individuals, and ensure fairness in an increasingly AI-driven world. An example of this was the UK Information Commissioner’s 2021 consultation in relation to the use of the beta version of its AI and data protection risk mitigation and management toolkit. We expect to see even more focus in 2022 on the use of AI and innovative technologies against the backdrop of data privacy legislation. For further details on the ICO AI consultation, please see our blog post, available here.

Class actions reborn?

In November 2021, the Supreme Court overturned the Court of Appeal’s decision in the high profile Lloyd v Google case, which could have opened the floodgates for class actions for compensation for loss of control of personal data to be brought on behalf of very large numbers of individuals without identifying class members. The case was brought under the DPA 1998, rather than the GDPR which superseded it. Whilst there may be read across to the current UK GDPR regime, Lord Leggatt specifically stated that he was not considering the later legislation and this could potentially leave the door open for future loss of control claims under the current law. After Morrisons and now Lloyd v Google, could 2022 be the year that we see another attempted data class action reach the courts? For further details regarding the Supreme Court judgment in the Lloyd v Google case, please see our blog post available here.

[1] First published by LexisNexis in October 2021

Miriam Everett
Partner, Digital TMT, Sourcing and Data, London
+44 20 7466 2378
Duc Tran
Of Counsel, Digital TMT, Sourcing and Data, London
+44 20 7466 2954
Angela Chow
Senior Associate, London
+44 20 7466 2853
Chloe Kite
Associate, London
+44 20 7466 2540

New Telecommunications Telemedia Data Protection Act (TTDSG) comes into effect on 1 December 2021

More legal clarity

On 1 December 2021, a new law regulating data protection and privacy in telecommunications and telemedia will come into effect: the German Telecommunications Telemedia Data Protection Act (TTDSG). The TTDSG contains new provisions on digital legacy, privacy protection for terminal equipment and consent management. It intends to create more legal certainty and legal clarity for the protection of privacy in the digital world: for example, it aims to stem the cookie deluge and give website visitors more control over the data collected about them. But not only that: it intends to provide more clarity in the regulatory jungle of the EU General Data Protection Regulation (GDPR), the ePrivacy Directive (yet to be implemented in Germany), the German Telemedia Act (TMG) and the German Telecommunications Act (TKG). To this end, the data protection provisions of the TMG and the TKG are repealed and merged into the TTDSG. In the process, adjustments were also implemented that were necessary due to the GDPR and the ePrivacy Directive.

ICO publishes consultation on the AI and data protection risk toolkit

Executive Summary

  • On 12 October 2021, the Information Commissioner’s Office (“ICO“) opened its consultation in relation to the use of the beta version of its AI and data protection risk mitigation and management toolkit (the “Consultation“).
  • The Consultation runs until 1 December 2021 and the ICO is seeking responses from all industry sectors and from organisations of all types that engage in the “development, deployment and maintenance of AI systems that process personal data”.
  • The AI and data protection risk mitigation and management toolkit (the “AI Toolkit“) provides organisations with a framework against which to assess internal AI risk by identifying potential risks for consideration and offering practical, high-level steps on how organisations can mitigate such risks.


Online Privacy Code: More Transparency and Minimum Standards for Online Platforms in Australia

A G20 nation moves to a modernised privacy code for online platforms, including binding rules. The proposed scope, and the stakes for industry players, are substantial.

On 25 October 2021, the Australian Attorney-General’s department released, for public consultation, an exposure draft bill introducing amendments to the Privacy Act 1988 (Cth) (the Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021 (Cth) “Online Privacy Bill”) and a discussion paper seeking submissions on broader reforms to Australian privacy legislation (“Discussion Paper“). Our overview of the Online Privacy Bill and Discussion Paper is available here.

One of the main amendments proposed by the Online Privacy Bill is the introduction of a framework allowing the Office of the Australian Information Commissioner (OAIC) to register an OAIC- or industry-developed, enforceable online privacy code (“OP Code”) that would be binding on all large online platforms, social media services and data brokerage services providers (“OP Organisations”). This would supplement the current provisions under Part IIIB of the Privacy Act dealing with the development and registration of, and compliance with, APP codes that set out how one or more of the Australian Privacy Principles (APPs) will apply to a particular entity or class of entities (and may impose additional requirements). Currently there are two registered APP codes: one developed by the OAIC for Australian government agencies, and one developed by the Association of Market and Social Research Organisations (now the Australian Data and Insights Association) for its members.

Large online platforms and social media services are broadly defined in the Online Privacy Bill. This means a wide range of organisations with online operations could be affected by the proposed OP Code, going beyond the ACCC’s recommendation in its 2019 digital platform inquiry final report to create a privacy code enforceable against social media platforms, search engines and other digital content aggregation platforms.

Along with the Bill’s removal of the condition that a foreign organisation must collect or hold personal information in Australia to be subject to the Privacy Act, this would also capture an organisation that collects the personal information of Australians from a digital platform that does not have servers in Australia.

KEY TAKEAWAYS

Submissions on the new Online Privacy Bill close on 6 December 2021. In engaging with the consultation and preparing for the implementation of the OP code, impacted organisations should have regard to the following issues:

  • The proposed OP Code will prescribe how OP Organisations must comply with certain APPs (including the description of uses and disclosures of personal information in privacy policies, as well as notice and consent requirements). It will also impose further requirements on OP Organisations to stop using or disclosing information on reasonable requests, and with respect to their interaction with children or other vulnerable individuals.
  • Many of the changes that the Online Privacy Bill proposes to introduce through the OP Code in respect of OP Organisations echo similar reforms contemplated in the context of the discussion paper for the broader economy (e.g. introducing a right to object, and amending the Privacy Act to expressly provide that consent should be voluntary, informed, current, specific, and unambiguous and privacy notices be clear, current and understandable).
  • A breach of the OP Code would be treated as an interference with the privacy of an individual, exposing OP Organisations to strengthened penalties (of up to the greater of $10 million, three times the value of the benefit obtained (where determinable) or 10% of the relevant annual turnover) and the reinforced enforcement mechanisms otherwise contemplated in the Online Privacy Bill and the Discussion Paper.
  • Particular restrictions regarding the use of the personal information of children align with similar rules under overseas data protection regimes including the EU General Data Protection Regulation (GDPR) and reflect a global regulatory focus on the safety of children using social media and the internet generally.
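As a rough illustration of the penalty cap described above (the figures are as summarised in this briefing; the function name and inputs are hypothetical and this is not legal advice), the proposed maximum is simply the greatest of three amounts:

```python
def max_penalty_aud(benefit_value=None, annual_turnover=0.0):
    """Illustrative sketch only: the proposed cap is the greater of
    AUD 10m, 3x the value of the benefit obtained (where determinable),
    or 10% of the relevant annual turnover."""
    candidates = [10_000_000.0, 0.10 * annual_turnover]
    if benefit_value is not None:  # only applies if the benefit is determinable
        candidates.append(3.0 * benefit_value)
    return max(candidates)

# e.g. an entity with AUD 200m annual turnover and a determinable AUD 5m benefit:
print(max_penalty_aud(benefit_value=5_000_000, annual_turnover=200_000_000))  # → 20000000.0
```

For large platforms, the 10%-of-turnover limb will usually dominate; the AUD 10 million floor matters mainly for smaller entities.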

Our full briefing, which focuses on the implications under the Online Privacy Bill for a potential new OP Code and identifies the various types of organisations that will likely qualify as OP Organisations, can be found here.

Kaman Tsoi
Special Counsel, Melbourne
+61 3 9288 1336
Marine Giral
Solicitor, Melbourne
+61 3 9288 1496

A new direction for the UK’s data protection regime? The devil is in the detail

A new dawn for UK data protection regulation is upon us, ushering in a “golden age of growth and innovation” according to the UK Government. As Elizabeth Denham’s term as the UK Information Commissioner draws to a close at the end of this month, this comes hot on the heels of the UK National AI Strategy and the DCMS’ Consultation Paper (Data: a new direction) on reform of the UK data protection regime.

The DCMS Consultation is the first step in the Government’s plan to deliver on ‘Mission 2’ of the National Data Strategy, underpinned by the desire to boost innovation and economic growth for UK businesses while strengthening public trust in the use of data.

At 146 pages long, the DCMS Consultation is both comprehensive and wide-ranging. In this blog post we highlight some of the key proposals in the Consultation, alongside the similarly thorough ICO response to those proposals from last month.

Key takeaways:

  • The UK path: In a post-Brexit world, the DCMS’ proposed reforms have potential to significantly alter the data protection landscape in the UK. They aim to establish a “pro-growth and innovation friendly” data protection regime that is more practical and “business friendly“. The proposals intend to be more proportionate and flexible in nature, focussing on a more risk-based approach and representing a shift away from a “one size fits all” approach to compliance with data regulations. They mark a move away from a rigid set of rules, towards a more outcome focussed regime, in order to reduce burdens on business.
  • Wide-ranging reform: The proposals are expansive, seeking to create an adaptable and dynamic set of data protection rules that underpin the trustworthy use of data. The consultation concentrates on six key areas:
    1. Reducing barriers to responsible innovation, for instance by relaxing/simplifying the rules around organisations’ reliance on the legitimate interests condition to justify the processing of personal data (making it easier to rely upon in the context of conducting research and managing AI systems), clarifying the concept of data anonymisation and removing/limiting some of the restrictions currently placed on conducting automated decision making under Article 22 of the GDPR;
    2. Reducing burdens on businesses and delivering better outcomes for people by amending the current accountability framework with the introduction of risk-based “privacy management programmes” and removing existing requirements in relation to conducting DPIAs, appointing DPOs and maintaining detailed records of processing in line with Article 30 of the GDPR. The proposals also seek to raise the threshold for reporting data breaches to the ICO, so that only breaches posing a ‘material risk’ to individuals are reportable;
    3. Reworking rules in relation to cookies and direct marketing including aligning the ICO’s fining powers under the PECR regime with those under GDPR;
    4. Boosting trade and reducing barriers to data flows including proposals around the use of alternative transfer mechanisms and adopting a more “risk-based” approach to granting UK adequacy decisions to other jurisdictions;
    5. Delivering better public services by allowing the processing of personal data for public health and emergency situations and providing guidance on the lawful grounds of processing; and
    6. Reforming the ICO to achieve the above goals by implementing new objectives and accountability mechanisms, for example by refocussing its statutory commitments away from handling a high volume of low-level complaints and towards addressing the most serious threats to public trust.

Further detail on the range of proposals is set out below at “The deeper dive: Key DCMS proposals, their impact and the ICO response“.

  • Pro-business in practice? Organisations have already invested considerable time and cost in their own GDPR compliance in recent years. Whilst the proposed reform is stated to “offer improvements within the current framework” and is well-intentioned in theory, it remains to be seen whether the proposed divergence from the existing (EU-based) regime will in fact realise the benefits suggested by the DCMS, and whether it really is more “business friendly” in practice.

The proposals appear to be skewed more heavily towards benefiting smaller organisations in particular, which have historically struggled with the burden of data protection compliance. However, with the added administrative layer for organisations (first having to assess their current EU GDPR compliant practices against any new UK requirements, as well as exercise a greater level of discretion as to how best to comply), there is a risk that the reform may prove to be no less burdensome overall, at least at the outset.

And that is without factoring in the added layer of complexity for organisations operating across both a UK and EU footprint and needing to comply with dual diverging regimes – albeit that these multi-jurisdictional organisations may well continue to apply the potentially higher EU “gold standard” across all jurisdictions for consistency. Data protection compliance is not an exact science at the best of times and the proposed divergence may therefore unintentionally introduce further “grey” areas and a greater degree of uncertainty for organisations.

  • Adequacy: How far is too far? But perhaps the biggest question remains over whether the UK is able to maintain its EU adequacy status in the face of these significant proposed data protection reforms. Adequacy does not require a carbon copy of the EU GDPR framework, and it may be that an element of divergence is possible if the UK continues to maintain a sufficiently protective data regime.

It is too early to tell at this stage. As the ICO response emphasises, as the proposals develop “the devil will be in the detail” (to ensure the final package of reforms adequately maintains rights for individuals). It will also be important to consider the overall impact of the package as a whole and how the various and plentiful proposals all fit together.

It is one thing to remove burdens on organisations to deliver growth, but quite another if doing so creates further barriers in the process; at its worst, loss of UK adequacy status, increased costs to organisations of using alternative transfer mechanisms and, ultimately, interruption of the free flow of data between the EU and the UK – a scenario that both the EU and the UK will ideally want to avoid. See “Adequacy: How far is too far? That is the question” below.

  • The ICO Response – supportive with some reservations: Whilst broadly supportive of the reform, the intent behind it and the proportionate risk-based approach (recognising that high data protection standards cannot remain “static”), the ICO’s response is peppered with numerous reservations; unsurprisingly taking a more data subject focussed stance – often seeking clarity on additional safeguards to be put in place to ensure data subject rights are not jeopardised and more generally welcoming further consideration of the proposals. It also has strong concerns around reform of the ICO’s own leadership structure, which could put the independence of the regulator at risk.

With the “pragmatic” current serving New Zealand Privacy Commissioner, John Edwards, taking the UK ICO helm from January of next year (and with a remit that goes beyond the regulator’s traditional role of focussing only on protecting data rights), it will be interesting to see how these reservations will be reconciled in the short term. Not least, given the DCMS is keen to finalise the proposals set out in the Consultation in the coming months.

The ICO also confirmed it is “crucial we continue to see the opportunities of digital innovation and the maintaining of high data protection standards as joint drivers of economic growth. Innovation is enabled, not threatened, by high data protection standards“.

  • Going against the grain? At a time when we are seeing increased data protection regulation at an international level, as well as territories looking to harmonise their data protection regimes, for now the UK seems to sit in contrast with its focus on divergence and deregulation.

Background: The UK balancing act

The issue of international data transfers has long been the main area of concern from a data protection perspective regarding Brexit; particularly whether or not the UK ensures an essentially equivalent level of data protection to that guaranteed under EU legislation. The European Commission’s adequacy decision confirmed the UK as an adequate jurisdiction for GDPR purposes on 28 June 2021. This allowed the free flow of data from the EU and EEA Member States to the UK without the need to put in place additional measures to legitimise the transfer (such as so called “EU Standard Contractual Clauses” or EU SCCs).

One of the key elements of the decision was that the UK’s data protection system continued to be based on the same rules that were applicable when the UK was a Member State of the EU. However, strong safeguards were also incorporated into the decision; these included the unique so-called “sunset clause” limiting the duration of the adequacy decision and the Commission’s close monitoring of how the UK system evolves (the Commission is entitled to suspend, terminate or amend the decision at any time in the case of problematic developments that negatively affect the level of protection found to be adequate). In turn, this has potential to restrict the extent to which the UK is able to diverge from the EU GDPR regime going forwards.

Since the UK left the EU, there have been suggestions that it may pursue a more relaxed, business-minded approach to data. In particular, the DCMS’ National Data Strategy and the Government’s “10 tech priorities” sought to pave the way for harnessing and “unlocking the value” of data across the economy – an approach mirrored and built on in the DCMS Consultation.

However, such an approach will clearly need to be carefully balanced against the UK’s position on data vis-à-vis the EU, particularly to ensure that any divergence from EU legislation is seen as sufficiently protective if the UK is to continue to benefit from the adequacy decision. See “Adequacy: How far is too far? That is the question” below.

Adequacy: How far is too far? That is the question

The “business friendly” intentions behind the Consultation indicate a clear intention to diverge from the EU regime and reform the UK rules on data protection so that “they’re based on common sense, not box-ticking. And…having the leadership in place at the Information Commissioner’s Office to pursue a new era of data-driven growth and innovation.”  (Digital Secretary, Oliver Dowden)

But how far is too far to diverge? When does a “business” and “innovation” friendly approach start to erode the level of protection afforded to data transferred from the EU to the UK and jeopardise the recently determined UK adequacy decision? It is one thing to remove barriers to international data transfers in order to deliver growth, but quite another if doing so creates further barriers in the process; at its worst, potentially leading to the European Commission revoking the UK adequacy decision, increased costs to organisations of using alternative transfer mechanisms and, ultimately, interruption of the free flow of data between the EU and the UK – a scenario that both the EU and the UK will ideally want to avoid.

In the DCMS Consultation itself, the DCMS suggests it is possible to maintain adequacy – on the basis that adequacy does not mean a “word for word” replication of EU legislation – more a shared commitment to high standards of data protection. It cites examples of other EU adequate jurisdictions where this is the case.

The DCMS also makes it clear that any reformed regime will conform to high data protection standards and must be “underpinned by secure and trustworthy privacy standards”. Given the reservations in the ICO’s response, particularly those around data subject rights, as the proposals develop it remains to be seen whether the proposed reform does in fact sufficiently prioritise maintaining public trust in the UK’s data protection regime and, in turn, prioritise the UK’s adequacy status.

However, the Consultation is also accompanied by an impact assessment which includes the direct financial impact on UK businesses if the UK were to lose its EU adequacy status; this totals £1.4 billion over five years (the period in which compliance and SCCs would feed through to affected organisations). There is clearly a fair amount at stake if the UK Government get it wrong.

The Deeper Dive: Key DCMS proposals, their impact and the ICO responses:

The proposals set out in the Consultation are categorised into six key areas below:

i. Reducing barriers to responsible innovation: the proposal seeks to create an adaptable and dynamic set of rules that are flexible enough to be interpreted quickly and clearly in order to fit the fast-changing world of data-driven technologies – in doing so, supporting the Government’s pro-growth, pro-innovation stance. Proposed initiatives include:

  • Easier and more certain reliance on legitimate interests as a lawful basis for processing personal data, through the creation of a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test. Whilst the processing would still need to be proportionate and necessary for the stated purpose, this would create a new set of legal bases which would satisfy the legitimate interests test and would give organisations more confidence to process personal data without unnecessary over-reliance on the (transitory and often more challenging to obtain) consent basis, as is currently the case.

Amongst other categories, the proposed list includes “ensuring bias monitoring, detection and correction in relation to AI systems”, “using audience measurement cookies or similar technologies to improve web pages that are frequently visited by secure users” and “using personal data for internal research and development purposes or business innovation purposes aimed at improving services for customers”. Whilst the Government acknowledges that any list would need to be sufficiently generic to “withstand the test of time”, it envisages updating the list via regulation-making powers, which seem likely to be invoked given the limited and exhaustive nature of the current (albeit relatively uncontroversial) list.

  • AI and automated decision making: the Government recognises the ability for data-driven AI systems to bring huge benefits, alongside a need to deploy AI tools that innovate responsibly and manage data-related risks at every stage of the AI life cycle. The Consultation considers the interplay of AI technologies with the UK’s current data protection regime.

In particular, and perhaps most controversially, the DCMS invites evidence on proposals to remove or more tightly limit the restrictions on automated decision making (including profiling) under Article 22 of the GDPR (where the decision-making “produces legal effects concerning” an individual or “similarly significantly affects” that individual). The current restrictions include the right to human review, giving individuals the right to challenge and request review of a decision. Whilst the DCMS recognises safeguards are meaningful in some cases (e.g. high-risk AI-derived decisions), especially as there is currently no clear approach and standards for wider AI governance, the current operation and efficacy of Article 22 are thought to be uncertain (with limited case law and guidance available in particular). The DCMS is mindful that these provisions need to keep pace with the likely evolution and proliferation of automated decision making and profiling. This follows a previous recommendation from the Taskforce on Innovation, Growth and Regulatory Reform to remove the provision in its totality, instead permitting use on the basis of legitimate interests or public interests (see above).

Another key area of reform in the use of AI relates to anonymisation and the case for a clear legal test for determining when data will be regarded as “anonymous”; organisations currently rely on the ICO’s code of practice on anonymisation and the recitals to the UK GDPR. The Consultation sets out a couple of options in this regard, including elevating recital 26 of the UK GDPR into the legislation (based on the “reasonable likelihood” that the controller is able to identify the data subject, although this is unlikely to add much to the related ICO guidance in this area). The DCMS is also considering legislation to confirm that whether data is anonymous is relative to the means available to the data controller to re-identify it (as per the approach of the CJEU in Breyer v Germany when assessing whether dynamic IP addresses constitute personal data).

In an effort to maximise the ease with which organisations can share and process data responsibly, the Consultation supports the development and use of “data intermediaries” as well (e.g. entities that can provide technical infrastructure and expertise to support interoperability between datasets or act as mediators negotiating sharing arrangements between parties looking to share, access or pool data). Whilst this could potentially benefit data sharing particularly for research and development purposes (by introducing a new innovative data sharing framework within the existing data protection regime), there is limited information on this proposal currently in the Consultation itself.

The DCMS has also raised concerns regarding the scope and substance of ‘fairness’ as it applies to the development and deployment of AI under the existing data protection regime, determining that the concept may be best left to sector-specific regulation rather than the ICO. The UK’s forthcoming AI governance framework is likely to provide further clarity on this early next year.

  • Use of personal data for research purposes: among other things, the proposal seeks to incorporate a clearer definition of ‘scientific research’ into legislation (principally based on the related recital), consolidate the related provisions, consider the appropriate lawful basis for scientific research, and consider whether the regime should enable data subjects to provide broad consent in circumstances where it is not possible to fully identify the purpose of personal data processing at the time of data collection. The Consultation acknowledges that the data protection regime is currently challenging to navigate and that the provisions relating to research are relatively complex and dispersed within the current framework. This is thought to create both real and perceived barriers for organisations, which sit at odds with the National Data Strategy’s aim of encouraging reform to support research in the UK.

Key ICO Response

The ICO understands the drivers for greater certainty around use of legitimate interests as a lawful basis. However, rather than removing the need for the balancing test, the ICO envisages a shift in responsibility for carrying out the test from organisations to the Government – therefore requiring the Government to feel confident that the processing on any such list does not have a disproportionate impact on data subject rights. The ICO therefore expects the nature, context and detail of the processing to be set out more clearly to provide organisations with the necessary certainty to determine whether their own activities are covered (including how the Government has assured itself that the processing on the list will not have a negative impact without the need for further case-by-case consideration of the balance). It also called for the Government to provide more detail on how this proposal would interact with the exercise of individuals’ rights (for example, the right to object to the processing of personal data).

The ICO also supports the Government’s proposals relating to research and recognises the need to build trust regarding the fairness of AI and automated decision making. However, it does not agree that removing the right to human review at Article 22 is in the interest of data subjects and feels that this is likely to reduce trust in AI. Instead, it suggests extending Article 22 to cover partly (as well as wholly) automated decision making. The ICO agrees that providing guidance through engagement with stakeholders (including on what constitutes a “legal or similarly significant effect”) will help clarify what is acknowledged as a complex area, as well as looking more closely at how transparency could be strengthened to ensure human review is meaningful.

ii. Reducing burdens on businesses: the proposal seeks to shift away from a ‘box-ticking’ compliance regime towards one which is more based on a proportionate, flexible and risk-based accountability framework. This would require an organisation to develop and implement a risk-based “privacy management programme” (“PMP“) that reflects the volume and sensitivity of personal information it handles and the type(s) of processing that it carries out. The PMP would include the appropriate policies and processes for the protection of information.

Whilst the proposed changes to the existing framework are relatively significant, in a marked deviation from the EU GDPR requirements, the perceived benefit to small and micro-businesses of “reducing burdens” may well not be realised in practice. In particular, it is possible that the proposal simply substitutes existing accountability requirements with similar (but no less onerous) ones, adding a further administrative layer for organisations: first having to assess their current EU GDPR-compliant practices against any new UK requirements, as well as exercising a greater level of discretion as to how best to comply (albeit the DCMS has suggested the ICO may provide related guidance in order to assist).

In particular, the proposed amendments to the existing accountability framework include:

  • Removing the requirement to appoint DPOs and instead exercising discretion in designating a suitable individual, or individuals, to be responsible for the PMP and for overseeing the organisation’s data protection compliance. Whilst this is intended to drive “more effective data protection outcomes” (without the need for the individual to be sufficiently independent, as is currently the case), the DCMS acknowledges there is a risk that removing DPOs could significantly weaken internal scrutiny. Some organisations, e.g. those undertaking high risk processing, may therefore still opt to designate a DPO-type equivalent to independently monitor and assess their organisation’s data protection compliance (to help demonstrate its commitment to the accountability principle), but this would need to be in addition to the proposed individual responsible for the PMP.
  • Removing the requirement for organisations to carry out DPIAs, to allow organisations to adopt different approaches to minimising data protection risks that better reflect their specific circumstances. While the removal of DPIAs means there is an increased risk of organisations undertaking processing that is high risk without adequate prior assessment of the impact of the processing, the Government considers that this will be mitigated by having in place an appropriate PMP.
  • Removing the requirement for prior consultation with the ICO in advance of carrying out high risk processing that cannot be mitigated. Removing the immediate threat of enforcement action is envisaged to encourage and incentivise organisations to engage with the ICO for guidance on high risk processing.
  • Replacing the record-keeping requirements under Article 30 with personal data inventories which explain what personal data is held, where it is held, why it has been collected and how sensitive it is. The new requirements under the PMP would allow further flexibility in how best to do this depending on the organisation’s own circumstances.
  • Changing breach reporting requirements due to the administrative burden on the ICO as a result of over-reporting. This would involve a shift in threshold from reporting a breach unless it is unlikely to result in a risk to the rights and freedoms of natural persons, to a requirement to report a breach unless the risk to individuals is not material. The Government suggests the ICO will publish guidance and examples of what constitutes a non-material risk. This proposal also contemplates a new voluntary undertaking process, similar to that in Singapore, which would allow organisations that can demonstrate a proactive approach to accountability to provide the ICO with remedial action plans following a breach, which the ICO may authorise without taking any further action.
  • Amending the data subject access request provisions to introduce a cost limit modelled on the Freedom of Information Act. The Consultation also proposes a nominal fee regime as was the case under the DPA 1998, which is stated not to undermine an individual’s right to access their personal data. The proposal largely seeks to address concerns from organisations that they are overburdened when processing subject access requests, particularly wide-ranging, speculative requests (e.g. as a means to circumvent strict disclosure protocols otherwise required under the Civil Procedure Rules).

Key ICO Response

The ICO acknowledges that there are ways in which the legislation can be simplified, particularly to ensure the regulatory and administrative compliance obligations are proportionate to the risk an organisation’s data processing activities represent. Whilst it welcomes the Government’s commitment to retaining the principle of accountability and is open to alternative approaches to ensuring and demonstrating accountability, the ICO believes further work is required to demonstrate both the additional value that PMPs could deliver and whether the intended benefits could be achieved through more minor changes instead – particularly in light of the potential disruption and additional burden for business that significant change to the existing approach could bring, given the considerable resource organisations have already put into their current approach. Adequate time and resources would be needed for any such transition to take place effectively.

The ICO further agreed there is scope for more flexibility regarding DPIAs; however, it noted that any reform to risk assessments must not reduce the robustness or quality of the assessment. Accordingly, the ICO has called for additional information on how organisations can adequately assess data protection risk. The ICO also re-highlighted the benefits of appointing a DPO (given the significant expertise, value and assurance the role can bring to data protection compliance) and suggested that this should not be lost with the reforms. The ICO drew parallels to designated roles in other sectors, for example, an ‘approved person’ under the Finance Act or a Money Laundering Reporting Officer under the UK Money Laundering Regulations. The ICO considers that DPIAs and an appointed DPO will derive greater value and protection for individuals than the Government’s current proposals. It seems likely that organisations will share the same reservations about the reality of these amendments to the accountability framework.

Regarding changes to data subject access requests, the ICO reiterated the importance of these requests, noting that this importance is only set to increase with the increased collection, use and re-use of data supported under the reforms. Concerned as to whether the proposed changes would in fact inhibit the exercise of this right, the ICO requested further evidence to accurately assess the benefits and risks associated with the proposals – not least to avoid disproportionate outcomes for data subjects, including the most vulnerable. The ICO’s alternative suggestion for addressing the burden of subject access requests (through, for example, use of new technologies when procuring and configuring new IT systems, as well as streamlining internal data management processes) may well not be sufficient on its own in practice.

iii. Reworking rules in relation to cookies and direct marketing: Whilst the focus of the Consultation remains firmly on reform of the UK GDPR, some elements of the proposals touch on the Privacy and Electronic Communications Regulations (“PECR”). On the face of it, these appear relatively minor and are likely to be welcomed by organisations and data subjects alike, particularly with the momentum currently behind initiatives to reduce the current “cookie fatigue”. These proposals include:

  • Changes to cookies rules as the Government considers how best to balance issues relating to organisations’ ability to collect data to improve websites versus user complaints about the number of pop ups and impact on user journey. Two options are suggested.
    • The first would allow organisations to use analytics cookies and similar technologies without users’ consent (i.e. treated in the same way as “strictly necessary” cookies under the current legislation, which is the approach currently adopted in France).
    • The second would permit organisations to store information on, or collect information from, a user’s device without their consent for other limited purposes (i.e. a possible list of exemptions).
  • Extending the existing soft opt-in relating to direct marketing (i.e. beyond just organisations where they have previously formed a relationship with an individual during a sale or transaction) to non-commercial organisations (such as charities or political parties) and perhaps as a result of a membership or subscription.
  • Increased enforcement under PECR (i.e. bringing fines in line with the UK GDPR); allowing the ICO to issue fines of up to £17.5 million or 4% of global turnover (compared to the current £500,000 cap). This would align with the sanctions regime envisaged under the proposed EU ePrivacy Regulation, which is still making its way through the European legislative process; depending on the relative timing, the proposals have the potential to subject UK-based organisations to these more stringent sanctions earlier than their European counterparts.
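As a back-of-the-envelope sketch of the proposed PECR cap (the function name is hypothetical; this is an illustration of the figures quoted above, not legal advice), the fine ceiling is the greater of £17.5 million or 4% of global annual turnover:

```python
def max_pecr_fine_gbp(global_turnover):
    """Illustrative cap only: the greater of GBP 17.5m or 4% of global
    annual turnover, mirroring the UK GDPR-aligned figures proposed."""
    return max(17_500_000.0, 0.04 * global_turnover)

# The 4% limb only bites once global turnover exceeds GBP 437.5m:
print(max_pecr_fine_gbp(1_000_000_000))  # → 40000000.0
```

Compared with the current £500,000 ceiling, this would be an increase of well over an order of magnitude even for organisations where only the fixed limb applies.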

Key ICO Response

As made clear by Elizabeth Denham at the G7 summit earlier in the year, the ICO agrees that the current approach to cookie pop-ups is not practical for data subjects or organisations and welcomes change to the cookie rules. Whilst the ICO is broadly supportive of the two options proposed by the Government, it requests further clarity on how a possible list of exemptions (without requiring user consent) would work in the context of the wider reforms in the Consultation – particularly in light of the list of legitimate interests for which organisations can use personal data without applying the balancing test (see above), which could have the overall impact of removing appropriate safeguards.

Regarding the proposal to use browser and non-browser based solutions, the ICO acknowledges that this will also require that adequate enforcement measures be put in place to ensure that users’ preferences are sufficiently respected. The ICO saw benefit in extending the existing soft opt-in to non-commercial organisations, provided existing safeguards continued to apply. The ICO also urged the Government to further consider legislating against the use of “cookie walls” (which require users to consent to cookie settings in order to access an online service’s content), given these arrangements can have the effect of removing meaningful choice by data subjects and therefore give rise to a risk of unfairness to those subjects.

Unsurprisingly, the ICO supports the Government proposal to raise fines under PECR and would like to engage with the Government on the potential benefits and costs of bringing the whole of the PECR enforcement toolkit in line with that of the DPA 2018 as well. Again, if pursued and depending on timing, this has the potential to give rise to disparity, with UK-based organisations subject to more stringent enforcement than their European counterparts.

iv. Boosting trade and reducing barriers to data flows: The Government’s ambition is for the UK to be a leader in digital trade and hopes to support international data flows as part of its plan to do so. Given the flurry of developments in the international data transfer arena in the last 12 months, this area remains one to watch at both the UK and EU levels; particularly the fallout from the Schrems II judgement, the EDPB guidance on supplementary measures, the new EU SCCs and the draft UK equivalents. The DCMS Consultation proposals add a further level of complexity (and divergence from the EU regime) in this area, and follow the DCMS’s UK Global Data Plans which included an ambitious programme of priority data adequacy assessments and partnerships (with countries including the US), as well as a UK approach to adequacy assessments – please refer to our related blog for further information.

In particular, the proposals in the Consultation suggest a “risk-based” approach to UK adequacy decisions, focussing more on outcomes than on a line-by-line comparison of the respective legislation, and suggesting a greater focus on proportionality. The proposals suggest that adequacy regulations could be made even in respect of “groups of countries, regions and multi-lateral frameworks” (for example where they share harmonised data protection standards). The proposals aim to relax the requirement to review adequacy regulations every four years, instead placing an emphasis on ongoing monitoring of countries’ relevant laws and practices, given that adequacy is increasingly seen as a “living mechanism”.

The Government is also considering legislative amendments to ensure the suite of alternative transfer mechanisms available to UK organisations in the UK GDPR (set out in Article 46 and that permit international transfers of personal data to countries that are not subject to an adequacy decision) are clear, flexible and provide necessary protections for personal data. In particular, this is with a view to developing:

  • proportionality (providing more detailed, practical support for organisations determining and addressing the risks facing data subjects in practice, particularly for smaller organisations). Other proposals also include introducing a “reverse transfer exemption” from the scope of the UK international transfer regime, to alleviate friction for UK businesses where an outbound data transfer is already subject to sufficient protection as part of the inbound transfer to the UK;
  • flexibility and future-proofing (to more adequately reflect the rapidly changing international transfers landscape, as opposed to the current exhaustive list of alternative transfer mechanisms). This will complement the work already underway by the ICO to help organisations take better advantage of existing options for tailored transfer mechanisms, such as Binding Corporate Rules, Codes of Conduct and Certification Regimes. Other proposals include empowering organisations to create or identify their own alternative transfer mechanisms, in addition to those listed in Article 46 – likely to benefit organisations with complex data transfer requirements in particular, for example by designing and using bespoke contracts to permit safe international transfers, which would supersede the existing option of developing bespoke data protection clauses requiring approval from the ICO. This is similar to the approach adopted in New Zealand’s data protection regime. The Consultation also considers permitting repetitive use of derogations under Article 49, which could provide flexibility and assurance for organisations that need to rely on them in certain limited but necessary circumstances. Derogations are currently used only as a last resort to legitimise international transfers and, even then, only in very limited circumstances and under specific conditions where adequacy and alternative transfer mechanisms are unavailable; and
  • interoperability (to ensure the UK regime is compatible with any potential new international transfer regimes regardless of the mechanisms they use to transfer data). Whilst a valid and important factor for organisations, it is currently unclear how, and the extent to which, this will be achievable in practice given the intention to diverge from the EU regime in particular and the related complexities in doing so. The proposals also include modifications to the certification schemes framework to provide for a more globally interoperable market-driven system that better supports the use of certifications as an alternative transfer mechanism.

Key ICO Response

The ICO appreciates the need for “real-time flows of data in the digital economy”, whilst also maintaining high standards of data protection in the UK. It supports the proposed risk-based, practical approach to balance these requirements and welcomes the idea of alternative approaches to ensure this is the case. However, the ICO also requests further clarity in a number of areas around how this risk-based approach and the proposed alternatives would work in practice – to fully understand the implications of reform in this area and what proportionate safeguards are intended – emphasising the importance to UK business of retaining the UK’s own EU adequacy status.

In particular, on permitting repetitive derogations, the ICO highlights that a fine balance is needed. Where a transfer is repetitive and predictable, use of an alternative international transfer mechanism under Article 46 (wholly or partly) may be more appropriate. However, the ICO accepts that, where this is not possible, reliance on a derogation may still be “necessary and proportionate”, provided adequate measures are put in place, such as requiring the data exporter to document the approach taken and the safeguards applied. In light of the increased flexibility and range of transfer tools suggested under the reform, the ICO also highlighted the importance of considering the proposals as a whole package, not least because the reforms taken together may reduce the need for flexibility around repetitive derogations.

On the proposed reverse transfer exemption, whilst the ICO supports changes to reduce burdens in a proportionate manner, it suggests any issues faced by UK organisations when making these transfers may in fact be reduced following the outcome of the ICO’s consultation on international transfers and any revised guidance in light of its own interpretation of restricted transfers and the extra-territorial effect of the UK GDPR. It therefore encouraged the Government to investigate how effective this exemption may be in reducing complexity in practice.

The ICO touched on its own “proactive action” in this area as well, focussing on the equally “risk-based practical approach” suggested in its proposed International Data Transfer Agreement and Transfer Assessment, which also sought “interoperability” to some extent with the new EU SCCs – please refer to our related blog here.

v. Delivering better public services: the Government wishes to use personal data for the purpose of improving the delivery of public services while also maintaining a high level of public trust. Proposals in this regard support easier data sharing – both between different public authorities, as well as between public bodies and private companies processing on their behalf. In particular, the Consultation clarifies that private companies, organisations and individuals who have been asked to carry out an activity on behalf of a public body may rely on that body’s lawful ground for processing the data and do not have to identify a separate lawful ground to legitimise the processing of personal data. This is intended to support further collaboration between the public and private sector, particularly in light of the benefits achieved during the COVID-19 pandemic.

Key ICO Response

The ICO agrees that data sharing can help public bodies and other organisations to deliver modern, efficient services that make individuals’ lives easier. It also acknowledges that certain safeguards are in place to ensure that public authorities and officials are accountable for determining that all relevant aspects of the public task lawful ground are satisfied and that public interest is protected. However, the ICO called for further clarity on the extent to which these would apply to private bodies in these circumstances to ensure that data subject rights are sufficiently protected.

vi. Reform of the Information Commissioner’s Office: the Government intends to improve the legislative framework that underpins the powers, role and status of the ICO, setting new and improved objectives and accountability mechanisms. This includes refocussing statutory commitments away from handling a high volume of low-level complaints and towards addressing the most serious threats to public trust. The new statutory framework is intended to set out the strategic objectives and duties that the ICO must fulfil when exercising its functions, including placing new duties on the ICO to have regard to economic growth, innovation and competition when discharging its functions. Amongst other suggestions, the Government also proposes a new governance model for the ICO, aligning with the structure adopted by other regulators such as Ofcom and the FCA (i.e. with a CEO and independent board). The Secretary of State would appoint the CEO and approve (or reject) ICO guidance.

Key ICO Response

Whilst the ICO supports some of the proposed changes (including strengthening the ICO’s supervision and enforcement powers and elements of the new duties when exercising its function), it also raised strong concerns with other elements – particularly reform of its leadership structure (and the proposed approval powers granted to the Secretary of State), potentially putting the independence of the ICO (from the Government) at risk. The ICO believes that in order to maintain and build public trust, the regulator must have the ability to regulate independently. This seems a valid concern and, given the need for the Government to work closely with the ICO to further develop the proposals under the reform, one which the Government will need to reconsider closely as part of its response to the Consultation.

Next steps: Spotlight on stakeholder responses

The proposals set out in the Consultation have the ability to significantly change the data protection landscape in the UK and, in turn, the compliance requirements for businesses operating in the UK – a particular headache for those needing to comply with the dual EU and UK regimes. However, the true impact of this “new dawn” on the full spectrum of businesses operating in the UK (particularly whether the intended benefits of the proposals are realistic in practice), will only be known further down the reformation process, once the detail of any legislative changes is published.

Either way, the UK Government is clearly making waves in forging its own data protection path ahead in the wake of Brexit, in some cases currently at odds with the ICO, its own data protection supervisory authority. It will be interesting to see how those pinch points, in particular, will develop and whether they can be reconciled with the significantly more data subject focussed views of the ICO. We therefore expect (and encourage) a wide range of stakeholder responses to the Consultation by the 19 November 2021 deadline. Watch this space.

Miriam Everett
Partner, Digital TMT, Sourcing and Data, London
+44 20 7466 2378
Duc Tran
Of Counsel, Digital TMT, Sourcing and Data, London
+44 20 7466 2954
Claire Wiseman
Professional Support Lawyer, Digital TMT, Sourcing and Data, London
+44 20 7466 2384
Kabir Hosein
Trainee Solicitor, London
+44 20 7466 3769

EDPB and EDPS respond to the European Commission’s proposed artificial intelligence regulation

The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) published a joint opinion on 18 June 2021 on the European Commission’s proposed artificial intelligence (AI) regulation. For further information on the European Commission’s proposal itself, please see our previous blog post here.

In the joint opinion, the EDPB and the EDPS appear to welcome the European Commission’s proposal to regulate the use of AI systems in the EU, viewing the regulation as necessary to protect and maintain the fundamental rights of EU individuals. However, the EDPB and EDPS stress that more work needs to be done to establish:

  • how the legal framework regulating AI systems will operate in a way that encourages innovation whilst also protecting the fundamental rights of individuals;
  • the intrusive forms of AI which should be prohibited; and
  • the interaction between the new AI regulation and existing legislation, including the GDPR.

We set out our key takeaways from the joint opinion below.

Key Takeaways

  • Clarity needed over relationship with existing EU data protection law: As autonomous decisions are often made in reliance on personal data and data processing activities, the EDPB and EDPS consider the current proposal deficient in that it fails to clearly define the interaction between the new AI regulation and existing EU data protection law. In their view, there should be an explicit requirement for organisations seeking to develop or implement AI systems to ensure compliance with the GDPR.

From a practical perspective, this would mean that organisations seeking to develop or implement AI systems would need to incorporate privacy by design into every stage of their development and ensure that all data protection requirements can be met including in relation to transparency and the implementation of adequate technical and organisational measures. For example, data subjects would need to be informed if their personal data is to be used for AI training/predictions and in relation to the rights available to them under the data protection laws.

  • Prohibition of intrusive forms of AI: The EDPB and EDPS consider that certain uses of AI are contrary to the EU’s fundamental values, such as those which lead to discriminatory practices or which negatively impact individuals’ ability to exercise freedom of expression and movement. Specific examples given include the use of AI for:
    • interfering with the emotions of a natural person (except for well-specified use-cases such as for health or research purposes);
    • automated recognition of human features in publicly accessible spaces (which includes faces, fingerprints, DNA, voices, keystrokes and other biometric or behavioural signals) for large-scale remote identification in online spaces;
    • categorising individuals from their biometrics into clusters according to ethnicity, gender or political/sexual orientation; and
    • any type of social scoring.

The EDPB and EDPS deem such uses of AI highly undesirable and consider that they ought to be prohibited entirely, rather than merely classified as “high-risk” as per the proposal.

  • Risk assessment: The proposal sets out four categories of AI systems based on the risk that they present to the fundamental rights and safety of individuals (for more detail, please refer to our previous blog on the proposal here). The proposal also explains that organisations will only be subject to regulatory obligations and restrictions when AI systems are likely to pose a high level of risk to the fundamental rights and safety of individuals, something which will need to be assessed on a case-by-case basis.

The EDPB and EDPS note that the proposal’s emphasis on the potential impact of AI systems on individuals fails to address risks which apply to groups of individuals and society as a whole, such as group discrimination and the expression of political opinions in public spaces. Further, the EDPB and EDPS suggest that the concept of “risk to fundamental rights” to individuals should be aligned with the equivalent concept under the GDPR.

The EDPB and EDPS question the proposal’s commitment to maintaining an exhaustive list of high-risk AI systems, as this would need to be regularly updated to keep pace with evolving technology and uses of AI systems.

  • Clarification of the EDPS’s role and the relevance of Data Protection Authorities (DPAs): Whilst the EDPB and EDPS welcome the designation of the EDPS as the competent authority and the market surveillance authority, they urge legislators to clarify the EDPS’ future role and responsibilities under the proposal as failing to do so could potentially threaten the EDPS’ ability to fulfil its obligations as data protection supervisor.

Furthermore, the EDPB and EDPS highlight that DPAs already enforce data protection legislation such as the GDPR and benefit from a pre-existing understanding of AI technologies and data. They therefore suggest that DPAs should be designated as additional national supervisory authorities under the proposal.

  • Lack of international law enforcement cooperation: The EDPB and EDPS welcome the extension of the proposal’s scope to cover the use of AI systems by EU institutions, bodies and agencies and ensure a coherent approach across the EU. However, they express concerns in relation to the exclusion of international law enforcement cooperation from the scope of the proposed regulation as this exclusion creates a significant risk of circumvention, for example by third countries or international organisations operating high-risk applications relied on by public authorities in the EU.
  • More autonomy required for the European Artificial Intelligence Board (EAIB): The EDPB and EDPS recognise the need for the proposed legal framework to be applied in a consistent and harmonised manner across the EU as overseen by the EAIB. However, they suggest that the EAIB will require more autonomy than is currently afforded to it in the proposal if it is to fulfil this role with a view to achieving such consistency across the EU. Furthermore, they urge legislators to introduce cooperation mechanisms between national supervisory authorities and provide a single point of contact for individuals and organisations wishing to raise concerns about the legislation.
  • Certification mechanism: The proposal suggests that organisations implementing or developing high-risk AI systems will need to obtain a certification to demonstrate their alignment with the EU AI framework. In the EDPB and EDPS’s view, it is unclear how this will work in practice and whether the process will align with the certification mechanism under Articles 42 and 43 of the GDPR.

The Way Forward

The EDPB and EDPS appear to view the proposal as a step in the right direction in relation to building a legal framework around AI. However, there is a lot of work to be done in the years before the proposal is passed into law to address all of the issues set out above.

More recently, the Commission’s adoption feedback period closed on 6 August 2021 with more than 300 comments from stakeholders, further indicating that the European Commission will need to revisit certain aspects of the proposal in addition to those set out above.

In the meantime, businesses seeking to make use of AI should keep an eye on any updates to the proposal. Those seeking to use AI to process personal data should also take note of the ICO’s newly released AI and Data Protection Risk Toolkit to get an idea of the ICO’s views on the associated risks and maintaining compliance with data protection laws when implementing AI systems.

Duc Tran
Of Counsel, Digital TMT, Sourcing and Data, London
+44 20 7466 2954
Stefanie Lo
Trainee Solicitor, London
+44 20 7466 3560

European Commission publishes final Article 28 clauses

Simultaneously with the European Commission publishing its final standard contractual clauses for the international transfer of personal data (see our blog post here for further information) (the “New SCCs”), it has now published a final set of standalone Article 28 clauses for use between controllers and processors in the EU, also termed ‘standard contractual clauses’ (the “Final Article 28 Clauses”) (available here).