Whilst several state-level data privacy rights continue to come into effect in 2024, and given that this is an election year in the US, it seemed unlikely that federal privacy legislation would be a priority for Congress when we put together our 2024 Data Predictions at the start of the year. That said, in a last-minute and surprise entry to this edition of our Data Wrap, on 7 April 2024 Cathy McMorris Rodgers, Chair of the US House Committee on Energy and Commerce, and Senator Maria Cantwell, Chair of the Senate Committee on Commerce, Science and Transportation, released a discussion draft of a new piece of legislation, the “American Privacy Rights Act” (“APRA”), a bipartisan, bicameral federal privacy bill (press release available here). This follows talks around its predecessor, the American Data Privacy and Protection Act, having stalled since early 2023.

The press release makes it clear that the APRA is the “best opportunity we’ve had in decades to establish a national data privacy and security standard that gives people the right to control their personal information…It strikes a meaningful balance on issues that are critical to moving comprehensive data privacy legislation through Congress.” The comprehensive draft legislation is stated to “set clear national data privacy rights and protections for Americans, eliminates the existing patchwork of state comprehensive data privacy laws, and establishes robust enforcement mechanisms to hold violators accountable, including a private right of action for individuals.” It will be interesting to track the evolution of this new piece of legislation as the US tries to play “catch up” with the comprehensive national privacy frameworks in other international jurisdictions. We are in the process of reviewing the draft APRA and will publish a deeper dive into the legislation in due course.

On 18 March 2024, the Information Commissioner’s Office (the “ICO“) issued its Data Protection Fining Guidance (the “Guidance“) on how it will calculate fines under UK data protection legislation. The Guidance replaces the sections on penalty notices in the ICO’s Regulatory Action Policy, which was published shortly after the GDPR entered into force in 2018. It sets out the circumstances in which the ICO would consider it appropriate to issue a penalty notice and explains how the amount of a fine would be determined.

The Guidance is divided into three main sections: (i) statutory background; (ii) circumstances in which the ICO would consider fines appropriate; and (iii) a step-by-step guide to how the ICO calculates the appropriate amount of a fine. The subsections underneath answer other common questions, such as how the concept of an undertaking is defined, what happens if several infringements are in play, and which aggravating or mitigating factors can increase or reduce the amount of the fine in practice. As the sanctions regime has been in place for several years, many of these matters have already been addressed by the Regulatory Action Policy or by the European Data Protection Board (the “EDPB“) guidelines on the calculation of administrative fines. Nevertheless, transparency and greater certainty on the level and triggers of enforcement from the supervisory authority are always welcome.

The Guidance offers clarity on some areas of previous debate. For example, it confirms that not only is the concept of an undertaking relevant for determining the applicable maximum fine, but the ICO may also hold a parent company jointly and severally liable for the payment of a fine levied against a subsidiary. For further information on the key takeaways, please refer to our blog here.
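The undertaking concept matters for the arithmetic because the UK GDPR's higher-tier statutory maximum is the greater of £17.5 million or 4% of the undertaking's total worldwide annual turnover, assessed at group level rather than by reference to the infringing subsidiary alone. A minimal sketch of that calculation (the turnover figures below are hypothetical, for illustration only):

```python
def uk_gdpr_higher_tier_maximum(undertaking_turnover_gbp: float) -> float:
    """Statutory maximum fine under the UK GDPR's higher tier:
    the greater of GBP 17.5m or 4% of the undertaking's total
    worldwide annual turnover (group-level, not just the
    infringing subsidiary's own turnover)."""
    return max(17_500_000, 0.04 * undertaking_turnover_gbp)


# Hypothetical figures: a subsidiary with GBP 50m turnover whose
# parent group (the "undertaking") turns over GBP 2bn.
print(uk_gdpr_higher_tier_maximum(50_000_000))     # fixed GBP 17.5m floor applies
print(uk_gdpr_higher_tier_maximum(2_000_000_000))  # 4% of turnover = GBP 80m
```

The gap between the two results illustrates why defining the undertaking widely can dramatically raise the ceiling available to the regulator.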

Close regulatory scrutiny of adtech looks unlikely to wane in 2024. In our February Data Wrap we discussed the joint initiative of the Dutch, Norwegian and German Data Protection Authorities requesting a binding EDPB opinion on the so-called ‘pay or okay’ model. In addition, the UK ICO launched a “call for views” on its regulatory approach to the model. The concerns of the authorities centre mainly on the requirement for freely given consent under the EU/UK GDPR, under which consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.

March saw the discussion take a new turn, as the EU Commission opened its first non-compliance investigation under the Digital Markets Act (the “DMA“), only two weeks after the implementation deadline for the DMA had passed. Of particular note, the Commission refers to organisations such as Meta potentially breaching the DMA rules on data combination. The DMA requires gatekeepers to obtain a user’s consent to use their personal data across different services, and that consent must be “free”. The Commission is concerned that consent is not free if the user is confronted with a binary choice between paying and consenting. According to the Commission, non-consenting users should be offered a less personalised alternative rather than only a paid version.

The Commission investigation marks the third inquiry into the pay or okay model: in addition to the data protection authorities’ initiative, the Commission was already looking into the practice under the new Digital Services Act (the “DSA“). The debate, however, crystallises around broadly the same themes: (i) can the pay or okay model give rise to freely given consent? (ii) is privacy becoming a premium feature, or should it remain a basic right of the individual? and (iii) where does the balance lie between advertising to make a profit and maximising profit by targeting individuals?

It will be interesting to see where each of these investigations ends up, particularly given the need for a consistent approach across the different EU legal frameworks and Member States. We are still expecting an opinion from the EDPB on the matter as well – whilst no opinion has been published to date, on 19 March IAB Europe sent an open letter to the EDPB, which included a rebuttal of the suggestion that the pay or okay model amounts to “paying” for data protection rights. It is also possible that the final word will come from the CJEU in years to come!

The Court of Justice of the European Union (the “CJEU“) handed down its judgment in the long-running IAB Europe case on 7 March. The case arose from a fine imposed on IAB Europe by the Belgian Data Protection Authority (the “Belgian DPA“) in 2022, following an investigation into IAB Europe's Transparency and Consent Framework (the “TCF“), which has been widely used throughout Europe, in particular by the advertising industry. The framework manages users’ preferences for online personalised advertising and plays an important role in what is known as real-time bidding, the ecosystem that enables the sending of personalised adverts to users. The Belgian DPA found that the TCF was in breach of several GDPR provisions. The CJEU, however, did not accept the original Belgian DPA decision as such.

The Belgian DPA had sought much needed clarity on several data protection concepts regarding the TCF. In particular, the CJEU found that:

  • TC Strings (the strings in which an individual's consent or preferences are stored) can constitute personal data if they can be linked to individuals (the judgment notes that where the information contained in a TC String is associated with an identifier, such as the IP address of the user's device, it may enable the creation of a profile of that user and effectively identify the person specifically concerned by the information); and
  • IAB Europe can be considered a joint controller (together with TCF participants) in relation to the creation and use of TC Strings, but not necessarily in relation to subsequent data processing, such as digital advertising or content personalisation, as IAB Europe cannot influence such processing.

The decision requires changes to the TCF, and IAB Europe has already announced that it will soon provide information on next steps. Entities relying on the TCF in their marketing are likely to see further guidance from IAB Europe soon, both on the information to be provided to data subjects and on joint controller agreements. However, it is worth noting that the case must still be decided by the Belgian Court of Appeal in light of the CJEU's judgment, with a decision expected by the end of 2024 or early 2025. The earlier suspension of the execution of the Belgian DPA’s decision will continue to apply in the meantime. We are currently reviewing the judgment in further detail and will publish a further piece on it in due course.

On 22 March 2024, the Cyberspace Administration of China (“CAC“) officially enacted the long-awaited Provisions on Facilitating and Regulating Cross-border Data Flows (the “Provisions“), which became effective on the same date. The Provisions relax the data export requirements by introducing modifications and exemptions to the three mechanisms for cross-border data transfer, namely: (i) CAC security assessment; (ii) China’s standard contract for outbound cross-border transfer of personal information; and (iii) personal information protection certification (collectively the “Cross-border Data Transfer Mechanisms“). For the key takeaways please refer to our blog post here.

A draft of the Provisions was released on 28 September 2023 for public consultation (see our previous blog post, China relaxes measures on cross-border data transfers from China | Data notes (hsfnotes.com), for more information). For further reflections on Beijing’s 2023 data outbound practices, refer to Justina Zhang’s insight here.

The EU’s proposed Digital Identity Regulation (Procedure 2021/0136(COD)) was adopted by the European Parliament on 29 February 2024 (see press release here), and the EU Council of Ministers on 26 March 2024 (see press release here), clearing the path for the introduction of digital wallets for EU citizens. First proposed in June 2021, the Digital Identity Regulation was introduced as a response to shortcomings in existing electronic identification and authentication systems in the EU, such as difficulties with cross-border authentications and the lack of transparency over how personal data is processed. The proposal is also in line with the EU’s other digital initiatives such as the Digital Decade policy programme 2030.

The Regulation amends the scope of the existing eIDAS Regulation (Regulation (EU) 910/2014), which made a number of electronic trust services available to EU citizens, notably giving digital signatures the legal status of a wet-ink signature. The Regulation requires each Member State to make a digital identity wallet (“Wallet“) available to its citizens by 2026, allowing EU citizens to authenticate themselves and access public and private services, to store, share and e-sign documents, to make wallet-to-wallet transactions, and to verify who is behind a website through qualified website authentication certificates (“QWACs“). The Wallet will need to adhere to the following principles:

  • Control over data – citizens will have access to a data privacy dashboard in the Wallet where they can select what data is shared, report any data breaches, and request that their data is deleted.
  • Voluntary – the Wallet will be provided on an opt-in basis to avoid discrimination (e.g. due to digital literacy).
  • Free of charge – issuance, use and revocation of the Wallet, and associated services (e.g. e-signatures, validation mechanisms etc.) will be provided to citizens at no charge.
  • Cross-border functionality – Member States will be required to accept Wallets issued by other Member States.
  • Open-source – the software components of the Wallet are to be made publicly available, noting that the latest version is already online on GitHub.

The EU launched four large-scale pilot projects in May 2023 to test the Wallet, focusing on day-to-day use cases such as accessing government services (e.g. applying for passports or driving licences, filing taxes, accessing social security benefits), opening bank accounts, registering for SIM cards, signing contracts, presenting eID when travelling, and more. A prototype Wallet is also being developed in line with the requirements of the Regulation. Following its adoption, the Regulation will now be published in the Official Journal of the EU in the coming weeks and will enter into force 20 days after its publication.

On 4 March 2024, the Information Commissioner’s Office (“ICO“) published guidance on biometric recognition. With the biometric recognition systems industry expected to reach $68.61 billion globally by 2025, and 80% of organisations interested in deploying this type of technology, the ICO has focussed its attention on privacy concerns arising out of the collection and use of biometric data.

While the UK GDPR contains a definition of “biometric data”, it does not explicitly define or regulate the concept of “biometric recognition”. This has historically caused confusion around how data protection law applies to biometric data and the new ICO guidance serves to address this regulatory grey area.

Relying on the International Organization for Standardization's definition, the ICO guidance defines biometric recognition as the automated recognition of people based on their biological or behavioural characteristics, which aligns with the definition of special category biometric data in the UK GDPR.

The guidance clarifies that any use of biometric data in biometric recognition systems amounts to the processing of special category biometric data, to which the rules around the processing of special category data under the UK GDPR apply. The guidance outlines the legal implications and recommended best practice when using biometric recognition systems, including in relation to conducting data protection impact assessments, ensuring data accuracy, preventing bias, instituting safeguards, employing privacy-enhancing technologies and adopting a data protection by design approach.

It remains to be seen how the new guidance will affect the ICO’s approach to enforcement action around the use of biometric recognition systems. However, it is likely that we will see the ICO take a more proactive approach to regulating these systems going forward, especially in light of the enforcement order that the ICO issued against Serco Leisure and associated trusts shortly before publishing the new guidance, prohibiting them from using a biometric recognition system to monitor employee attendance.

The UK’s increased focus on biometric recognition systems aligns with what we are seeing in other countries, with data protection regulators in Israel, Hong Kong, and California as well as other domain regulators such as the Federal Trade Commission and the New York City Council ramping up their efforts to regulate the collection and use of biometric data. In addition, the final draft of the EU AI Act contains seven definitions relating to biometric data. The interplay between the new ICO guidance, the upcoming AI Act and local legislation and guidance will need to be observed by organisations seeking to deploy biometric recognition systems in the UK and overseas.

March also saw the European Parliament approve the EU Cyber Resilience Act (“CRA“), which sets out mandatory cybersecurity requirements for manufacturers and retailers of “products with digital elements” (“PDEs“) placed on the EU market, to safeguard consumers and businesses buying or using these products. It is the first EU-wide legislation of its kind. PDEs are broadly defined and could include end devices (such as laptops, smartphones and smart speakers), software (such as firmware and operating systems) and components, both hardware and software (such as computer processing units, video cards and software libraries). The CRA applies to all PDEs connected directly or indirectly to another device or network, subject to specified exclusions such as open-source software or products that are already caught by existing rules (for example, medical devices, aviation and cars).

Digital hardware and software products constitute one of the key routes through which malicious actors can conduct a successful cyberattack and, in a connected environment, a cybersecurity incident in one product can swiftly affect a whole organisation or supply chain, potentially across multiple jurisdictions. The existing EU and Member State regulatory landscape only partially addresses these cybersecurity risks, creating a fragmented legislative framework and potentially unnecessary and inconsistent requirements for organisations operating in multiple Member States. The CRA was announced in the 2020 EU Cybersecurity Strategy and complements other legislation in this area, for example the EU NIS2 framework.

In particular, the CRA looks to address: (i) the inadequate level of cybersecurity currently inherent in many products, and inadequate security updates to those products and software; and (ii) the current inability of consumers and businesses to determine which products are cybersecure, or to set them up in a way that ensures their cybersecurity is protected. The CRA also aims to: (i) harmonise the rules for bringing products or software with a digital component to market; (ii) provide a framework of cybersecurity requirements governing the planning, design, development and maintenance of such products, with obligations to be met at every stage of the value chain; and (iii) introduce a duty of care for the entire lifecycle of such products.

Among its requirements, the legislation provides that: PDE should have security updates installed automatically and separately from functionality updates; those products deemed to pose a higher cybersecurity risk should be examined more stringently by a notified body; and important and critical products will be put into different lists based on their criticality and the level of cybersecurity risk they pose (with stricter requirements applying to those categorised as having a higher cybersecurity risk). In terms of next steps, the CRA now needs to be formally adopted by the Council of the EU (expected to take place in April 2024 without amendment). The final version will then be published in the EU Official Journal and the CRA will come into force 20 days later. The majority of the provisions in the CRA will apply in full 36 months after the CRA comes into force (with vulnerability reporting obligations applying 21 months after the effective date).

The repercussions of non-compliance are significant, including administrative fines of up to the higher of €15 million or 2.5% of total worldwide annual turnover for non-compliance with the essential cybersecurity requirements set out in Annex I to the CRA.
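As with other EU turnover-based penalty regimes, the applicable cap is whichever of the fixed amount and the turnover percentage is higher. A quick illustration of that arithmetic (the turnover figures below are hypothetical):

```python
def cra_essential_requirements_maximum(worldwide_turnover_eur: float) -> float:
    """Maximum administrative fine for breach of the CRA's essential
    cybersecurity requirements: the higher of EUR 15m or 2.5% of
    total worldwide annual turnover."""
    return max(15_000_000, 0.025 * worldwide_turnover_eur)


# Below EUR 600m turnover the fixed EUR 15m cap dominates;
# above that crossover point, the 2.5% figure takes over.
print(cra_essential_requirements_maximum(400_000_000))    # EUR 15m
print(cra_essential_requirements_maximum(1_000_000_000))  # EUR 25m
```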

In parallel, earlier in the year, the European Commission adopted the “European Cybersecurity Scheme on Common Criteria” (“EUCC“) to harmonise cybersecurity certifications across the EU. Whilst voluntary, the scheme will be important in demonstrating compliance under the CRA (as well as other key pieces of legislation such as NIS2 and DORA). Organisations can apply for EUCC certification of their ICT products from 31 January 2025, and we are likely to see EUCC certification feature in supplier procurement processes going forward. At the UK level, the UK consumer connectable product security regime (under the Product Security and Telecommunications Infrastructure (Product Security) Act 2022) comes into effect on 29 April 2024.

Miriam Everett
Partner
+44 20 7466 2378

Claire Wiseman
Professional Support Lawyer
+44 20 7466 2267
Duc Tran
Of Counsel
+44 20 7466 2954

Saara Leino
Associate
+44 20 7466 3525

Sara Lee
Associate
+44 20 7466 2346

Ankit Kapoor
Graduate Solicitor (India)
+44 20 7466 3336