A year in the life of GDPR: Statistics and stories from the ICO

The introduction of the GDPR on 25 May 2018 prompted a widespread re-think about data protection and privacy rights. From individuals becoming more aware of their rights to corporate institutions working hard to ensure compliance and avoid the hefty new penalties the regulation can impose, data protection has undoubtedly been at the forefront of people’s minds since May 2018. At the heart of these changes, from the UK’s perspective, is the Information Commissioner’s Office (the “ICO”), the supervisory authority responsible for overseeing data protection concerns and processing in the UK. A year after the regulation came into effect, we have taken a look at its impact on the ICO and its activities, comparing the years before and after it was introduced.

Marriott/Starwood Data Breach: ICO intention to issue another big £99 million ‘mega fine’

  • Just one day after issuing its notice of intent to fine British Airways £183.39 million, the ICO has issued a further notice of intent to fine Marriott International £99.2 million for its own data breach;
  • The systems of the Starwood hotel group were originally compromised in 2014, prior to the acquisition of Starwood by Marriott in 2016 – the breach itself was not discovered until 2018 following completion of the corporate acquisition;
  • The fine shines a spotlight on the importance of data and cyber due diligence in corporate transactions;
  • No details have yet been published by the ICO regarding the specific GDPR infringements involved;
  • Marriott now has the chance to respond to the notice of intent, after which a final decision will be made by the ICO.

Schrems II heard in Europe: potentially huge impact on global data transfers

  • The Court of Justice of the European Union (“CJEU”) has heard oral submissions in the latest case questioning the legal validity of international data transfer mechanisms under the GDPR, such as the Standard Contractual Clauses and the EU-US Privacy Shield;
  • The Irish Data Protection Commissioner (“DPC”) is seeking a ruling that the so-called Standard Contractual Clauses, which are used to legitimise the transfer of personal data from Europe all around the world, are invalid because they do not provide adequate protection for individuals’ data;
  • The CJEU heard yesterday from the DPC, Facebook, the Electronic Privacy Information Center, DigitalEurope, the Business Software Alliance, the European Commission, the European Data Protection Board, the US government, several EU Member States and representatives of the original complainant, Mr Schrems;
  • The Advocate General will give his non-binding opinion on the case on 12 December this year, with a full decision expected from the CJEU by early 2020;
  • If the Standard Contractual Clauses are declared invalid, this will have a huge impact on global trade, effectively putting the brakes on the international transfer of data.

British Airways Data Breach: ICO announces potential £183 million ‘mega fine’

  • The ICO has published a notice of its intent to fine British Airways £183.39 million for its 2018 data breach where the personal data of 500,000 customers was stolen by hackers;
  • This is the first ‘mega fine’ issued by a European data regulator since the implementation of the GDPR;
  • The ICO acted as lead supervisory authority and has confirmed that it has been liaising with other EU privacy regulators;
  • No details have yet been published by the ICO regarding the specific GDPR infringements involved;
  • British Airways now has the chance to respond to the notice of intent, after which a final decision will be made by the ICO.

Cookie consent walls crumble: ICO publishes guidance on cookie consent

Following its recent admission that its own cookie consent mechanism was non-compliant, the UK privacy regulator (the ICO) updated its cookie notice last week (see our previous blog post here) and has now published guidance on cookies and similar technologies. Key messages are:

  • No implied consent is allowed for non-essential cookies, including consent obtained via sliders or toggles that are defaulted to ‘on’ (see the illustrative sketch following this list)
  • Analytics cookies are not ‘strictly necessary’ and so require consent
  • The position regarding the use of ‘cookie walls’ to restrict website access remains unclear, although it is likely to be inappropriate in many circumstances
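
By way of illustration only, the sketch below shows, in TypeScript, one way a consent model consistent with these key messages could be structured: strictly necessary cookies need no consent, while analytics and other non-essential categories default to ‘off’ and are only enabled by an explicit, affirmative user action. The names and structure are hypothetical and are not drawn from the ICO guidance itself.

```typescript
// Hypothetical sketch of a GDPR/ePrivacy-style cookie consent model.
// Not an implementation of the ICO guidance; names and structure are illustrative.

type NonEssentialCategory = "analytics" | "advertising";

interface ConsentState {
  strictlyNecessary: true;   // exempt from consent, always on
  analytics: boolean;        // not "strictly necessary", so requires opt-in
  advertising: boolean;      // requires opt-in
}

// Default state: no pre-ticked boxes or toggles defaulted to "on".
const defaultConsent: ConsentState = {
  strictlyNecessary: true,
  analytics: false,
  advertising: false,
};

// Called only in response to an explicit user action (e.g. clicking "Accept analytics").
function recordConsent(
  current: ConsentState,
  category: NonEssentialCategory,
  granted: boolean
): ConsentState {
  const next: ConsentState = { ...current };
  next[category] = granted;
  return next;
}

// A cookie in a given category may be set only if it is strictly necessary
// or the user has affirmatively consented to that category.
function maySetCookie(consent: ConsentState, category: keyof ConsentState): boolean {
  return category === "strictlyNecessary" || consent[category] === true;
}

// Example usage: analytics cookies stay blocked until the user opts in.
let consent = defaultConsent;
console.log(maySetCookie(consent, "analytics")); // false
consent = recordConsent(consent, "analytics", true);
console.log(maySetCookie(consent, "analytics")); // true
```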

Cookie Compliance: How can companies get it right when the regulator does not?

  • The UK privacy regulator has admitted that its own cookie consent process does not comply with the current GDPR and ePrivacy rules.
  • According to the regulator, a new process will be implemented during the week beginning 24th June 2019, which could give organisations a valuable insight into how to navigate the complex interaction between the GDPR and ePrivacy rules in a compliant manner.
  • The regulator has also promised detailed guidance on cookies “soon”.

Probing the rules on automated decision making: article published in Privacy and Data Protection Journal

The Privacy and Data Protection Journal has published an article by Duc Tran, Senior Associate from our Digital TMT, Sourcing & Data Team, exploring automated decision making under the General Data Protection Regulation (GDPR).

In recent times, forward-thinking organisations have sought to automate their operations and decision-making processes, and to optimise their effectiveness and efficiency, using new and disruptive technologies such as AI and machine learning. However, whilst the efficiency gains and other benefits may be considerable, it is important for these organisations to be aware of the legal implications of using such technology.

One of these considerations is the restriction on the use of machines and automated systems to make decisions about individuals.

European court to rule on validity of GDPR Standard Contractual Clauses

  • The long-running challenge to the so-called EU Standard Contractual Clauses and the EU-US Privacy Shield, both used to lawfully transfer personal data outside of Europe, is now going to be heard by the European Court of Justice (“ECJ”) after an attempt to block the referral was rejected by the Irish Supreme Court.
  • The ECJ will now assess and opine on whether these methods of international data transfer satisfy the requirements of the GDPR, with the potential for either or both mechanisms to be struck down like the US Safe Harbor was in 2015.
  • If the court finds either method to be invalid, it would have a major impact on the cross-border transfer of personal data, leaving companies with significant GDPR compliance issues and extremely limited options for lawfully transferring data across national boundaries.

Happy GDPR-versary! Herbert Smith Freehills reflections on a year of GDPR regulation

The GDPR came into effect almost a year ago on 25 May 2018. As the most significant reform of data protection law in Europe for over 20 years, the legislation raised expectations of a cultural shift in attitudes to data privacy. A year on from the fanfare of implementation, this bulletin looks at key aspects of what we have seen and learnt, and what we can expect for the future.

Enforcement

Although we are still waiting for a ‘GDPR mega fine’, we have seen a €50 million fine levied by the CNIL in France, and there have also been some interesting enforcement decisions coming out of Europe in the first 12 months. There have been rumours of a fine matrix being developed by the regulators to help assess the level of fine to be imposed but, for now at least, it remains unclear how fines are calculated and when a ‘mega fine’ may be appropriate.

Interesting enforcement action to note so far includes:

UK: ICO finds HMRC to be in “significant” breach of data protection legislation but does not impose a fine

In May 2019, the ICO found HMRC in the UK to be in “significant” breach of the GDPR by processing special category biometric data (voice recognition data) without a lawful basis. However, instead of imposing a monetary penalty, the ICO issued an enforcement notice requiring HMRC to delete the relevant data by early June 2019. For more information on this enforcement action, see our blog post here.

Belgium: Court of Appeal asks CJEU for GDPR guidance on the ‘one stop shop’

In May 2019, the Belgian Court of Appeal asked the European Court of Justice for help interpreting the application of the GDPR’s ‘one stop shop’ and whether the designation by companies of a lead supervisory authority in Europe precludes any other European supervisory authority from taking enforcement action against that company. The results of the case will either open or close the doors for regulators across Europe to cast aside the one stop shop when looking to enforce GDPR compliance in their home jurisdiction. For more information on this enforcement action, see our blog post here.

Poland: When is it a disproportionate effort to provide a privacy notice?

In April 2019, the Personal Data Protection Office in Poland issued a €220,000 fine to a digital marketing company for breaching its obligations under Article 14 of the GDPR (i.e. to provide a privacy notice to individuals). The decision has some important practical implications for organisations, including that: (i) the collection of publicly-available information from the internet does not relieve an organisation of its obligations under the GDPR; (ii) a significant cost (in this case €8 million) involved with providing privacy notices to individuals is not sufficient to be able to rely on the ‘disproportionate effort’ exemption under Article 14; and (iii) the GDPR is not prescriptive about how individuals must be provided with privacy information, but the ‘passive’ posting of a notice on a website is unlikely to be sufficient where the individuals are unaware of the collection of their data. For more information on this enforcement action, see our blog post here.

Germany: German competition regulator takes enforcement action against Facebook for data issues

In a slight move away from privacy regulation, the German competition authority, the Federal Cartel Office (FCO), announced the results of its investigation into Facebook in February 2019. The decision highlights the ever-increasing tension between competition and privacy regulation. The FCO found that Facebook held a dominant position in the German market for social networks and had abused that position through its data collection policy. The FCO did not impose a fine on Facebook, but instead required that, in the future, Facebook only use data from non-Facebook sources where it has users’ voluntary consent, and that the withholding of such consent cannot be used to deny access to Facebook. For more information on this enforcement action, please see our blog post here.

UK: First extra-territorial enforcement action commenced by the ICO

In October 2018, the UK data protection regulator, the ICO, issued its first enforcement notice under the GDPR. The notice was particularly noteworthy because it was issued against a company located in Canada, which does not have any presence within the EU. Although the breaches remain alleged at this stage, the notice was the first issued by the ICO in reliance on the extra-territorial provisions of the GDPR under Article 3. For more information on this enforcement action, please see our blog post here.

Guidance

For many companies, a frustrating aspect of GDPR compliance over the last year has been the uncertainty. One year on from GDPR implementation, many questions remain unanswered. But we have now started to see signs that fundamental questions may eventually be answered, with new regulatory guidance starting to drip-feed through.

Interesting regulatory guidance published over the last year includes:

A global regulation? EDPB guidelines on GDPR’s extra-territoriality provisions

The expansive nature of the GDPR’s extra-territoriality provisions has resulted in many organisations outside of Europe questioning whether or not they are subject to the GDPR regime. The market has eagerly awaited any guidance in respect of how Article 3 of the GDPR should be interpreted, and so the draft EDPB guidance published late last year was welcomed by the data community and the market as a whole. However, whilst the draft guidance answered certain questions about the application of the GDPR, it also left a number of gaps, and so we are still awaiting the final version of the guidance in the hope that some of those gaps will be closed. For more information on this guidance, see our blog post here.

EDPB guidance on when processing is “necessary for the performance of a contract”

In April 2019, the EDPB published guidance on the ability of online service providers to rely on the fact that processing is necessary for the performance of a contract in order to legitimise their processing of personal data. Although aimed specifically at online services, the guidance will nonetheless be useful for all controller organisations looking to rely on this processing condition. The guidance adopts a fairly narrow approach to interpretation with an objective assessment of “necessity” being required as opposed to relying on what is permitted under or required by the terms of a contract. For more information on this guidance, please see our blog post here.

EDPB opinion on the interplay between GDPR and ePrivacy

With companies having completed their GDPR compliance programmes, thoughts are now turning to the next major piece of European regulation in the data privacy sphere, the proposed ePrivacy Regulation, and how ePrivacy interacts with the GDPR, particularly with respect to cookie consent and email marketing. In March 2019, the EDPB published an opinion on the interplay between GDPR and ePrivacy which, whilst interesting, also confirmed that the whole ePrivacy regime is currently being renegotiated at a European level and the new ePrivacy Regulation could further change the position outlined in the opinion. As such, the opinion itself appears to be of minimal use for companies. For more information on this guidance, please see our blog post here.

What’s still to come?

One year on from GDPR implementation, we have seen limited enforcement action and even less regulatory guidance, meaning that companies are still having to find their way through compliance without clear direction. Much remains unknown and unanswered, but what can we expect (or hope for) from the next 12 months?

Brexit

The Brexit issue rumbles on with little clarity or certainty. We know that an adequacy decision for the UK is extremely unlikely in the short term, but whether or not an interim transition deal is achievable (including with respect to data protection and data transfers) remains unknown at this stage.

International transfers

Although the results of the EU-US Privacy Shield annual review in 2018 seem to confirm that the Privacy Shield remains intact for the short term, there remain significant uncertainties around the future of other compliant international data transfer mechanisms. In particular, the validity of the so-called Standard Contractual Clauses (“SCCs”) continues to be challenged through the courts which could result in the SCCs being struck down by the CJEU in the same way that the US Safe Harbor was in 2015.

Continuing on the theme of international transfers, we are also still awaiting the publication of updated versions of the SCCs. The current versions still refer to the 1995 Directive instead of the GDPR but cannot be amended for sense without the risk of invalidating them. There are rumours that the EU Commission has started to consider an update, including potentially updating the controller to processor SCCs to include Article 28 obligations. However, we have yet to see anything concrete coming out of Europe.

ePrivacy Regulation

As mentioned above, the ePrivacy regime is currently being renegotiated, with a new ePrivacy Regulation (replacing the current ePrivacy Directive) originally intended to be ready in time for GDPR implementation. However, the failure of the European institutions to agree on a number of issues has resulted in multiple delays, and it now does not look likely that a draft will be agreed before the end of 2019 or early 2020, meaning that the position on cookie consent and email marketing is likely to remain uncertain for a considerable period of time.

Enforcement

As noted above, we are still awaiting a GDPR ‘mega fine’, and we have not yet seen enough enforcement action to draw any meaningful insights into the regulators’ approach. There are rumours of significant enforcement actions in the pipeline from the ICO and the Irish Data Protection Commissioner, and we also know that there have been a number of material personal data breaches since implementation of the GDPR, but we will have to wait and see what happens in year two of the GDPR.

Individual rights and data disputes

Although the GDPR provided for enhanced data subject rights, we have also started to see individuals use it innovatively as a mechanism to assert other rights, including human rights and the right to privacy. We have seen Prince Harry assert that a news company’s photograph of him at home was in breach of the GDPR, and a claim against the police over their use of facial recognition technology has recently commenced in Wales. Going forward, we are therefore likely to see the GDPR used as a tool in disputes. For more information about this, please see our blog post here.

Data breach compensation

Perhaps the elephant in the room is data breach compensation. In April 2019, the Supreme Court granted Morrisons permission to appeal against the Court of Appeal ruling that it was vicariously liable for its employee’s misuse of data, in what was the first successful UK class action for a data breach. Whilst the date for the Supreme Court’s hearing is still to be confirmed, the appeal is likely to take place during the course of 2020. For more information on the case, please see our blog post here.

New emerging technologies

The age-old issue of technological innovation outpacing legislation has reared its head only one year into the GDPR’s lifecycle. Organisations are having to apply the text of the GDPR to scenarios, including blockchain technology, connected and autonomous vehicles and AI techniques, that simply were not envisaged when the legislation was drafted. In this rapidly evolving technological landscape, the need for regularly updated official guidance on these types of scenarios has never been greater, but this will be an extremely challenging demand for the regulators to satisfy.

To keep up to date with the latest legal developments as they happen, please subscribe to our data blog here.

Contacts

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Claire Wiseman
Senior Associate and Professional Support Lawyer, Digital TMT & Data, London
+44 20 7466 2267
Lauren Hudson
Associate, Digital TMT & Data, London
+44 20 7466 2483

HKMA takes first step towards regulating the use of big data analytics and artificial intelligence in FinTech

The Hong Kong Monetary Authority (HKMA) has issued a circular encouraging authorised institutions to adopt the “Ethical Accountability Framework” (EAF) for the collection and use of personal data, issued by the Office of the Privacy Commissioner for Personal Data (PCPD). The PCPD published a report on the EAF in October 2018 (Report), which explored the ethical and fair processing of data through (i) fostering a culture of ethical data governance and (ii) addressing the personal data privacy risks brought by emerging information and communication technologies such as big data analytics, artificial intelligence and machine learning.

The EAF is expressly stated to be non-binding guidance, intended as a first step towards a privacy regime better equipped to address modern challenges. However, the HKMA’s circular arguably elevates the legal status of the EAF for authorised institutions. The HKMA is likely to incorporate the EAF into its broader supervision and inspection of authorised institutions and, in particular, the EAF will undoubtedly have an influence going forward in construing the principles-based elements of the Supervisory Policy Manual as it applies to FinTech.

Tension between the value of data-processing technology and public trust

Big data has no inherent value in its raw form. Its value lies in the ability to convert that data into useful information for organisations, which can then generate knowledge or insight relating to clients or the market as a whole through data analytics or artificial intelligence. Ultimately, this insight results in competitive advantage. However, a tension exists between (i) developing data-processing technology to gain a competitive advantage; and (ii) addressing public distrust arising from the data-intensive nature of such technology.

As the Report highlights, the existing regulatory regime in Hong Kong does not adequately address the privacy and data protection risks that arise from advanced data processing. Big data analytics and artificial intelligence in particular pose challenges to the existing notification- and consent-based privacy legal framework. These challenges are not limited to the legal framework in Hong Kong; privacy and data protection legislation at an international level is also ill-equipped to anticipate advances in data-intensive technology.

Data stewardship accountability

The PCPD sees a need to provide guidance on how institutions can act ethically in relation to advanced data processing in order to foster public trust. It reminds institutions to be effective data stewards, not merely data custodians. Data stewards take into account the interests of all parties and consider whether the outcomes of their advanced data processing are not just legal, but also fair and just.

The PCPD also encourages data stewardship accountability, which calls for institutions to define and translate stewardship values into organisational policies, using an “ethics by design” approach. This approach requires institutions to have data protection in mind at every step and to apply the principles of privacy by default and privacy by design. Privacy by default means that once a product or service has been released to the public, the strictest privacy settings should apply by default. Privacy by design, on the other hand, requires organisations to ensure that privacy is built into a system throughout its entire life cycle. Ultimately, data stewardship should be driven by policies, culture and conduct at an organisational level, rather than by technological controls.

Both the privacy by design and privacy by default principles are mandatory requirements under the EU General Data Protection Regulation (GDPR). The trend in legal development is for Asia-based privacy regulators, whether by enacting new laws (e.g. India) or by issuing non-mandatory best practice guidance, to encourage data users to meet the higher standards under the GDPR.

Data stewardship

The PCPD encourages institutions to adopt the three “Hong Kong Values”, whilst providing the option to modify each value to better reflect their respective cultures. The three Hong Kong Values listed below are in line with the various Data Protection Principles of the Personal Data (Privacy) Ordinance (Cap. 486):

(i)   The “Respectful” value requires institutions to:

  • be accountable for conducting advanced data processing activities;
  • take into consideration all parties that have interests in the data;
  • consider the expectations of individuals that are impacted by the data use;
  • make decisions in a reasonable and transparent manner; and
  • allow individuals to make inquiries, obtain explanations and appeal decisions in relation to the advanced data processing activities.

(ii)   The “Beneficial” value specifies that:

  • where advanced data-processing activities have a potential impact on individuals, organisations should define the benefits, identify and assess the level of potential risks;
  • where the activities do not have a potential impact on individuals, organisations should identify the risks and assess the materiality of such risks;
  • once the organisation has identified all potential risks, it should implement appropriate ways to mitigate such risks.

(iii)   The “Fair” value specifies that organisations should:

  • avoid actions that are inappropriate, offensive or might constitute unfair treatment or illegal discrimination;
  • regularly review and evaluate algorithms and models used in decision-making for any bias and illegal discrimination;
  • minimise any data-intensive activities; and
  • ensure that the advanced data-processing activities are consistent with the ethical values of the organisation.

The PCPD also encourages institutions to conduct Ethical Data Impact Assessments (EDIAs), allowing them to consider the rights and interests of all parties impacted by the collection, use and disclosure of data. A process oversight model should be in place to ensure the effectiveness of the EDIA. While this oversight could be performed by internal audit, it could also be accomplished by way of an assessment conducted externally.

International Direction of Travel

The approach outlined above is not unique to Hong Kong. In fact, at the time the EAF was announced by the PCPD in October 2018, the 40th International Conference of Data Protection and Privacy Commissioners released a Declaration on Ethics and Data Protection in Artificial Intelligence (Declaration), which proposes a high-level framework for the regulation of artificial intelligence, privacy and data protection. The Declaration endorsed six guiding principles as “core values” to preserve human rights in the development of artificial intelligence and called for common governance principles on artificial intelligence to be established at an international level.

It is clear that there is a global trend toward the ethical and fair processing of data in the application of advanced data analytics. For instance, in November 2018 the Monetary Authority of Singapore announced similar ethical principles for the use of artificial intelligence and data analytics in the financial sector. Another example is the EU GDPR’s specific safeguards for the automated processing of personal data that has, or is likely to have, a significant impact on the data subject, to which the data subject has a right to object. Specifically, where such processing uses new technologies and, taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, a data protection impact assessment of the envisaged processing operations must be carried out before the processing begins.

Although this may appear to be a relatively minor development in Hong Kong, we see this as a step in a broader movement toward the regulation of AI and a sea change in the approach to data protection and privacy. The HKMA circular and the EAF are in line with the global data protection law developments, which are largely being led by the EU.

Hannah Cassidy
Partner, Head of Data Protection and Privacy, Hong Kong
+852 2101 4133
Jeremy Birch
Partner, Digital TMT, Sourcing and Data, Hong Kong
+852 2101 4195
Sheena Loi
Senior Consultant, Digital TMT, Sourcing and Data, Hong Kong
+852 2101 4146
Peggy Chow
Senior Associate, Digital TMT, Sourcing and Data, Singapore
+65 6868 8054