German Regulator Publishes Schrems II ‘Checklist’

The Baden-Württemberg data protection authority (“LfDI”) has issued guidance to controllers and processors following the Schrems II judgment. The guidance includes helpful, practical steps which entities can take with respect to their current and future international transfers. Whilst aimed primarily at organisations subject to the jurisdiction of the LfDI, the guidance may be helpful for organisations throughout Europe that are grappling with the impact of the Schrems II decision.

In summary, exporting entities which are supervised by the LfDI are expected to:

  1. review the instances in which they export personal data to third countries;
  2. contact their contractual partners or service providers to inform them of the consequences of the Schrems II case;
  3. check whether there is an adequacy decision for the relevant third country;
  4. research and consider the legal environment in the relevant third country;
  5. check whether the Standard Contractual Clauses (“SCCs”) approved by the European Commission can be used; and
  6. if so, verify that SCCs are in place and that there are additional transfer guarantees to supplement the SCCs.

In our view it is step 4 above that is likely to cause the most difficulties, and this is an area where further guidance is required. An obligation on exporters to undertake due diligence on the complete legal environment in a third country (some of which may not be completely transparent) goes beyond what most organisations undertake at the moment, and it is not clear how this will be achieved going forward.

Amendments to the Standard Contractual Clauses

The LfDI also suggests that exporting controllers amend or supplement the controller-processor Standard Contractual Clauses in the following ways:

  1. Clause 4(f): The LfDI recommends that exporting entities inform affected persons that their data is being transferred to a third country which does not have an adequate level of protection not only when transmitting special categories of data, but when transferring any personal data in these circumstances. This notification should occur before or as soon as possible after the transfer;
  2. Clause 5(d)(i): The data importer should inform not only the data exporter, but also the data subject(s) of all legally binding requests from an enforcement authority to pass on the relevant personal data. If such contact is otherwise prohibited by law, the data importer should contact the supervisory authority and clarify the procedure as soon as possible;
  3. Clause 5(d): Data exporters should contractually oblige the data importer to refrain from disclosing personal data to third country authorities until the competent court orders or requires them to disclose personal data; and
  4. Clause 7(1): Exporting and importing entities should only include Clause 7(1)(b) (which allows the data importer to refer any dispute to the courts of the Member State in which the data exporter is established in the event that a data subject asserts rights as a third party beneficiary and/or claims for damages against the data importer based on the contractual clauses) and not include Clause 7(1)(a) which allows a data importer to refer the dispute to an “independent person”.

Although it is clear that ‘amendments’ to the Standard Contractual Clauses are not permitted, it has long been recognised that the clauses may be ‘supplemented’ with additional provisions provided that the effect of those provisions is not to amend the substantive content of the clauses themselves. As such, the suggested ‘amendments’ above (with the possible exception of the rejection of clause 7(1)(a) of the Standard Contractual Clauses) should be lawfully possible. However, at first glance, it appears that there may be logistical challenges with some of the suggestions. For example, is it practical or even desirable for the data processor/data importer to have an obligation to notify data subjects of an access request received by a third country law enforcement agency? The processor is unlikely to have a direct relationship with the data subjects and may not even be able to contact them depending on the data being processed. There also remains the fundamental issue that nothing in a contract between exporter and importer is going to prevent law enforcement access.

That being said, whilst regulators across Europe published some initial thoughts and guidance immediately following the Schrems II judgment, this is the first piece of practical guidance that we’ve seen published by a supervisory authority. It will now be interesting to see whether other supervisory authorities and/or the EDPB follow a similar approach in their Schrems II guidance.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Lauren Hudson
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2483

Schrems II: Reaction from European Regulators and Technology Companies Suggest an Uncomfortable Road Ahead for Transatlantic Data Transfers

To recap, last week, the European Court of Justice (“ECJ”) ruled that the Privacy Shield is invalid and placed significant emphasis on the due diligence which exporting controllers, recipients and supervisory authorities are expected to undertake in relation to transfers of personal data to third countries which are governed by the Standard Contractual Clauses (“SCCs”). As foreshadowed in our initial reaction to the Schrems II judgment, and now that we’ve had the benefit of the full judgment and initial commentary from some of the European regulators, the EDPB, and some of the big tech companies, the immediate future of transatlantic data transfers appears to be uncertain and commentary is divided on what is and isn’t possible in light of this new judgment.

UK SWITCHES TO DECENTRALISED APPROACH TO CONTACT TRACING APP

In a move that marks a major U-turn for the Government, the UK’s proposals for a centralised contact tracing app have been abandoned in favour of a decentralised model. The new model is based on technology developed by Apple and Google and replaces the original app designed by NHSX, which has recently faced criticism over privacy concerns as well as technical issues and delays.

The UK follows Germany and Italy, who have already made the switch from centralised contact tracing apps to decentralised models. The UK’s health secretary, Matt Hancock, confirmed the news at the UK Government press conference last night.

To centralise or decentralise?

The UK Government had previously asserted the superiority of a centralised contact tracing model, but what exactly is the difference?

A ‘decentralised’ data model requires individual users to provide an anonymous ID to a centralised server. The user’s phone then downloads information from the centralised database and carries out contact matching and risk analysis on the phone itself before sending alerts to other users if necessary. Information on whether a user has come into contact with an infected person will be shared with that user, but not with the central server.

In contrast, a ‘centralised’ data model would require users not only to provide their own anonymous ID to a centralised database, but also to send any codes collected from other phones. The central server then carries out contact matching and risk analysis using that information, making the decision as to whether someone is ‘at risk’ and sending alerts accordingly.
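To make the difference concrete, the sketch below (in simplified Python, using purely hypothetical function and variable names rather than any real contact tracing API) illustrates where the matching step sits under each model as described above: on the handset in the decentralised approach, and on the central server in the centralised approach.

```python
# Simplified illustration only - hypothetical names, not any real contact tracing system.

INFECTED_IDS_ON_SERVER = {"a91f", "4c22"}   # anonymous IDs of users who have tested positive


def decentralised_check(ids_seen_by_my_phone: set) -> bool:
    """Decentralised model: the phone downloads the list of infected IDs and
    performs the matching locally; the result is not shared with the server."""
    downloaded = set(INFECTED_IDS_ON_SERVER)        # periodic download from the server
    return bool(ids_seen_by_my_phone & downloaded)


def centralised_check(my_id: str, ids_seen_by_my_phone: set) -> bool:
    """Centralised model: the phone uploads both its own ID and the IDs it has
    collected, and the server decides who is 'at risk' and sends the alerts."""
    server_side_contact_graph = {my_id: ids_seen_by_my_phone}   # held centrally
    return any(contact in INFECTED_IDS_ON_SERVER
               for contacts in server_side_contact_graph.values()
               for contact in contacts)


if __name__ == "__main__":
    seen = {"4c22", "9be0"}
    print("decentralised match:", decentralised_check(seen))
    print("centralised match:  ", centralised_check("my-anon-id", seen))
```

In both cases the same anonymous identifiers are exchanged over Bluetooth; the models differ in where the contact information is assembled and who sees the result of the risk analysis.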

The UK’s previous preference for the centralised model was based on the belief that storing data in a centralised manner would promote a more considered approach to contact tracing based on risk factors, and would enable epidemiologists to use valuable data on the spread of the virus for further research. However, the centralised model was criticised for potentially encroaching on privacy by using more data than necessary, and using the data for purposes other than contact tracing.

What next?

NHSX, the health service’s innovation arm, has confirmed that its current leaders will step back from the project, and that Simon Thompson, current chief product manager at Ocado, will take over management of the new app.

While this move will be welcomed by privacy campaigners and critics of the centralised model, concerns over the limitations of Bluetooth-enabled technology, as well as uneasiness over allowing Apple and Google to control the UK’s response to the pandemic, may present further obstacles to the eventual rollout of a UK-wide contact tracing app. The additional delays resulting from this change in approach may also result in a lower than ideal take-up rate, with much of the population of the view that the time for contact tracing has passed given the current downward curve of the pandemic.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677
Katie Collins
Trainee Solicitor, London
+44 20 7466 2117

COVID-19: ICO OPINES ON APPLE AND GOOGLE’S CONTACT TRACING TECHNOLOGY (UK)

On 17 April 2020, the ICO published an opinion by the Information Commissioner (the “Commissioner”) on Apple and Google’s joint initiative to develop COVID-19 contact tracing technology (the “Opinion”, available here).

Summary

  • The Commissioner found the contact tracing framework (the “CTF”, described further below) to be aligned with principles of data protection by design and by default.
  • Controllers designing contact tracing apps that use the CTF should ensure alignment with data protection law and regulation, especially if they process personal data (which the CTF does not require).
  • The Commissioner raised concerns regarding individuals assuming that the CTF’s compliance with data protection principles will extend to all aspects of the contact tracing app – which is not necessarily the case.
  • Therefore, it should be made clear to any app users who is responsible for data processing, especially if the app processes data outside of the CTF’s limited scope.
  • Data controllers designing CTF-enabled contact tracing apps must be transparent with potential and actual app users on the type of information they will be processing.
  • Finally, when it comes to a user’s ability to disable Bluetooth, the Commissioner observed that with regard to contact tracing apps in general: “a user should not have to take action to prevent tracking”.

As set out in our previous blogpost (available here), contact tracing is one of the measures being contemplated or implemented by European governments (including in the UK and Germany) in order to be able to put an end to lockdowns while containing the spread of the virus.

The scope of the Opinion was limited to the design of the contact tracing framework which enables the development of COVID-19 contact tracing apps by public health authorities through the use of Bluetooth technology (the “CTF”).

It is also worth noting that this Opinion has been published in the midst of a heated debate on contact tracing technology and fears that it may be used for mass surveillance – in an open letter published on 20 April 2020, around 300 international academics cautioned against creating a tool which will enable large scale data collection on populations.

How does the CTF work?

The CTF is composed of “application programming interfaces” as well as “operating system level technology to assist contact tracing”. The collaboration between Apple and Google will result in interoperability, across Android and iOS devices, of apps developed by public health authorities using the CTF.

When two devices with contact tracing apps come into proximity, each device will exchange cryptographic tokens (which change frequently) via Bluetooth technology. Each token received will be stored in a ‘catalogue’ on the user’s device, effectively creating a record of all other devices a user has come into contact with. Once a user is diagnosed with COVID-19, and after they have given their consent, the app will upload the stored ‘catalogue’ of tokens to a server. Other users’ devices will periodically download a list of broadcast tokens of users who have tested positive for COVID-19. If a match is found between the broadcast tokens and the ‘catalogue’ of tokens stored on each user’s device, the app will notify the user that he/she has come into contact with a person who has tested positive and will suggest appropriate measures to be taken.
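As a rough illustration of that flow, the simplified Python sketch below (entirely hypothetical names and logic, not Apple’s or Google’s actual exposure notification API) shows frequently rotated random tokens being exchanged during a proximity event, stored in a local ‘catalogue’, and matched on the device against a downloaded list of tokens associated with users who have tested positive.

```python
import secrets

# Simplified sketch only - hypothetical, not the real Apple/Google contact tracing framework.


def new_token() -> str:
    """Generate a fresh random token; the real CTF derives tokens cryptographically
    and rotates them frequently so individual devices cannot be tracked over time."""
    return secrets.token_hex(8)


class Device:
    def __init__(self) -> None:
        self.current_token = new_token()
        self.emitted_tokens = {self.current_token}   # tokens this device has broadcast
        self.catalogue: set = set()                  # tokens heard from nearby devices

    def rotate_token(self) -> None:
        """Periodically replace the broadcast token."""
        self.current_token = new_token()
        self.emitted_tokens.add(self.current_token)

    def encounter(self, other: "Device") -> None:
        """Bluetooth proximity event: each device records the other's current token."""
        self.catalogue.add(other.current_token)
        other.catalogue.add(self.current_token)

    def tokens_to_publish_on_diagnosis(self) -> set:
        """With the user's consent, the tokens this device broadcast are uploaded
        so that other devices can check for matches locally."""
        return set(self.emitted_tokens)

    def check_exposure(self, published_positive_tokens: set) -> bool:
        """Matching happens on the device: compare the downloaded list of positive
        users' tokens against the locally stored catalogue of tokens received."""
        return bool(self.catalogue & published_positive_tokens)


alice, bob = Device(), Device()
alice.encounter(bob)                                      # the two phones exchange tokens
bob.rotate_token()                                        # tokens change frequently
positives = bob.tokens_to_publish_on_diagnosis()          # bob tests positive and consents
print("alice exposed:", alice.check_exposure(positives))  # True: a stored token matches
```

Because the comparison in check_exposure happens locally, neither the app developer nor any third party needs to see which devices a given user has actually encountered, which is the point the Opinion makes below about the matching process.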

How does the CTF comply with data protection laws?

The Opinion finds that, based on the information released by Google and Apple on 10 April 2020, the CTF is compliant with principles of data protection by design and by default because:

  1. The data collected by the CTF is minimal: The information contained in the tokens exchanged does not include any personal data (such as account information or usernames) or any location data. Furthermore the ‘matching process’ between tokens of users who have tested positive for COVID-19 and tokens stored on each user’s phone happens on the device and therefore does not involve the app developer or any third party.
  2. The CTF incorporates sufficient security measures: The cryptographic nature of the token which is generated on the device (outside the control of the contact tracing app) means that the information broadcast to other nearby devices cannot be related to an identifiable individual. In addition, the fact that the tokens generated by one device are frequently changed (to avoid ultimate tracing back to individual users) minimises the risk of identifying a user from an interaction between two devices.
  3. The user maintains sufficient control over contact tracing apps which use the CTF: Users will voluntarily download and install the contact tracing app on their phone (although this may change in ‘Phase 2’ of the CTF as discussed below). Users also have the ability to remove and disable the app. In addition, the uploading by the app developer of a user’s collected tokens once he/she has tested positive requires a separate consent process.
  4. The CTF’s purpose is limited: Although the CTF is built for the limited purpose of notifying users who came into contact with patients who have tested positive for COVID-19, the Commissioner stresses that any expansion of the use of CTF-enabled apps beyond this limited purpose will require an assessment of compliance with data protection principles.

What clarifications are required?

The Commissioner raises a number of questions on the practical functioning of the CTF, especially in respect of collection and withdrawal of user consent post-diagnosis. It is unclear how the CTF will facilitate the uploading of stored tokens to the app. Although consent will be required from the user, clarity is needed on: (i) management of the consent signal by a CTF-enabled app and (ii) what control will be given to users in this respect. In addition, the Commissioner lacks information on how consent withdrawal will impact the effectiveness of the contact tracing solutions and the notifications sent to other users once an individual has been diagnosed.

Issues for developers

The Commissioner will pay close attention to the implementation of the CTF in contact tracing apps. In particular, the CTF does not prevent app developers from collecting other types of data such as location. Although reasons for collecting other types of user information may be “legitimate and permissible” in order to pursue the public health objective of these apps (for example to ensure the system is not flooded with false diagnoses or to assess compliance with isolation), the Commissioner warns that data protection considerations will need to be assessed by the controller – this includes the public health organisations which develop (or commission the development of) contact tracing apps.

Another issue raised by the Commissioner is the potential user assumption that the compliance by the CTF with data protection laws will radiate to all other functionalities which may be built into contact tracing apps. In this regard, the Commissioner reminds app developers that, in addition to assessing data protection compliance in relation to other categories of data processed by the app, they will need to clearly specify to users who is responsible for data processing – in order to comply with transparency and accountability principles.

Finally, the Commissioner stressed that data controllers, such as app developers, must assess the data protection implications of both (i) the data being processed through the app and (ii) the processing undertaken by way of the CTF, in order to ensure that both layers of processing are fair and lawful.

What has the ICO said about ‘Phase 2’ of the CTF?

‘Phase 2’ of development of the CTF aims to integrate the CTF into the operating system of each device. The Commissioner notes that users’ control, including their ability to disable contact tracing or to withdraw their consent to it, should be considered when developing the next phase of the CTF.

With regard to users’ ability to disable Bluetooth on their device, the Commissioner observes in respect of ‘Phase 2’ of the CTF, and contact tracing apps in general, that “a user should not have to take action to prevent tracking”.

How does this Opinion affect the development of Decentralized Privacy-Preserving Proximity Tracing protocol?

The Opinion can be applied to the Decentralized Privacy-Preserving Proximity Tracing (or DP-3T) protocol in so far as it is similar to the CTF. The Commissioner states that the similarities between the two projects give her comfort that “these approaches to contact tracing app solutions are generally aligned with the principles of data protection by design and by default”.

Insight

This Opinion is an important step in the development and roll out of contact tracing apps in the UK. As mentioned above, contact tracing is one of the tools necessary for the UK Government to lift the lockdown measures while minimising the impact of a potential second wave of infections. This has an indirect impact on the private sector as it will affect how and when employees will be able to go back to work.

The fact that the principles on which the CTF is based are compliant with data protection laws is crucial to the successful roll out of contact tracing apps. In order for these apps to be effective, they must be voluntarily downloaded by a large number of mobile users. Given the concerns around letting governments accumulate data on the population under the guise of putting an end to the pandemic, trust is a determining factor in this equation. The fact that the Commissioner is approving the foundation for these contact tracing apps will certainly play a role in gaining the public’s trust and its willingness to give up some privacy rights in order to put an end to the current public health crisis.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677
Ghislaine Nobileau
Trainee Solicitor, London
+44 20 7466 7503

COVID-19: ICO publishes details of its regulatory approach during COVID-19 (UK)

The ICO has published details of its regulatory approach during the ongoing COVID-19 emergency; this is an approach which should reassure entities who are adapting to the economic and practical realities of operating in the current climate, as well as balancing their data protection obligations. The UK regulator has continued to be reasonable and pragmatic, as outlined in our previous post in relation to response times to DSARs, and has stated that it is “committed to an empathetic…approach”. Overall, the guidance sets out a number of key takeaways.

Revised ePrivacy Regulation Draft introduces ability for organisations to rely on “Legitimate Interests” legal basis in relation to cookies

Another revised draft ePrivacy Regulation (“ePR”) was recently published which introduces the ability for organisations to rely on the “legitimate interests” legal basis to drop cookies on end users’ devices.

This change has been criticised by some commentators for its ambiguities and for watering down data protection rights despite the accompanying safeguards. It remains to be seen whether it will be retained in future draft iterations or, indeed, in the agreed version of the ePR, for which there is no clear timetable for implementation at present.

Background

First published in January 2017, the ePR covers specific data regulation reforms such as cookies, electronic direct marketing, over-the-top services and machine-to-machine communications. The overall approach, including a more stringent sanctions regime, would bring ePrivacy regulation into much closer alignment with the GDPR and was originally intended to coincide with the GDPR’s implementation in 2018.

Despite revised proposals from numerous Presidencies of the Council of the European Union, Member States have been unable to agree a final version of the ePR. At the moment, this means that it is unlikely to take effect before 2023 as a grace period of up to 2 years will need to elapse following adoption of the final draft.

With regard to Brexit, since the ePR is unlikely to be effective by the end of the transition period, it will not be incorporated into UK law under the withdrawal legislation (in contrast to the intended implementation of a UK GDPR). Therefore, the existing Privacy and Electronic Communications Regulations 2003 (“PECR”) will continue to apply following the end of the transition period. Once the ePR takes effect, the UK may choose to mirror the drafting or bring in its own drafting which diverges from the ePR. In any event, the ePR (in its current form) will likely still have implications for UK organisations dealing with individuals in the EU due to its intended extra-territorial scope.

The Proposed Amendments to the Draft ePrivacy Regulation

The latest draft, which simplifies the text of the core provisions and further aligns them with the GDPR, was proposed by the Croatian Presidency when it became clear that the majority of the Member States would not support the existing text.

One of the key proposals is the introduction of the “legitimate interests” ground for placing cookies (or similar technologies) on end users’ terminal equipment. This represents a notable change in position from prior drafts and a step away from the consent-based model dictated by the most recent ICO cookies guidance and implemented by most organisations via cookie banners which prevent users from accessing a webpage until they have set their cookie preferences accordingly. Critics have argued that this consent model is flawed because the ubiquity of such banners is leading to users ignoring them and to “consent fatigue”. The introduction of the “legitimate interests” legal basis expands on previous ePR drafts’ attempts to address this problem, although the latest drafting is subject to various safeguards, including fairly restrictive commentary as to when the “legitimate interests” legal basis can be relied on (e.g. not where the end user is a child, where the organisation intends to use cookies to collect special categories of data or where the cookies are used to profile end users).

Commentators have criticised the drafting, which seems to contain some inconsistencies. Firstly, it directly contradicts the EDPB’s statement in May 2018 that the ePrivacy Regulation should not allow processing “on open-ended grounds, such as “legitimate interests” that go beyond what is necessary for the provision of an electronic communications service.” The introductory text to the draft, conversely, states that the proposed safeguards mean that the new legal ground remains “in line with the GDPR”. Furthermore, tech advertisers wishing to rely on the “legitimate interests” ground may do so on condition that the end user is provided with clear information and has “accepted such use”. How an end user would confirm acceptance in practice is, however, unclear, and this seems to cut across the prohibition on using the ground for profiling purposes.

The new proposal clearly intends to address some of the more contentious drafting points and cater to business needs (e.g. advertising). Nonetheless, given the lack of agreement to date and the ambiguities in the drafting, it remains far from certain that this draft will become the enacted version of the ePR.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Duc Tran
Senior Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2954
Tamsin Rankine-Fourdraine
Trainee Solicitor, London
+44 20 7466 7508

COVID-19: How governments are using personal data to fight COVID-19

Background

The COVID-19 outbreak has resulted in an unprecedented focus on the power of data to assist in resolving national emergencies. From health tracking, to volunteer coordination, to accurately identifying the vulnerable, data is being harnessed in both the public and private sectors to try to help bring COVID-19 under control and mitigate its impact.

European Commission Publishes GDPR Roadmap

On 1 April 2020, almost two years after the General Data Protection Regulation (GDPR) became applicable, the European Commission published a roadmap for evaluating its application.

The roadmap specifically asks for feedback on the Commission’s strategy in dealing with the issue of international transfer of personal data to third countries, focussing on existing adequacy decisions, and the cooperation and consistency mechanism between national data protection authorities.

Why has the roadmap been published?

Article 97 of the GDPR requires the Commission to submit a report on the evaluation and review of the Regulation to the European Parliament and the Council by 25 May 2020.

The GDPR specifies that the Commission must examine the application and functioning of the mechanism for (i) transfers of personal data to third countries or international organisations under Chapter V and (ii) ensuring consistency and cooperation under Chapter VII.

Further, under Article 97 the Commission is required to take into account the positions and findings of the European Parliament and the Council, and if necessary to submit appropriate proposals to amend the GDPR in light of any technological developments.

What does the roadmap say?

The Commission has announced its intention to publish a report identifying issues in the application of the GDPR as requested by Article 97. The report will build on input from the Council, the European Parliament and the European Data Protection Board, as well as two earlier reports published by the Commission in 2017 and 2019, which both considered the protection of personal data since the GDPR had entered into force.

Publication of the roadmap launches a consultation process to gather input from citizens and stakeholders. As part of that process, the Commission will be sending detailed questionnaires to data protection authorities and the European Data Protection Board, as well as to the GDPR Multi-Stakeholder Group, a group made up of business and civil society representatives. The Commission will also conduct bilateral dialogues with the authorities of the relevant third countries.

Individuals, businesses or interested parties can register with the Commission here to provide feedback by 29 April 2020. Feedback will be published to the Commission’s website alongside a synopsis report explaining how any input will be taken on board or why some suggestions cannot be actioned.

What happens next?

The Commission has already received feedback from the Council on the application of the GDPR in a report setting out its position on 19 December 2019. In its report the Council raised a number of concerns, including the challenges of determining or applying appropriate safeguards in the absence of an adequacy decision and the strain of the additional work for supervisory authorities resulting from the GDPR cooperation and consistency mechanisms.

The Council’s report also flagged issues raised by individual Member States, such as the importance of considering the application of the GDPR in the field of new technologies and big tech companies, and the development of efficient working arrangements for supervisory authorities in cross-border cases.

The deadline for providing feedback on the roadmap from citizens and stakeholders is 29 April 2020. Following consultation with the relevant parties and receipt of any online feedback, the Commission will publish its report by 25 May 2020. There is no obligation on the Commission to take any steps following publication of the report. However, Article 97(5) states that the Commission shall ‘if necessary’ submit appropriate proposals to amend the GDPR.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Katie Collins
Trainee Solicitor, London
+44 20 7466 2117

Morrisons wins Supreme Court appeal against finding of vicarious liability in data breach class action

Today the Supreme Court handed down its decision in WM Morrison Supermarkets plc v Various Claimants [2020] UKSC 12, bringing to its conclusion a case which had the potential to alter significantly the data protection and cyber security litigation and class action landscape.

The headline news is that Morrisons has been found not to be vicariously liable for the actions of a rogue employee in leaking employee data to a publicly available file-sharing website.

The judgment will likely result in a collective sigh of relief for organisations who have been watching closely to track their potential liability for data breach class actions. However, it is important to note that the Morrisons case and judgment are very fact specific; they do not close the door on data breach class action compensation as a whole. Boardrooms should still be examining the technical and organisational measures they have in place to prevent personal data breaches in order to reduce the risk of regulatory enforcement and class actions.

Background

In 2015 a former Morrisons employee was found guilty of stealing and unlawfully sharing the personal data (including names, addresses, bank account details, salary and national insurance details) of almost 100,000 of Morrisons’ employees with members of the press and with data sharing websites. At the time, the ICO investigated and found no enforcement action was required with respect to Morrisons’ compliance with the Data Protection Act 1998 (“DPA”).

Nevertheless, around 5,000 Morrisons employees brought a claim for damages – irrespective of the fact that they had not suffered any financial loss. Instead, the employees claimed that Morrisons was vicariously liable for the criminal acts of its employee and for the resulting distress caused to the relevant employees.

For full details of the background to the High Court and Court of Appeal arguments, please see our previous Morrisons data protection briefing. Following the conclusion of the Court of Appeal case, the Supreme Court subsequently granted leave to appeal, which led to a two day hearing in November 2019 and, ultimately, the judgment handed down today.

Decision

The Supreme Court overturned the Court of Appeal’s decision, finding that Morrisons was not vicariously liable for the employee’s unlawful actions. Lord Reed gave the judgment of the court, with which Lady Hale, Lord Kerr, Lord Hodge and Lord Lloyd-Jones agreed.

Lord Reed concluded that the courts below had misunderstood the principles of vicarious liability, and in particular had misinterpreted the Supreme Court’s judgment on vicarious liability in Mohamud v WM Morrison Supermarkets plc [2016] UKSC 11. That judgment was not intended to change the law on vicarious liability, but rather to follow existing authority including, importantly, Dubai Aluminium Co Ltd v Salaam [2002] UKHL 48.

In Dubai Aluminium, Lord Nicholls identified the general principle that applies when the court considers the question of vicarious liability arising out of an employment relationship: the wrongful conduct must be so closely connected with acts the employee was authorised to do that, for the purposes of the liability of the employer to third parties, it may fairly and properly be regarded as done by the employee while acting in the ordinary course of his employment.

Applying that test, vicarious liability was not established in this case. The Supreme Court found that the employee’s wrongful conduct was not so closely connected with the acts he was authorised to do that, for the purposes of assessing Morrisons’ liability, it could fairly and properly be regarded as being done in the ordinary course of his employment.

Lord Reed referred to the distinction drawn by Lord Nicholls in Dubai Aluminium between cases “where the employee was engaged, however misguidedly, in furthering his employer’s business, and cases where the employee is engaged solely in pursuing his own interests: on a ‘frolic of his own’, in the language of the time-honoured catch phrase.”

In the present case, he said, it was clear that the employee was not engaged in furthering Morrisons’ business when he committed the wrongdoing in question. On the contrary he was pursuing a personal vendetta against the company, seeking revenge for disciplinary proceedings some months earlier. In those circumstances, the close connection test was not met.

Although it did not affect the outcome, in light of the court’s findings, Lord Reed also considered Morrisons’ contention that the DPA excludes the imposition of vicarious liability in relation to data breaches under that Act and for the misuse of private information or breach of confidence – in effect that the DPA is a statutory scheme which “owns the field” in this respect. The Supreme Court rejected this argument, stating that: “the imposition of a statutory liability upon a data controller is not inconsistent with the imposition of a common law vicarious liability upon his employer, either for the breaches of duties imposed by the DPA, or for breaches of duties arising under the common law or in equity.”

Implications of the decision

Data privacy implications

Although this case was argued under the repealed Data Protection Act 1998, it will likely result in a collective sigh of relief for organisations now subject to the GDPR which expressly allows individuals to claim compensation for non-material damages (including distress) caused by non-compliance with the regulation. This is exactly the decision that many organisations wanted. In a world where companies can already be fined up to EUR 20 million or 4% of annual worldwide turnover for non-compliance with the GDPR, there are real fears concerning the potential for additional significant liability under class action claims for data breaches. Many organisations will be comforted by the steps that the Court has now taken to reduce the likelihood of such claims being successful.

However, it is important to caution against too much optimism. The Morrisons case was unusual because the compensation claim was brought by individuals despite no regulatory action being taken by the ICO at the time of the data leak itself. The decision is no guarantee that similar claims would fail in circumstances where the regulator agrees that there has been a breach of the security requirements under the GDPR, as has been the case with some of the recent large data breaches, which are starting to result in significant fines from the ICO.

Despite the claim not being successful in this instance, another, perhaps unintended, consequence of the case is that employers will be seriously considering what steps can be taken to mitigate the risk of rogue employees leaking personal data. This may result in increased employee monitoring in the workplace, with all the data privacy implications that this may entail.

Employment implications

The “close connection” test for rendering an employer vicariously liable for an employee’s actions (as described above) is well established. What has been less clear is how broadly that test should be applied to the facts. For example, it was suggested that the Supreme Court’s ruling in Mohamud v WM Morrison Supermarkets meant that, where an employee’s role involves interacting with customers in some way, an employer might be vicariously liable for the employee’s conduct towards customers even if the employee engages in a wholly different nature of interaction from that envisaged (such as by using force or away from the usual work station) and regardless of motive.

Given there is no ‘reasonable steps’ defence against vicarious liability for torts, employers will welcome today’s ruling which rows back from that liberal interpretation of the test. The Supreme Court has made clear that the mere fact that the job provides the employee with “the opportunity to commit the wrongful act” is not sufficient to impose vicarious liability, nor is the fact that the wrongful act was the culmination of an unbroken temporal or causal chain of events regardless of the employee’s motive. Doing acts of the same kind as those which it is within the employee’s authority to do is not sufficient either.

The test is whether the wrongful act was sufficiently closely connected with what the employee was authorised to do. In Mohamud it was key that the employee was purporting to act on his employer’s business, threatening the customer not to return to the employer’s premises, and not acting to achieve some personal objective. In contrast, in the current case “the employee was not engaged in furthering his employer’s business when he committed the wrongdoing in question. On the contrary, he was pursuing a personal vendetta, seeking vengeance for the disciplinary proceedings some months earlier.” The ruling helpfully re-establishes that employers should not be liable for the acts of employees engaged on “frolics” of their own, pursuing their own objectives.

Cyber and data security implications

While the spectre of no-fault liability presented by Morrisons has fallen away, there is still a significant risk from fault-based claims. The ICO is imposing substantial fines upon organisations for inadequate technical and organisational security measures. Claimants are cutting and pasting adverse findings from the ICO into claim forms. Organisations will have to make sure that the measures they are taking are appropriate, which will involve considering many factors, including the state of the art and the harm that may be caused if the security measures fail.

Class actions implications

Data breach class actions are on the rise in the UK and today’s judgment should be seen as a setback, not a roadblock. Funders and claimant firms are looking to build class actions in relation to data breaches even where there is no specific evidence of individual damage. They are seeking damages for the whole class for “distress” or a standardised claim of loss of access to data, and even a nominal damages award per claimant could lead to a significant amount across a class of tens or hundreds of thousands. Today’s judgment will not reverse that trend, but it will at least mean that companies who are themselves victims of data breaches by employees will not also face such claims on this basis alone.

The key question for the viability of those claims will be how much a bare data breach is “worth” by way of damages, even if there’s no other loss suffered by the victim. We will have to wait a bit longer now to find that out. The principles applied in misuse of private information cases may be helpful to the courts in considering this issue, given how little case law there is on damages in data protection claims.

Insurance implications

The judgment is good news for corporates and their insurers. The courts below had treated insurance as the answer to the argument that imposing liability would effectively help achieve the rogue employee’s aim – namely, to harm Morrisons. Insurers may therefore also be breathing a sigh of relief – but only up to a point. Vicarious liabilities for data breaches by rogue employees are insurable in principle, but these claims are not doomsday for the insurance market. That is because the main risk for corporates – and therefore insurers – is direct liability claims and related losses, which continue apace on an upwards trajectory.

The good news for concerned corporates is that they can buy cyber insurance to cover data breach claims, whether for direct or vicarious liabilities, as well as related losses such as the costs of managing the incident, regulatory investigations and loss of profits if systems are impacted. However, risk transfer strategies within corporates vary, and that cover cannot necessarily be banked upon in all cases. The main challenge therefore remains, and is not answered here: how much cover would I need to buy for a reasonable worst case, and is it available at reasonable cost on a good wording? Given that the measure of damages is still unclear, this issue will continue to be wrestled with.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Tim Leaver
Partner, Employment, Pensions & Incentives, London
+44 20 7466 2305
Julian Copeman
Partner, Disputes, London
+44 20 7466 2168
Greig Anderson
Partner, Disputes, London
+44 20 7466 2229
Andrew Moir
Partner, Global Head of Cyber Security, London
+44 20 7466 2773
Kate Macmillan
Consultant, Disputes, London
+44 20 7466 3737
Lauren Hudson
Associate, Digital TMT & Data, London
+44 20 7466 2483
Anna Henderson
Professional Support Consultant, Employment, Pensions & Incentives, London
+44 20 7466 2819
Maura McIntosh
Professional Support Consultant, Disputes, London
+44 20 7466 2608