The not so mega ‘mega fine’: ICO fines British Airways £20 million for its 2018 data breach

  • The ICO has fined British Airways £20 million for breach of the GDPR in relation to its 2018 data breach.
  • This is a significant reduction in the original proposed fine of £183 million.
  • In the monetary penalty notice issued to British Airways, the ICO has confirmed that the reduction of almost 90% was only partially influenced by the effects of COVID-19 on the financial position of British Airways.
  • In contrast, the vast majority of the reduction appears to come as a result of the ICO having taken into account BA’s representations following its notice of intent, combined with a change of approach by the ICO which meant less of a focus on turnover as the driving factor in calculating fines.
  • The ICO has also published details of the specific GDPR infringements committed by British Airways which have been limited to breach of the integrity and confidentiality principle in Article 5 and the security obligations in Article 32 GDPR.
  • The moral of the story appears to be that it can be commercially worthwhile for controllers to push back robustly against any notice of intent.


As we reported here, in July 2019 the Information Commissioner’s Office (“ICO”) published a notice of its intent to fine British Airways a staggering £183 million for infringement of the General Data Protection Regulation (GDPR) as a result of its 2018 data breach where the personal data of around 500,000 British Airways customers was stolen by hackers.

Importantly, this was a notice of intent and not a final concluded fine. The Data Protection Act 2018 sets a strict deadline of six months for the ICO to convert a notice of intent into a fine, although this period may be extended if the ICO and the proposed recipient of the fine agree to an extension. The ICO and British Airways took advantage of this extension mechanism multiple times, so that the final Penalty Notice was only published on 16 October 2020, more than a year after the initial notice of intent.

At the time, no reasons for any of the extensions were offered by either side, although it was understood from International Airline Group’s (IAG, British Airways’ parent company) Annual Report and Accounts 2019, and has now been confirmed by the final Penalty Notice, that British Airways made extensive representations to the ICO regarding the proposed fine and that there were multiple further information requests. The impact of COVID-19 also likely had its part to play in the extensions.

At the time of the initial notice of intent, the proposed British Airways fine was touted as the first ‘mega fine’ to be issued by a European data regulator since the implementation of the GDPR. The biggest data protection fine previously issued by the ICO was £500,000, the maximum possible under the old legislation.

The first GDPR ‘mega fine’, not so ‘mega’ after all: a reduction of almost 90%

The ICO finally issued its Penalty Notice to British Airways on 16 October 2020, fining British Airways £20 million. While still the largest ICO fine to date, this is a significant reduction of almost 90% from the original figure of £183.39 million.

Although the Penalty Notice refers in a couple of places to the original intended fine of £183.39 million, very little is said in the notice regarding exactly why the final fine has been reduced by such a significant amount. Instead, the notice effectively appears to start from scratch in calculating the final level of fine, taking into account the following factors in accordance with Article 83 GDPR and the ICO’s Regulatory Action Policy:

  • Financial Gain: BA did not gain any financial benefit or avoid any losses directly or indirectly as a result of the breach.
  • Nature and Gravity: The ICO considered the nature of the failures to be serious, affecting a significant number of individuals for a significant period of time (103 days).
  • Culpability: Although the breach was not an intentional or deliberate act on the part of BA, the ICO found BA to be negligent.
  • Responsibility: The ICO found BA to be wholly responsible for the breaches of Articles 5 and 32 GDPR.
  • Previous Actions: BA had no relevant previous infringements or failures to comply with past notices.
  • Cooperation: BA fully cooperated with the ICO’s investigation.
  • Categories of Personal Data: Although no special category data was affected, the nature of the data, in particular payment card data, was nonetheless sensitive.
  • Notification: BA acted promptly in notifying the ICO of the attack.

Taking into account all of these factors, the ICO considered that a penalty of £30 million would be an appropriate starting point to reflect the seriousness of the breach, and the need for the penalty to be effective, proportionate and dissuasive in the context of BA’s scale and turnover. So far, there is no obvious reason why the fine is so much lower than that proposed in the notice of intent.

The ICO did not consider there to be any aggravating factors to apply in order to increase the penalty and further did not consider it necessary to increase the penalty in order for it to be ‘dissuasive’.

Turning to any potential downwards adjustment, the ICO considered a 20% downwards adjustment (£6 million) to be appropriate, taking into account various mitigating factors, including:

  • The immediate steps to mitigate and minimise any damage to data subjects;
  • BA’s prompt notification of the breach to data subjects and relevant regulatory authorities;
  • The broad press coverage as a result of the attack will likely have raised awareness among other controllers of potential risks; and
  • The adverse effect on BA’s brand and reputation.

Finally, the ICO also explicitly acknowledged that the impact of COVID-19 on British Airways was taken into account when determining the level of the final fine, although this resulted in only a further £4 million downwards adjustment and therefore does not account for the vast majority of the reduction.
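Putting the figures in the Penalty Notice together, the reduction reconciles as follows. This is a minimal sketch using only the numbers reported above, not a reproduction of the ICO's own calculation method:

```python
# Reconciling the figures reported from the Penalty Notice.
starting_point = 30_000_000              # starting point reflecting seriousness
mitigation = int(starting_point * 0.20)  # 20% downwards adjustment = £6 million
covid_adjustment = 4_000_000             # explicit COVID-19 reduction
final_fine = starting_point - mitigation - covid_adjustment
print(final_fine)  # the £20 million fine actually imposed
```

As the arithmetic shows, the published adjustments only explain the journey from £30 million to £20 million; the drop from £183.39 million to the £30 million starting point is where the bulk of the reduction sits.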

Details of the GDPR infringements

In its final Penalty Notice, the ICO focussed on BA’s breach of Article 5(1)(f) GDPR – the integrity and confidentiality principle – and Article 32 GDPR – security of processing. The previous notice of intent had also found BA to be in breach of Article 25 GDPR – data protection by design and by default – but this was dropped in the final Penalty Notice.

From a penalty perspective, it is also interesting that the ICO rejected BA’s claim that the maximum fine should be 2%. BA argued that, given the conflict between breach of Article 5 (attracting a maximum 4% fine) and breach of Article 32 (attracting a maximum 2% fine), the principle of lex specialis should apply, with the specific provision of Article 32 overriding the general provision of Article 5. The ICO instead found that the two provisions were distinct even if they overlapped, although it is fair to note that this made no difference to the level of fine ultimately imposed (which was significantly less than both 4% and 2% of annual worldwide turnover).

With respect to its security obligations, the ICO found that British Airways had “weaknesses in its security” that could have been prevented with security systems, procedures and software that were available at the time. None of the measures would have entailed excessive cost or technical barriers for British Airways, with some available through the Microsoft Operating System used by British Airways. Some of the numerous measures British Airways could have used to mitigate or prevent the risk of the attack include:

  • limiting access to applications, data and tools to only those required to fulfil a user’s role;
  • undertaking rigorous testing, in the form of simulating a cyber-attack, on the business’ systems; and
  • protecting employee and third party accounts with multi-factor authentication, external public IP address whitelisting, and IPSec VPN.

The attack path that the hackers used exposed, in the ICO’s view, a number of failings on the part of British Airways. The hackers were able to gain access to an internal British Airways application through the use of compromised credentials for a Citrix remote access gateway. They were then able to break out of the Citrix environment and gain broader access to the wider British Airways network. Once there, the attacker was able to move laterally across the network, culminating in the editing of a JavaScript file on British Airways’ website. This allowed the attacker to intercept and exfiltrate cardholder data from British Airways’ website to an external third-party domain which was controlled by the attacker.

One particular area of focus for the ICO was British Airways’ practice of storing credentials within batch scripts. The ICO did not accept British Airways’ submissions that this “aided functionality” or was “standard practice”, and stuck to its position that this was not acceptable and that there were other, secure ways to achieve the same objectives.
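For illustration, a hedged sketch of the kind of alternative the ICO alludes to: resolving credentials at runtime from a secret source (for example, environment variables populated by a vault agent or deployment tooling) rather than embedding them in script text. The variable name `DB_PASSWORD` is illustrative and not taken from the Penalty Notice:

```python
# Sketch: fetch a secret provisioned outside the script text itself,
# instead of hardcoding it in a batch script as BA was found to have done.
import os

def get_credential(name: str) -> str:
    """Read a secret from the process environment, which a vault agent
    or deployment system is assumed to have populated securely."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"secret {name!r} not provisioned")
    return value

os.environ["DB_PASSWORD"] = "example-only"  # stand-in for secure injection
password = get_credential("DB_PASSWORD")
```

In practice a dedicated secrets manager offers stronger guarantees than environment variables, but even this minimal pattern keeps credentials out of version-controlled script text.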

As a result, the ICO was “satisfied that BA failed to put in place appropriate technical or organisational measures to protect the personal data being processed on its systems, as required by the GDPR”.

What is next?

British Airways must pay the fine to the ICO or exercise its right to appeal to the First-tier Tribunal in the General Regulatory Chamber within 28 days of the Penalty Notice. Interestingly, the Penalty Notice does not refer to the availability of any further discount for prompt payment, with such a discount usually being lost if the fine is appealed. This might normally suggest that BA has agreed to settle with the ICO, although the Penalty Notice is clear that BA does not admit liability for breach of the GDPR.

There is also the potential that British Airways could face a fine or reprimand under the Payment Card Industry Data Security Standard (PCI-DSS) in relation to its collection and processing of payment card data. PCI-DSS compliance is required of all organisations which accept, process, store and/or transmit debit and credit card data. However, fines under PCI-DSS are not made public, so it is unlikely to become public knowledge if a PCI-DSS fine is levied against British Airways.

In conclusion, this is perhaps not the first ‘mega fine’ or tough GDPR enforcement from the ICO that commentators were expecting, but it is still a step in that direction and with some interesting guidance regarding the way in which the ICO may approach the calculation of fines (and enforcement more generally) in the future.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Andrew Moir
+44 20 7466 2773
Chloe Kite
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2540
Elena Hogg
Associate, London
+44 20 7466 2590


In a move that marks a major U-turn for the Government, the UK’s proposals for a centralised contact tracing app have been abandoned in favour of a decentralised model. The new model is based on technology developed by Apple and Google and replaces the original app designed by NHSX, which recently has faced criticism due to privacy concerns as well as technical issues and delays.

The UK follows Germany and Italy, who have already made the switch from centralised contact tracing apps to decentralised models. The UK’s health secretary, Matt Hancock, confirmed the news at the UK Government press conference last night.

To centralise or decentralise?

The UK Government had previously asserted the superiority of a centralised contact tracing model, but what exactly is the difference?

A ‘decentralised’ data model requires individual users to provide an anonymous ID to a centralised server. The user’s phone then downloads information from the centralised database and carries out contact matching and risk analysis on the phone itself before sending alerts to other users if necessary. Information on whether a user has come into contact with an infected person will be shared with that user, but not with the central server.

In contrast, a ‘centralised’ data model would require users to provide not only their own anonymous ID to a centralised database, but also to send any codes collected from other phones. The computer server then carries out contact matching and risk analysis using that information, making the decision as to whether someone is ‘at risk’ and sending alerts accordingly.
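The practical difference between the two models comes down to where the contact matching happens. A minimal sketch of the decentralised case, with all token names hypothetical:

```python
# Illustrative sketch of decentralised contact matching (names hypothetical).

def on_device_match(local_catalogue: set, infected_tokens: list) -> bool:
    """The phone downloads the published tokens of infected users and runs
    the match locally; nothing is reported back, so the central server
    never learns who was in contact with whom."""
    return any(token in local_catalogue for token in infected_tokens)

# Tokens this phone has heard over Bluetooth...
catalogue = {"tok-a1", "tok-b2", "tok-c3"}
# ...and the periodically downloaded tokens of users who tested positive.
published = ["tok-x9", "tok-b2"]
at_risk = on_device_match(catalogue, published)  # alert shown locally only
```

In the centralised model, by contrast, `catalogue` would itself be uploaded and this matching function would run on the government server.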

The UK’s previous preference for the centralised model was based on the belief that storing data in a centralised manner would promote a more considered approach to contact tracing based on risk factors, and would enable epidemiologists to use valuable data on the spread of the virus for further research. However, the centralised model was criticised for potentially encroaching on privacy by using more data than necessary, and using the data for purposes other than contact tracing.

What next?

NHSX, the health service’s innovation arm, has confirmed that its current leaders will step back from the project, and that Simon Thompson, current chief product manager at Ocado, will take over management of the new app.

While this move will be welcome to privacy campaigners and critics of the centralised model, concerns over the limitations of Bluetooth-enabled technology, as well as the uneasiness over allowing Apple and Google to control the UK’s response to the pandemic, may cause further obstructions to the eventual rollout of a UK-wide contact tracing app. The additional delays resulting from this change in approach may also result in a lower than ideal take-up rate, with much of the population of the view that the time for contact tracing has passed given the current downwards curve of the pandemic.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677
Katie Collins
Trainee Solicitor, London
+44 20 7466 2117


On 17 April 2020, the ICO published an opinion by the Information Commissioner (the “Commissioner”) on Apple and Google’s joint initiative to develop COVID-19 contact tracing technology (the “Opinion”, available here).


  • The Commissioner found the CTF to be aligned with principles of data protection by design and by default.
  • Controllers designing contact tracing apps that use the CTF should ensure alignment with data protection law and regulation, especially if they process personal data (which the CTF does not require).
  • The Commissioner raised concerns regarding individuals assuming that the CTF’s compliance with data protection principles will extend to all aspects of the contact tracing app – which is not necessarily the case.
  • Therefore, it should be made clear to any app users who is responsible for data processing, especially if the app processes data outside of the CTF’s limited scope.
  • Data controllers designing CTF-enabled contact tracing apps must be transparent with potential and actual app users on the type of information they will be processing.
  • Finally, when it comes to a user’s ability to disable Bluetooth, the Commissioner observed that with regard to contact tracing apps in general: “a user should not have to take action to prevent tracking”.

As set out in our previous blogpost (available here), contact tracing is one of the measures being contemplated or implemented by European governments (including in the UK and Germany) in order to be able to put an end to lockdowns while containing the spread of the virus.

The scope of the Opinion was limited to the design of the contact tracing framework which enables the development of COVID-19 contact tracing apps by public health authorities through the use of Bluetooth technology (the “CTF”).

It is also worth noting that this Opinion has been published in the midst of a heated debate on contact tracing technology and fears that it may be used for mass surveillance – in an open letter published on 20 April 2020, around 300 international academics cautioned against creating a tool which will enable large scale data collection on populations.

How does the CTF work?

The CTF is composed of “application programming interfaces” as well as “operating system level technology to assist contact tracing”. The collaboration between Apple and Google will make apps developed by public health authorities using the CTF interoperable across Android and iOS devices.

When two devices with contact tracing apps come into proximity, each device will exchange cryptographic tokens (which change frequently) via Bluetooth technology. Each token received will be stored in a ‘catalogue’ on the user’s device, effectively creating a record of all other devices a user has come into contact with. Once a user is diagnosed with COVID-19, and after they have given their consent, the app will upload the stored ‘catalogue’ of tokens to a server. Other users’ devices will periodically download a list of broadcast tokens of users who have tested positive for COVID-19. If a match is found between the broadcast tokens and the ‘catalogue’ of tokens stored on each user’s device, the app will notify the user that he/she has come into contact with a person who has tested positive and will suggest appropriate measures to be taken.

How does the CTF comply with data protection laws?

The Opinion finds that, based on the information released by Google and Apple on 10 April 2020, the CTF is compliant with principles of data protection by design and by default because:

  1. The data collected by the CTF is minimal: The information contained in the tokens exchanged does not include any personal data (such as account information or usernames) or any location data. Furthermore, the ‘matching process’ between tokens of users who have tested positive for COVID-19 and tokens stored on each user’s phone happens on the device and therefore does not involve the app developer or any third party.
  2. The CTF incorporates sufficient security measures: The cryptographic nature of the token which is generated on the device (outside the control of the contact tracing app) means that the information broadcast to other nearby devices cannot be related to an identifiable individual. In addition, the fact that the tokens generated by one device are frequently changed (to avoid ultimate tracing back to individual users) minimises the risk of identifying a user from an interaction between two devices.
  3. The user maintains sufficient control over contact tracing apps which use the CTF: Users will voluntarily download and install the contact tracing app on their phone (although this may change in ‘Phase 2’ of the CTF as discussed below). Users also have the ability to remove and disable the app. In addition, uploading a user’s collected tokens once he/she has tested positive requires a separate consent process.
  4. The CTF’s purpose is limited: Although the CTF is built for the limited purpose of notifying users who came into contact with patients who have tested positive for COVID-19, the Commissioner stresses that any expansion of the use of CTF-enabled apps beyond this limited purpose will require an assessment of compliance with data protection principles.
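The frequently changing cryptographic tokens described in point 2 can be sketched as follows. This is a simplified illustration loosely modelled on the published Apple/Google design; the derivation, the label and the 16-byte token length are assumptions for illustration, not the exact specification:

```python
# Simplified sketch of frequently-rotating proximity tokens.
import hashlib
import hmac
import os

def rolling_token(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast token from a device-local key.
    The key never leaves the device, so an observer cannot link tokens
    from different intervals back to the same user."""
    msg = b"CT-TOKEN" + interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

key = os.urandom(16)        # generated and kept on the device
t1 = rolling_token(key, 1)  # token broadcast during one time interval
t2 = rolling_token(key, 2)  # a later interval yields an unlinkable token
```

Because each interval's token is derived one-way from the on-device key, publishing a diagnosed user's tokens reveals nothing about the key or about tokens from other intervals.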

What clarifications are required?

The Commissioner raises a number of questions on the practical functioning of the CTF, especially in respect of collection and withdrawal of user consent post-diagnosis. It is unclear how the CTF will facilitate the uploading of stored tokens to the app. Although consent will be required from the user, clarity is needed on: (i) management of the consent signal by a CTF-enabled app and (ii) what control will be given to users in this respect. In addition, the Commissioner lacks information on how consent withdrawal will impact the effectiveness of the contact tracing solutions and the notifications sent to other users once an individual has been diagnosed.

Issues for developers

The Commissioner will pay close attention to the implementation of the CTF in contact tracing apps. In particular, the CTF does not prevent app developers from collecting other types of data such as location. Although reasons for collecting other types of user information may be “legitimate and permissible” in order to pursue the public health objective of these apps (for example, to ensure the system is not flooded with false diagnoses or to assess compliance with isolation), the Commissioner warns that data protection considerations will need to be assessed by the controller – this includes the public health organisations which develop (or commission the development of) contact tracing apps.

Another issue raised by the Commissioner is the potential user assumption that the compliance by the CTF with data protection laws will radiate to all other functionalities which may be built into contact tracing apps. In this regard, the Commissioner reminds app developers that, in addition to assessing data protection compliance in relation to other categories of data processed by the app, they will need to clearly specify to users who is responsible for data processing – in order to comply with transparency and accountability principles.

Finally, the Commissioner stressed that data controllers, such as app developers, must assess the data protection implications of both (i) the data being processed through the app and (ii) the processing undertaken by way of the CTF, in order to ensure that both layers of processing are fair and lawful.

What has the ICO said about ‘Phase 2’ of the CTF?

‘Phase 2’ of development of the CTF aims to integrate the CTF into the operating system of each device. The Commissioner notes that users’ control, including their ability to disable contact tracing or to withdraw their consent to it, should be considered when developing the next phase of the CTF.

With regard to users’ ability to disable Bluetooth on their devices, the Commissioner observes in respect of ‘Phase 2’ of the CTF, and contact tracing apps in general, that “a user should not have to take action to prevent tracking”.

How does this Opinion affect the development of Decentralized Privacy-Preserving Proximity Tracing protocol?

The Opinion can be applied to the Decentralized Privacy-Preserving Proximity Tracing (or DP-3T) protocol in so far as it is similar to the CTF. The Commissioner states that the similarities between the two projects give her comfort that “these approaches to contact tracing app solutions are generally aligned with the principles of data protection by design and by default”.


This Opinion is an important step in the development and roll out of contact tracing apps in the UK. As mentioned above, contact tracing is one of the tools necessary for the UK Government to lift the lockdown measures while minimising the impact of a potential second wave of infections. This has an indirect impact on the private sector as it will affect how and when employees will be able to go back to work.

The fact that the principles on which the CTF is based are compliant with data protection laws is crucial to the successful roll out of contact tracing apps. In order for these apps to be effective, they must be voluntarily downloaded by a large number of mobile users. Given the concerns around letting governments accumulate data on the population under the guise of putting an end to the pandemic, trust is a determining factor in this equation. The fact that the Commissioner is approving the foundation for these contact tracing apps will certainly play a role in gaining the public’s trust and its willingness to give up some privacy rights in order to put an end to the current public health crisis.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677
Ghislaine Nobileau
Trainee Solicitor, London
+44 20 7466 7503

COVID-19: ICO publishes details of its regulatory approach during COVID-19 (UK)

The ICO has published details of its regulatory approach during the ongoing COVID-19 emergency; this is an approach which should reassure entities who are adapting to the economic and practical realities of operating in the current climate, as well as balancing their data protection obligations. The UK regulator has continued to be reasonable and pragmatic, as outlined in our previous post in relation to response times to DSARs, and has stated that it is “committed to an empathetic…approach”.

COVID-19: How governments are using personal data to fight COVID-19


The COVID-19 outbreak has resulted in an unprecedented focus on the power of data to assist in resolving national emergencies. From health tracking, to volunteer coordination, to accurately identifying the vulnerable, data is being harnessed in both the public and private sectors to try to help bring COVID-19 under control and mitigate its impact.

COVID-19 People: Data comes to the fore as outbreak continues (UK)

The COVID-19 outbreak is proving an interesting time to be a data protection practitioner. There seems to be a new article each day about the next exciting app which promises to use data to help manage the crisis.

This post focuses on two particular propositions that pose interesting data protection considerations. It also flags the wider issues that developers should bear in mind when trying to respond to this unprecedented crisis.

Contact Tracing

It was reported on 31 March 2020 that the UK government is set to develop some form of contact tracing app in the near future. This follows successful app-based contact tracing in Singapore and South Korea. Led by NHSX, the innovation arm of the NHS, the app will leverage Bluetooth to identify individuals who have been in close proximity to each other, storing a record of that contact and providing a mechanism through which an individual can be notified if they have been in close proximity to someone who has tested positive for COVID-19. Given the anticipated use of Bluetooth, it is possible that NHSX may leverage Singapore’s TraceTogether app, which uses the same technology and the code for which was open-sourced by the Singapore government last week. TraceTogether was widely praised for collecting the bare minimum of data despite the extraordinary circumstances at hand.

The success of any tracing app will depend on a critical mass of users downloading it. Concerns are already being raised about whether private entities might require either employees or customers to use the app, to show they have not been in contact with infected individuals. It will also depend on a comprehensive testing regime to ensure that those who are symptomatic are tested quickly so that the notification can be sent appropriately quickly. Similarly, swift testing may help avoid people being unduly required to quarantine themselves having been in contact with someone with minor symptoms which do not turn out to be COVID-19.

It is interesting to note that initial statements from NHSX suggest that contacts will be stored on users’ phones, with notifications sent via the app after a suitable delay to avoid identification of the infected individual. It is not currently intended that the data would be sent regularly to a central authority, which may give comfort to people concerned about their privacy. Additionally, NHSX has indicated that it intends to appoint an ethics board to oversee this project.

COVID Symptom Tracker

ZOE, a health and data science company, in conjunction with Tim Spector, a genetic epidemiology professor at King’s College London, has created an app called ‘COVID Symptom Tracker’ that allows users to self-report potential symptoms of COVID-19, even if they are feeling well. The aim is to use this data to track the progression of the virus in the UK, and potentially identify high risk areas.

At the time of writing the app has been downloaded over 1.5 million times and is listed in Apple’s top 10 free apps in the App Store. The app requires individuals to provide data including age, sex at birth, height, weight, postcode of residence, pre-existing health conditions, and habits such as smoking. Each day, users then report how they are feeling against a list of known symptoms. It appears from the app’s privacy policy that unanonymised personal data may be shared with the NHS or King’s College London, whilst data shared with other entities is given an anonymous identifier.

The app is based on consent, both to the data processing and to potential transfers of personal data to the US. Data is collected for purposes related to COVID-19, including: (i) better understanding the symptoms; (ii) tracking the spread of the virus; (iii) advancing scientific research into links between patient health and their response to infection with the virus; and (iv) potentially helping the NHS support sick individuals. Whilst at an initial glance this seems like a reasonably narrow set of processing purposes, you could envisage a surprisingly broad range of activities which might fall within these categories, including specifically tracking individuals.

Data protection considerations

When it comes to processing personal data, the post-GDPR mantra is increasingly ‘Just because you can, doesn’t mean you should’. The principles of fairness, transparency, purpose limitation and data minimisation in particular will require serious consideration to ensure that the proposed data usage is justifiable.

Whilst the Secretary of State for Health & Social Care Matt Hancock recently tweeted that “the GDPR does not inhibit use of data for coronavirus response”, this may not necessarily be aligned with the ICO position that the GDPR is still in full force, despite the fact that the ICO may take a pragmatic approach where necessary. There are certainly lawful routes to using personal data to fight COVID-19, but this should be done based on clear reasoning and analysis.

With that in mind, the following key considerations may assist when evaluating whether or not to use personal data in the context of COVID-19:

  • be confident that you have an appropriate lawful basis for processing the personal data. Remember that both vital interests and substantial public interest are very high bars to satisfy. Likewise, legitimate interests always needs to be balanced against any potential impact on individuals’ rights and freedoms;
  • do not use personal data for extraneous purposes. You should aim to keep your processing purposes as narrow as possible for the stated aims, and be conscious that any attempt to use the dataset for non COVID-19 related reasons might be seen as acting in bad faith. Similarly, the collected data should be limited to what is strictly necessary for the processing purposes. Avoid the temptation to collect additional categories of personal data because they ‘may’ be useful in future;
  • the potential volume of data processing, and categories of personal data being anticipated, suggest that in relation to many of the COVID-19 related apps a data privacy impact assessment should be undertaken. These should be completed carefully and not rushed for the sake of getting an app into the live environment;
  • consider who personal data is shared with, and whether sharing a full dataset is strictly necessary. It may be possible to anonymise personal data such that the recipient only receives fully anonymised data, which may help manage data subject concerns about where their personal data might go. Remember, however, that true anonymisation is difficult and that pseudonymisation alone does not take data outside of the scope of the GDPR;
  • given the potentially higher risk processing that is taking place, it is important that data subjects understand how their personal data will be used, and who it may be shared with, particularly where they are giving up unusual freedoms such as in the context of tracking. Data controllers should aim to go above and beyond to ensure their fair processing information is clear and easy to understand, so that individuals have good expectations of how their data will be used;
  • if and when relying on data subject consent for any processing, it is likewise important to ensure that the individuals understand exactly what they are consenting to. Now more than ever it is vital that consent is specific, freely given, informed and explicit when dealing with sensitive health data;
  • personal data collected in the context of COVID-19 is generally required for the specific aim of managing the outbreak of the virus or its effects. This may mean that it is not necessary or appropriate to retain this personal data once the virus has been controlled and life returns to normal, depending on what has been communicated to data subjects; and
  • holding larger volumes of personal data, or special category data, potentially represents a higher security risk and there may be increased cyber attacks on the dataset. Ensure that you have appropriate additional security measures in place where necessary.
Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677


In these unprecedented times, COVID-19 has forced organisations to put measures in place quickly with the aim of ensuring both business continuity and the protection of employees. In many instances, this has involved increased processing of health data, in ways that were not envisaged a short time ago. Organisations across the globe are also asking employees to work from home. Given the timeframes involved and the speed at which government advice and directions have evolved, data protection regulators are recognising the challenges involved (please see the related article here), yet a global pandemic is not a general waiver for privacy compliance.

Here we explore some of the data privacy issues that organisations should be considering as they adapt to the COVID-19 crisis. For more information about general people issues, please see COVID-19: People – key issues for UK employers.

COVID-19 related data processing: key compliance issues

  • Lawful basis for processing for COVID-19 related activities

For all COVID-19 related activities involving the processing of health data, whether as a result of: (a) employees voluntarily informing employers that they have tested positive for, or are suspected to have, COVID-19; (b) employers proactively asking employees about their health; or (c) other preventative measures introduced by employers (e.g. body temperature scanning for access on to premises), a lawful basis for processing is required under both Article 6 and Article 9 of the GDPR.

Article 6: The Article 6 ground which many organisations are likely to seek to rely on will be the “legitimate interests” of the organisation or third parties (e.g. other employees), provided that an assessment is carried out to check that the processing is proportionate to any risks to individuals’ interests. This should be documented in a legitimate interests assessment. It is, however, recognised that organisations are being required to respond rapidly to evolving guidance and it may not always be feasible to carry out such an assessment. Alternatively, an organisation may seek to rely on other lawful bases, such as:

    • the processing is “necessary to perform the employment contract”, if ensuring health and safety is a term of that agreement; or
    • the processing is “necessary to comply with legal obligations”, in relation to health and safety.

Article 9: As health data is considered ‘special category data’ under the GDPR, a lawful basis will also be required under Article 9 of the GDPR. It is likely that much of the processing will be necessary to carry out obligations in relation to employment law, insofar as it is authorised by Union or Member State law (Article 9(2)(b)). Other relevant grounds may also be “public health” and “preventative and occupational medicine”, again in each case insofar as authorised by Union or Member State law (Articles 9(2)(h) and (i)). As you will note, this aspect of the GDPR is devolved to Member States, meaning that local privacy and employment laws will need to be reviewed to assess what specific measures may be permitted locally when processing health data.

In respect of the UK, the UK Data Protection Act 2018 provides for these conditions at Schedule 1, Part 1, but imposes additional safeguards. For example, if relying on the basis that processing is necessary to carry out obligations in relation to employment law, the organisation must have an “appropriate policy document” in place, which should:

    • explain the organisation’s procedures for securing compliance with the principles set out in Article 5 of the GDPR; and
    • explain the organisation’s policies as regards retention and erasure of personal data, giving an indication of how long such personal data is likely to be retained.
  • Disclosing COVID-19 employee-related information

Where an employee has tested positive for COVID-19, an employer may wish to carry out ‘contact tracing’ amongst other employees, or alert other employees. However, unless it has the explicit and freely given consent of the employee who has tested positive, it should not be divulging the name of that employee to anyone else, although employers can still communicate that employees may have been exposed. The Information Commissioner’s Office (ICO) has indicated that employers that inadvertently share too much information in a bid to protect employees’ health will not be penalised, although the more cautious approach would be not to test this and to avoid disclosing the names of affected employees.

  • Proportionality and other considerations

The personal data processed should be limited to what is necessary for the purpose of the response measure the organisation is implementing and for making decisions about the action required. All other relevant GDPR principles and obligations will also need to be kept in mind and complied with – for example, data minimisation, the updating of Article 30 records, and appropriate retention periods.

COVID-19: Remote Working issues

It is not just the increased processing of health data that has raised data privacy issues. Many organisations are now asking their employees to work from home, some for the first time.

  • Security risks

Organisations are still under an obligation pursuant to Article 32 of the GDPR to ensure that the personal data processed are subject to appropriate technical and organisational security measures. This applies in a work from home scenario as much as in the office environment.

    • Use of personal devices: Where employees have been asked to use their personal devices as part of remote working, this typically raises more issues as these will often lack the tools built in to business devices – such as strong antivirus software, customised firewalls, and automatic online backup tools. This increases the risk of malware finding its way onto devices and both personal and work-related information being compromised. Even for company-issued devices, organisations will want to consider how to manage updates where machines are not connecting to the company LAN.
    • Use of third party technologies: As organisations embrace the use of third party technologies to adapt to this new ‘normal’, we have seen the advent of apps to replace processes and functionality that are no longer readily accessible or available to employees in a home environment – for example, videoconferencing apps, team communication apps, scanning apps etc. Questions are already being raised over the security of these apps, and the due diligence that organisations should undertake before permitting, or encouraging, employee use of these technologies. It may be that organisations only permit use of these technologies in limited circumstances. However, once again, given the speed of developments at the macro/governmental level, organisations are having to respond extremely quickly to a new set of security challenges.
    • BAU risks are magnified: During this time, all the more ‘traditional’ risks are likely to be magnified. Employees working at home, possibly having shifted larger than normal amounts of confidential documents from the office, may also be surrounded by others – whether flatmates, family or partners – which can pose a security threat. Devices should be locked when unattended, privacy screens used where possible, and phone calls or online meetings carried out somewhere they cannot be overheard, particularly if business critical or sensitive information is being discussed. It may also be tempting for employees to forward emails and documents containing personal data to a personal email address if they are working from home and having issues with company-provided devices or the remote network. However, strictly speaking, this could often amount to a personal data breach under the GDPR as an unauthorised disclosure of personal data (albeit likely not a notifiable one, depending upon the consequences of the employee doing so). As a result, communications with employees regarding the use of technologies and devices are more vital than ever to ensure that individuals are not inadvertently opening up the organisation to additional risk.
  • Introduction of new technologies

As we look set to be working at home for the foreseeable future, organisations may seek to introduce new technology for a host of reasons, e.g. to facilitate home-working, to monitor employees etc, which would likely involve the processing of personal data. However, as is always the case when introducing new technology that involves the processing of personal data, organisations should consider whether a data protection impact assessment is required. In the context of employee monitoring in particular, this could present issues around impact on the individual where it involves monitoring an employee at home, on a personal device, or possibly even a shared device.

COVID-19: Direct Marketing

Nothing has changed with respect to the direct marketing rules and what organisations may or may not do, but businesses should be careful not to include marketing information in COVID-19-related communications that they are entitled to send to individuals, e.g. service communications. This could amount to a breach of the ePrivacy rules to the extent any of those individuals have opted out of receiving direct marketing. Although the ICO has made it clear that public health messages sent by the government, NHS and healthcare professionals will not be considered to be ‘direct marketing’ for ePrivacy purposes, this should not be interpreted as meaning that all messages relating to the COVID-19 pandemic will fall outside of the ePrivacy rules.

Key points for organisations

We recommend you take the following key steps when considering data privacy risks associated with COVID-19 processing activities and remote working:

  • Ensure that measures implemented are consistent with current public health advice, to help inform what is proportionate.
  • Carry out legitimate interests assessments or data protection impact assessments if required.
  • Review employee use of unauthorised third party applications.
  • Ensure that adequate IT security is in place to take into account remote working on a large scale and for a prolonged period.
  • Update company policies on remote working if needed.
  • Remind employees to be alert to security issues and of best practices and expectations to ensure secure working from home.
  • Consider ad-hoc training for those roles that typically do not work from home.


Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Chloe Kite
Associate, London
+44 20 7466 2540


Given the COVID-19 crisis, it is likely that data protection is no longer at the forefront of every controller’s mind, and that business continuity has taken precedence. Acknowledging this shift and the need for companies to divert business as usual resources to their response to the crisis, the ICO has published two articles on its website, which are aimed at both controllers and data subjects more widely.



  • Governments and public authorities globally are requiring increased access to personal data of citizens in order to attempt to control and monitor the current spread of COVID-19.
  • The pandemic is generally recognised by data protection authorities as giving rise to extraordinary circumstances, although in Europe at least there are still requirements for processing to be necessary and proportionate, and for personal data to be adequately protected.
  • Governmental responses around the world appear to be in some instances creating a tension between public health on the one hand and privacy on the other, highlighting a new and possibly unexpected consequence of current unprecedented times. When the crisis is over, nation state approaches to privacy may need to be reconsidered and re-evaluated.

The Pan-European Perspective: the EDPB statement

On 16 March 2020 the European Data Protection Board (“EDPB”) released a statement on measures taken to contain and mitigate COVID-19. The EDPB stated that data protection rules “do not hinder measures taken in the fight against the coronavirus pandemic”. This includes the General Data Protection Regulation (“GDPR”) which enables personal data processing without obtaining consent where it is in the public interest or the vital interests of any natural person.

The EDPB recognised that “even in these exceptional times” every controller must ensure that personal data is protected, and that the lawful processing of personal data must be guaranteed. All processing must be in the public interest and must be proportionate to the legitimate aim pursued. Further, the general principles surrounding data processing including transparency still apply, except where necessary and proportionate for reasons of national security.

Tension between public health and privacy?

Around the world, governments have attempted to control the pandemic by harnessing new technology and its power to collect and analyse the increasingly large amounts of data, including personal data, generated on a daily basis in our societies. Whilst it is understandable that governments are seeking to use all means at their disposal in order to control the pandemic, there is a natural tension between the use of this data and the protection of personal privacy rights. Globally, a wide range of approaches have been taken and a variety of statements made which either re-affirm a commitment to data privacy or, conversely, appear to back-track on previous approaches.

Some examples of this tension currently being reported on the global stage in response to the COVID-19 crisis include:

  • The approval of emergency measures in Israel which allow the use of technology developed for counterterrorism purposes to track infected persons by monitoring their mobile phones. This monitoring technique could be used to notify people who have come into contact with infected persons or enforce quarantine orders.
  • The use in China of facial recognition and thermal scanning technology, in combination with rules requiring people using public transport to register under their real names. Data is shared with the police and with media outlets who report on patients’ travel history, which could include where they sat on a train or which compartment they boarded on the subway.
  • The GPS tracking app and SMS alert system used in South Korea where public authorities send text messages detailing the age, gender and recent movements of anyone recently diagnosed. This approach caused issues when speculation on the whereabouts of various infected persons broke out online.
  • The extensive powers put in place by the data protection authority in Italy until at least 30 July which allow civil protection personnel to process data including special category data and communications between employees.
  • The rules put in place by the US Department of Health and Human Services in the USA requiring airlines to collect and provide extensive data for passengers on certain flights.

Contact tracing

Contact tracing refers to the way that governments identify and monitor infected persons, which often means collecting location data. In the EU, national laws implementing the ePrivacy Directive only allow use of location data when made anonymous. Emergency legislation can be passed but only where ‘necessary, appropriate and proportionate within a democratic society’, and where adequate safeguards are put in place. In this respect, it will be interesting to monitor the responses of the various Member States’ governments to see if any such legislation is passed.

A number of countries worldwide are using location data to track and monitor anyone infected by the virus and those they have been in contact with. The USA, Iran, Singapore, Taiwan and Israel have all collected data from third parties or government-mandated apps which collect location data directly.

In addition, authorities in Germany, Ireland and Canada have indicated that they would be open to collecting and processing location data. When asked about the use of cell phone data in contact tracing investigations in Canada, the Ontario Premier commented that “everything’s on the table right now”. Ontario’s Information and Privacy Commissioner stated in response that they would not challenge such a decision as long as any measures were proportionate to the outbreak, as public health officials had the power to ‘take extraordinary steps to keep the public safe’.

Limitations on data processing

Some public authorities are clear that powers to process personal data even as a response to the outbreak should not be unlimited. Data protection authorities in Ireland, France and Argentina have all released statements to the effect that public health authorities are entitled to collect and process health data without consent, but have stated that any measures taken must be necessary and proportionate and must not go beyond the management of suspected exposure to the virus. In particular, the Data Protection Commission in Ireland has made it clear that organisations must have regard to principles of transparency, security and accountability, and that only the minimum amount of data necessary should be processed.

Further, the Information Commissioner’s Office (the “ICO”) in the UK released a statement that, while data protection laws would not prevent data from being shared as a result of the pandemic, and while the ICO recognises the “unprecedented challenge” of coronavirus, any excessive or unlawful data processing will still be prohibited. Importantly, however, the ICO appears to have taken a proportionate stance, accepting that companies are dealing with an unprecedented event: whilst it cannot extend timescales which are enshrined in law, the ICO will use its own channels to manage data subjects’ expectations and will take a measured and proportionate approach to any investigations. Finally, the ICO reiterates that, even though companies’ working styles are changing and more and more employees will be working from home, companies must still have regard to their technical and organisational security measures in order to protect personal data.

A gateway to increased sharing of personal data?

Another consequence of the outbreak is that many government authorities have required private companies to share personal data originally collected for commercial purposes. Officials in Singapore have requested location data from airlines, taxi companies and ride-sharing apps such as Grab, while the USA has pressed airlines and hotels to provide extensive customer data even though Airlines for America has maintained that the requirements are “beyond the capabilities of airlines”.

What next?

It certainly seems that COVID-19 is having, and will continue to have, an interesting relationship with global data protection legislation and the right to privacy enshrined (often recently) in many laws around the world. Whilst data subjects do not yet appear to be challenging the governmental responses, many regulators, particularly in Europe, are reiterating the need to keep data protection in mind when considering responses to this pandemic. When the dust of the current crisis has settled, the impact on data protection and privacy may prove an interesting and unintended consequence of today’s unprecedented events.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Lauren Hudson
Associate, London
+44 20 7466 2483
Katie Collins
Trainee Solicitor, London
+44 20 7466 2117