Revised ePrivacy Regulation Draft introduces ability for organisations to rely on “Legitimate Interests” legal basis in relation to cookies

Another revised draft of the ePrivacy Regulation (“ePR”) was recently published, introducing the ability for organisations to rely on the “legitimate interests” legal basis to drop cookies on end users’ devices.

This change has been criticised by some commentators for its ambiguities and for watering down data protection rights, despite the accompanying safeguards. It remains to be seen whether it will be retained in future draft iterations or, indeed, in the agreed version of the ePR, for which there is currently no clear implementation timetable.

Background

First published in January 2017, the ePR covers specific data regulation reforms such as cookies, electronic direct marketing, over-the-top services and machine-to-machine communications. The overall approach, including a more stringent sanctions regime, would bring ePrivacy regulation into much closer alignment with the GDPR and was originally intended to coincide with the GDPR’s implementation in 2018.

Despite revised proposals from numerous Presidencies of the Council of the European Union, Member States have been unable to agree a final version of the ePR. At the moment, this means that it is unlikely to take effect before 2023 as a grace period of up to 2 years will need to elapse following adoption of the final draft.

With regards to Brexit, since the ePR is unlikely to be in effect by the end of the transition period, it will not be incorporated into UK law under the withdrawal legislation (in contrast to the intended implementation of a UK GDPR). Therefore, the existing Privacy and Electronic Communications Regulations 2003 (“PECR”) will continue to apply following the end of the transition period. Once the ePR takes effect, the UK may choose to mirror its drafting or introduce its own drafting which diverges from the ePR. In any event, the ePR (in its current form) will likely still have implications for UK organisations dealing with individuals in the EU due to its intended extra-territorial scope.

The Proposed Amendments to the Draft ePrivacy Regulation

The latest draft, which simplifies the text of the core provisions and further aligns them with the GDPR, was proposed by the Croatian Presidency when it became clear that the majority of the Member States would not support the existing text.

One of the key proposals is the introduction of the “legitimate interests” ground for placing cookies (or similar technologies) on end users’ terminal equipment. This represents a notable change in position from prior drafts and a step away from the consent-based model dictated by the most recent ICO cookies guidance and implemented by most organisations via cookie banners, which prevent users from accessing a webpage until they have set their cookie preferences. Critics have argued that this consent model is flawed: the ubiquity of such banners leads users to ignore them, producing “consent fatigue”. The introduction of the “legitimate interests” legal basis builds on previous ePR drafts’ attempts to address this problem, although the latest drafting is subject to various safeguards, including fairly restrictive commentary as to when the “legitimate interests” basis can be relied on (for example, not where the end user is a child, where the organisation intends to use cookies to collect special categories of data, or where the cookies are used to profile end users).

Commentators have criticised the drafting, which seems to contain some inconsistencies. Firstly, it directly contradicts the EDPB’s statement in May 2018 that the ePrivacy Regulation should not allow processing “on open-ended grounds, such as ‘legitimate interests’, that go beyond what is necessary for the provision of an electronic communications service”. The introductory text to the draft, conversely, states that the proposed safeguards mean that the new legal ground remains “in line with the GDPR”. Furthermore, tech advertisers wishing to rely on the “legitimate interests” ground may do so on condition that the end user is provided with clear information and has “accepted such use”. How an end user would confirm acceptance in practice is, however, unclear, and this seems to cut across the prohibition on using the ground for profiling purposes.

The new proposal clearly intends to address some of the more contentious drafting points and cater to business needs (e.g. advertising). Nonetheless, given the lack of agreement to date and the ambiguities in the drafting, it remains far from certain that this draft will become the enacted version of the ePR.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Duc Tran
Senior Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2954

Tamsin Rankine-Fourdraine
Trainee Solicitor, London
+44 20 7466 7508

COVID-19: How governments are using personal data to fight COVID-19

Background

The COVID-19 outbreak has resulted in an unprecedented focus on the power of data to assist in resolving national emergencies. From health tracking, to volunteer coordination, to accurately identifying the vulnerable, data is being harnessed in both the public and private sectors to try to help bring COVID-19 under control and mitigate its impact.

European Commission Publishes GDPR Roadmap

On 1 April 2020, almost two years after the General Data Protection Regulation (GDPR) became applicable, the European Commission published a roadmap for evaluating its application.

The roadmap specifically asks for feedback on the Commission’s strategy in dealing with the issue of international transfer of personal data to third countries, focussing on existing adequacy decisions, and the cooperation and consistency mechanism between national data protection authorities.

Why has the roadmap been published?

Article 97 of the GDPR requires the Commission to submit a report on the evaluation and review of the Regulation to the European Parliament and the Council by 25 May 2020.

The GDPR specifies that the Commission must examine the application and functioning of the mechanism for (i) transfers of personal data to third countries or international organisations under Chapter V and (ii) ensuring consistency and cooperation under Chapter VII.

Further, under Article 97 the Commission is required to take into account the positions and findings of the European Parliament and the Council, and if necessary to submit appropriate proposals to amend the GDPR in light of any technological developments.

What does the roadmap say?

The Commission has announced its intention to publish a report identifying issues in the application of the GDPR as requested by Article 97. The report will build on input from the Council, the European Parliament and the European Data Protection Board, as well as two earlier reports published by the Commission in 2017 and 2019, which both considered the protection of personal data since the GDPR had entered into force.

Publication of the roadmap launches a consultation process to gather input from citizens and stakeholders. As part of that process, the Commission will be sending detailed questionnaires to data protection authorities and the European Data Protection Board, as well as to the GDPR Multi-Stakeholder Group, a group made up of business and civil society representatives. The Commission will also conduct bilateral dialogues with the authorities of the relevant third countries.

Individuals, businesses or interested parties can register with the Commission here to provide feedback by 29 April 2020. Feedback will be published to the Commission’s website alongside a synopsis report explaining how any input will be taken on board or why some suggestions cannot be actioned.

What happens next?

The Commission has already received feedback from the Council on the application of the GDPR in a report setting out its position on 19 December 2019. In its report the Council raised a number of concerns, including the challenges of determining or applying appropriate safeguards in the absence of an adequacy decision and the strain of the additional work for supervisory authorities resulting from the GDPR cooperation and consistency mechanisms.

The Council’s report also flagged issues raised by individual Member States, such as the importance of considering the application of the GDPR in the field of new technologies and big tech companies, and the development of efficient working arrangements for supervisory authorities in cross-border cases.

The deadline for providing feedback on the roadmap from citizens and stakeholders is 29 April 2020. Following consultation with the relevant parties and receipt of any online feedback, the Commission will publish its report by 25 May 2020. There is no obligation on the Commission to take any steps following publication of the report. However, Article 97(5) states that the Commission shall ‘if necessary’ submit appropriate proposals to amend the GDPR.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Katie Collins
Trainee Solicitor, London
+44 20 7466 2117

COVID-19 People: Data comes to the fore as outbreak continues (UK)

The COVID-19 outbreak is proving an interesting time to be a data protection practitioner. There seems to be a new article each day about the next exciting app which promises to use data to help manage the crisis.

This post focuses on two particular propositions that pose interesting data protection considerations. It also flags the wider issues that developers should bear in mind when trying to respond to this unprecedented crisis.

Contact Tracing

It was reported on 31 March 2020 that the UK government is set to develop some form of contact tracing app in the near future. This follows successful app-based contact tracing in Singapore and South Korea. The project, led by NHSX (the innovation arm of the NHS), will use Bluetooth to identify individuals who have been in close proximity to each other, store a record of that contact, and provide a mechanism through which an individual can be notified if they have been in close proximity to someone who has tested positive for COVID-19. Given the anticipated use of Bluetooth, it is possible that NHSX may leverage Singapore’s TraceTogether app, which uses the same technology and the code for which was open-sourced by the Singapore government last week. TraceTogether has been widely praised for collecting the bare minimum of data despite the extraordinary circumstances at hand.

The success of any tracing app will depend on a critical mass of users downloading it. Concerns are already being raised about whether private entities might require either employees or customers to use the app to show that they have not been in contact with infected individuals. Success will also depend on a comprehensive testing regime, so that those who are symptomatic are tested quickly and notifications can be sent promptly. Similarly, swift testing may help avoid people being unduly required to quarantine themselves after contact with someone whose minor symptoms turn out not to be COVID-19.

It is interesting to note that initial statements from NHSX suggest that contacts will be stored on users’ phones, with notifications sent via the app after a suitable delay to avoid identification of the infected individual. It is not currently intended that the data would be sent regularly to a central authority, which may give comfort to people concerned about their privacy. Additionally, NHSX has indicated that it intends to appoint an ethics board to oversee this project.
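
By way of illustration only, the sketch below shows how a decentralised design of the kind described above could work in principle: each device broadcasts rotating random tokens, stores the tokens it hears locally, and only compares them against a list of tokens voluntarily published by users who test positive. It is a minimal sketch in Python; the names (record_contact, check_exposure, NOTIFY_DELAY_SECONDS) and the simulated Bluetooth exchange are assumptions for illustration and are not based on the actual NHSX or TraceTogether implementations.

```python
import secrets
import sqlite3
import time

# Hypothetical sketch of a decentralised contact log: each device stores the
# ephemeral tokens it has "heard" over Bluetooth locally, and only compares
# them against tokens voluntarily published by infected users.

DB_PATH = "contacts.db"           # local, on-device storage only
NOTIFY_DELAY_SECONDS = 24 * 3600  # delay before alerting, to avoid identifying the contact


def init_db(path: str = DB_PATH) -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS contacts (token TEXT, seen_at REAL)")
    return conn


def new_ephemeral_token() -> str:
    # A rotating random token broadcast over Bluetooth; on its own it reveals
    # nothing about the user's identity.
    return secrets.token_hex(16)


def record_contact(conn: sqlite3.Connection, heard_token: str) -> None:
    # Called whenever another device's token is observed nearby.
    conn.execute("INSERT INTO contacts VALUES (?, ?)", (heard_token, time.time()))
    conn.commit()


def check_exposure(conn: sqlite3.Connection, infected_tokens: set) -> bool:
    # infected_tokens would be downloaded from a public list that users who
    # test positive choose to upload; matching happens entirely on-device.
    now = time.time()
    rows = conn.execute("SELECT token, seen_at FROM contacts").fetchall()
    return any(
        token in infected_tokens and now - seen_at >= NOTIFY_DELAY_SECONDS
        for token, seen_at in rows
    )


if __name__ == "__main__":
    conn = init_db(":memory:")            # in-memory database for the demo
    other_device = new_ephemeral_token()  # token "heard" over Bluetooth (simulated)
    record_contact(conn, other_device)
    # Pretend the other user later tested positive and published their tokens.
    print(check_exposure(conn, {other_device}))  # False until the delay has elapsed
```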

COVID Symptom Tracker

ZOE, a health and data science company, working with Tim Spector, a professor of genetic epidemiology at King’s College London, has created an app called ‘COVID Symptom Tracker’ that allows users to self-report potential symptoms of COVID-19, even if they are feeling well. The aim is to use this data to track the progression of the virus in the UK, and potentially identify high risk areas.

At the time of writing the app has been downloaded over 1.5 million times and is listed in Apple’s top 10 free apps in the App Store. The app requires individuals to provide data including age, sex at birth, height, weight, postcode of residence, pre-existing health conditions, and habits such as smoking. Each day, users then report how they are feeling against a list of known symptoms. It appears from the app’s privacy policy that unanonymised personal data may be shared with the NHS or King’s College London, whilst data shared with other entities is given an anonymous identifier.

The app is based on consent, both to the data processing and to potential transfers of personal data to the US. Data is collected for the following COVID-19 related purposes: (i) better understanding the symptoms; (ii) tracking the spread of the virus; (iii) advancing scientific research into links between patients’ health and their response to infection with the virus; and (iv) potentially helping the NHS support sick individuals. Whilst at first glance this seems like a reasonably narrow set of processing purposes, a surprisingly broad range of activities could fall within these categories, including specifically tracking individuals.

Data protection considerations

When it comes to processing personal data, the post-GDPR mantra is increasingly ‘Just because you can, doesn’t mean you should’. The principles of fairness, transparency, purpose limitation and data minimisation in particular will require serious consideration to ensure that the proposed data usage is justifiable.

Whilst the Secretary of State for Health & Social Care, Matt Hancock, recently tweeted that “the GDPR does not inhibit use of data for coronavirus response”, this is not necessarily aligned with the ICO’s position that the GDPR remains in full force, albeit that the ICO may take a pragmatic approach to enforcement where necessary. There are certainly lawful routes to using personal data to fight COVID-19, but this should be done based on clear reasoning and analysis.

With that in mind, the following key considerations may assist when evaluating whether or not to use personal data in the context of COVID-19:

  • be confident that you have an appropriate lawful basis for processing the personal data. Remember that both vital interests and substantial public interest are very high bars to satisfy. Likewise, legitimate interests always need to be balanced against any potential impact on individuals’ rights and freedoms;
  • do not use personal data for extraneous purposes. You should aim to keep your processing purposes as narrow as possible for the stated aims, and be conscious that any attempt to use the dataset for non COVID-19 related reasons might be seen as acting in bad faith. Similarly, the collected data should be limited to what is strictly necessary for the processing purposes. Avoid the temptation to collect additional categories of personal data because they ‘may’ be useful in future;
  • the anticipated volume of data processing and the categories of personal data involved suggest that, for many of the COVID-19 related apps, a data protection impact assessment should be undertaken. These should be completed carefully and not rushed for the sake of getting an app into the live environment;
  • consider who personal data is shared with, and whether sharing a full dataset is strictly necessary. It may be possible to anonymise personal data such that the recipient only receives fully anonymised data, which may help manage data subject concerns about where their personal data might go. Remember, however, that true anonymisation is difficult and that pseudonymisation alone does not take data outside of the scope of the GDPR (see the illustrative sketch after this list);
  • given the potentially higher risk processing that is taking place, it is important that data subjects understand how their personal data will be used, and who it may be shared with, particularly where unusual intrusions such as tracking are involved. Data controllers should aim to go above and beyond to ensure their fair processing information is clear and easy to understand, so that individuals have a clear expectation of how their data will be used;
  • if and when relying on data subject consent for any processing, it is likewise important to ensure that the individuals understand exactly what they are consenting to. Now more than ever it is vital that consent is specific, freely given, informed and explicit when dealing with sensitive health data;
  • personal data collected in the context of COVID-19 is generally required for the specific aim of managing the outbreak of the virus or its effects. This may mean that it is not necessary or appropriate to retain this personal data once the virus has been controlled and life returns to normal, depending on what has been communicated to data subjects; and
  • holding larger volumes of personal data, or special category data, potentially represents a higher security risk and may attract increased cyber attacks on the dataset. Ensure that you have appropriate additional security measures in place where necessary.
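
To illustrate the point about anonymisation and pseudonymisation flagged in the list above, the sketch below replaces a direct identifier with a keyed hash and coarsens the postcode before a record is shared. All names and fields are hypothetical; the takeaway is that a keyed hash can be re-linked by anyone holding the key or the source data, so the shared output remains personal data under the GDPR unless identifiability is genuinely removed.

```python
import hashlib
import hmac
import secrets

# Hypothetical illustration: replacing a direct identifier (an email address)
# with a keyed hash before sharing a dataset. This is pseudonymisation, not
# anonymisation - whoever holds the key (or the original dataset) can still
# re-link the records, so the output remains personal data under the GDPR.

SECRET_KEY = secrets.token_bytes(32)  # held only by the data controller


def pseudonymise(email: str, key: bytes = SECRET_KEY) -> str:
    # Stable keyed hash so the same individual maps to the same pseudonym.
    return hmac.new(key, email.lower().encode(), hashlib.sha256).hexdigest()


record = {"email": "user@example.com", "postcode": "SW1A 1AA", "symptoms": ["cough"]}

shared_record = {
    "id": pseudonymise(record["email"]),  # pseudonym instead of the email address
    "postcode": record["postcode"][:4],   # coarsened location to reduce identifiability
    "symptoms": record["symptoms"],
}

print(shared_record)
```
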
Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677

COVID-19 People: Data privacy issues

In these unprecedented times, COVID-19 has forced organisations to quickly put in place measures aimed at ensuring both business continuity and the protection of employees. In many instances, this has involved increased processing of health data, in ways that were not envisaged a short time ago. Organisations across the globe are also asking employees to work from home. Given the timeframes and the speed at which government advice and directions have evolved, data protection regulators recognise the challenges organisations face (please see the related article here), yet a global pandemic is not a general waiver for privacy compliance.

Here we explore some of the data privacy issues that organisations should be considering as they adapt to the COVID-19 crisis. For more information about general people issues, please see COVID-19: People – key issues for UK employers.

COVID-19 related data processing: key compliance issues

  • Lawful basis for processing for COVID-19 related activities

For all COVID-19 related activities involving the processing of employees’ health data, whether as a result of: (a) employees voluntarily informing employers that they have tested positive for, or are suspected to have, COVID-19; (b) employers proactively asking employees about their health; or (c) other preventative measures introduced by employers (e.g. body temperature scanning for access to premises), a lawful basis for processing is required under both Article 6 and Article 9 of the GDPR.

Article 6: The Article 6 ground which many organisations are likely to seek to rely on will be the “legitimate interests” of the organisation or third parties (e.g. other employees), provided that a balancing assessment is carried out to check that those interests are not outweighed by the risks to individuals’ interests, rights and freedoms. This should be documented in a legitimate interests assessment. It is, however, recognised that organisations are being required to respond rapidly to evolving guidance and it may not always be feasible to carry out such an assessment. Alternatively, an organisation may seek to rely on other lawful bases, such as:

    • the processing is “necessary to perform the employment contract”, if ensuring health and safety is a term of that agreement; or
    • the processing is “necessary to comply with legal obligations”, in relation to health and safety.

Article 9: As health data is considered ‘special category data’ under the GDPR, a lawful basis will also be required under Article 9 of the GDPR. It is likely that much of the processing will be necessary to carry out obligations in relation to employment law, insofar as it is authorised by Union or Member State law (Article 9(2)(b)). Other relevant grounds may also be “public health” and “preventative and occupational medicine”, again in each case insofar as authorised by Union or Member State law (Articles 9(2)(h) and (i)). As you will note, this aspect of the GDPR is devolved to Member States, meaning that local privacy and employment laws will need to be reviewed to assess what specific measures may be permitted locally when processing health data.

In respect of the UK, the UK Data Protection Act 2018 provides for these conditions at Schedule 1, Part 1, but imposes additional safeguards. For example, if relying on the basis that processing is necessary to carry out obligations in relation to employment law, the organisation must have an “appropriate policy document” in place, which should:

    • explain the organisation’s procedures for securing compliance with the principles set out in Article 5 of the GDPR; and
    • explain the organisation’s policies as regards retention and erasure of personal data, giving an indication of how long such personal data is likely to be retained.
  • Disclosing COVID-19 employee-related information

Where an employee has tested positive for COVID-19, an employer may wish to carry out ‘contact tracing’ amongst other employees, or alert other employees. However, unless it has the explicit and freely given consent of the employee who has tested positive, it should not divulge the name of that employee to anyone else, although employers can still communicate that employees may have been exposed. The Information Commissioner’s Office (ICO) has indicated that employers that inadvertently share too much information in a bid to protect employees’ health will not be penalised, although the more cautious approach would be not to test this and to avoid disclosing the names of affected employees.

  • Proportionality and other considerations

The personal data that is processed should be limited to what is necessary for the purposes of the response measure the organisation is implementing and for making decisions as to the action required. All other relevant GDPR principles and obligations will also need to be kept in mind and complied with – for example, data minimisation, the updating of Article 30 records, and appropriate retention periods.

COVID-19: Remote Working issues

It is not just the increased processing of health data that has raised data privacy issues. Many organisations are now asking their employees to work from home, some for the first time.

  • Security risks

Organisations are still under an obligation pursuant to Article 32 of the GDPR to ensure that the personal data processed are subject to appropriate technical and security measures. This applies in a work from home scenario as much as in the office environment.

    • Use of personal devices: Where employees have been asked to use their personal devices as part of remote working, this typically raises more issues as these will often lack the tools built in to business devices – such as strong antivirus software, customised firewalls, and automatic online backup tools. This increases the risk of malware finding its way onto devices and both personal and work-related information being compromised. Even for company-issued devices, organisations will want to consider how to manage updates where machines are not connecting to the company LAN.
    • Use of third party technologies: As organisations embrace the use of third party technologies to adapt to this new ‘normal’, we have seen the advent of apps to replace processes and functionality that are no longer readily accessible or available to employees in a home environment – for example, videoconferencing apps, team communication apps, scanning apps etc. Questions are already being raised over the security of these apps, and the due diligence that organisations should undertake before permitting or encouraging employee use of these technologies. It may be that organisations only permit use of these technologies in limited circumstances. However, once again, given the speed of developments at the macro/governmental level, organisations are having to respond extremely quickly to a new set of security challenges.
    • BAU risks are magnified: During this time, all the more ‘traditional’ risks are likely to be magnified. Employees working at home, possibly having brought larger than normal volumes of confidential documents from the office, may also be surrounded by others – whether flatmates, family or partners – which can pose a security threat. Devices should be locked when unattended, privacy screens used where possible, and phone calls or online meetings carried out somewhere they cannot be overheard, particularly if business critical or sensitive information is being discussed. It may also be tempting for employees to forward emails and documents containing personal data to a personal email address if working from home and having issues with company-provided devices or the remote network. However, strictly speaking, this could often amount to a personal data breach under the GDPR as an unauthorised disclosure of personal data (albeit likely not a notifiable one, depending upon the consequences of the employee doing so). As a result, communication with employees regarding the use of technologies and devices is more vital than ever to ensure that individuals are not inadvertently opening up the organisation to additional risk.
  • Introduction of new technologies

As we look set to be working at home for the foreseeable future, organisations may seek to introduce new technology for a host of reasons, e.g. to facilitate home-working, to monitor employees etc, which would likely involve the processing of personal data. However, as is always the case when introducing new technology that involves the processing of personal data, organisations should consider whether a data protection impact assessment is required. In the context of employee monitoring in particular, this could present issues around impact on the individual where it involves monitoring an employee at home, on a personal device, or possibly even a shared device.

COVID-19: Direct Marketing

Nothing has changed with respect to the direct marketing rules and what organisations may or may not do, but as a reminder, businesses should be careful not to include marketing information in COVID-19-related communications that they are entitled to send to individuals, e.g. service communications. This could amount to a breach of the ePrivacy rules to the extent any of those individuals have opted out of receiving direct marketing. Although the ICO has made it clear that public health messages sent by the government, NHS and healthcare professionals will not be considered to be ‘direct marketing’ for ePrivacy purposes, this should not be interpreted as meaning that all messages relating to the COVID-19 pandemic will fall outside of the ePrivacy rules.

Key points for organisations

We recommend you take the following key steps when considering data privacy risks associated with COVID-19 processing activities and remote working:

  • Ensure that measures implemented are consistent with current public health advice, to help inform what is proportionate.
  • Carry out legitimate interests assessments or data protection impact assessments where required.
  • Review employee use of unauthorised third party applications.
  • Ensure that adequate IT security is in place to take into account remote working on a large scale and for a prolonged period.
  • Update company policies on remote working if needed.
  • Remind employees to be alert to security issues and of best practices and expectations to ensure secure working from home.
  • Consider ad-hoc training for those roles that typically do not work from home.


Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Chloe Kite
Associate, London
+44 20 7466 2540

It’s too soon to change the GDPR, says EDPB

Summary

  • The EDPB has reviewed implementation of the GDPR so far and has declared the first year and a half a success.
  • The EDPB did note areas for improvement, including the impact of implementation on SMEs and issues with cooperation across different jurisdictions.
  • However, notwithstanding these difficulties, the EDPB considers that it would be premature to revise the text of the GDPR.

The report

On 18 February 2020, the European Data Protection Board (“EDPB”) adopted its contribution to the evaluation of the GDPR under Article 97. The report is the EDPB’s reflection on the GDPR so far, noting areas of success and those where there is room for improvement.

Overall, the EDPB sees the first year and a half of the GDPR as a success, noting that it has ‘strengthened data protection as a fundamental right and harmonized the interpretation of data protection principles’. In particular, the EDPB has emphasised that the GDPR is ‘a technologically neutral framework’ and is designed ‘to foster innovation by being able to adapt to different situations’.

Despite its generally positive outlook, the EDPB did acknowledge that it has not all been plain sailing, and the implementation of the GDPR has been challenging, in particular for SMEs. The EDPB has emphasised its commitment to developing tools to try and make compliance less burdensome for SMEs.

Similarly, the difficulty of implementing the cooperation and consistency mechanisms in the GDPR was also noted. The EDPB now publicly accepts that the ‘patchwork of national procedures and practices has an impact on the cooperation mechanism’ and notes that it is examining potential solutions to ensure the GDPR is applied consistently.

The report also touches on international data transfers and the resourcing challenges faced by supervisory authorities.

The EDPB concludes that it would be too soon to revise the text of the GDPR, and instead invites legislators to focus on adopting the ePrivacy Regulation to complete the data protection framework, a task that has thus far proven to be challenging.

Analysis

The EDPB, unsurprisingly, has taken a positive view of the GDPR so far. It is certainly true that data protection has become a board level issue within most organisations, and data subjects are now more aware than ever of their rights. That being said, it is perhaps optimistic to suggest that the GDPR fosters innovation, given the difficulties that some emerging technologies such as blockchain have found in aligning themselves with the GDPR requirements, and the further issues that the market anticipates as innovation accelerates away from existing regulation.

SMEs will welcome the prospect of additional support for their compliance programmes. We will watch this space to see what solutions are proposed, and whether they actually help in practice.

The focus on cooperation and consistency will be no surprise to anyone who has struggled with the realities of implementing a single data protection policy across Europe. The commitment to finding a solution for consistent GDPR application will be a welcome statement for those companies who have grappled with local divergences, either by accepting certain jurisdictions as outliers from their overall data protection regime, or by having to take a risk-based approach where they have chosen not to follow local derogations. However, it will be interesting to see whether any further harmonisation acts as a race to the bottom to meet the lowest acceptable standard, or alternatively whether it requires more lax jurisdictions to take a more rigorous approach to enforcing the GDPR.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Hannah Brown
Associate, London
+44 20 7466 2677

Could the UK move away from GDPR to help foster innovation?

Publicly the UK government is pursuing an adequacy decision from Europe regarding data protection and privacy regulations, but recent comments from ‘Number 10’ could be interpreted as saying the UK may be comfortable pursuing a different privacy path to the EU from the end of December 2020. The issue is about more than making changes to laws which were six years in the making. It has the potential to influence the way the UK develops a number of industries, particularly in the world of artificial intelligence. Why? Because the opportunity to prioritise innovation may be a good move to drive British business, but it could challenge the UK’s longstanding approach to protecting the digital privacy of its citizens.

It’s an issue our global head of data and privacy, Miriam Everett, raised with Finextra. You can read what she had to say here.

For more information, or if you have any queries, please contact Miriam Everett.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378

UK Maintains Adequacy Status in Japan Post-Brexit

Summary

  • The UK will maintain its adequacy status in Japan even after its withdrawal from the European Union.
  • Japan recognises that the UK has the relevant legislation in place to maintain its adequacy assessment.

The Personal Information Protection Commission (“PPC”) in Japan has announced that, with respect to the transfer of personal data between Japan and the UK, the UK will maintain its adequacy status even after it withdraws from the European Union (“EU”).

Background

The UK withdrew from the EU on 31 January 2020 and has entered into a transition period until 31 December 2020, during which time it will remain subject to EU rules including the General Data Protection Regulation (“GDPR”).

Currently, European Economic Area member states – which include the member states of the EU but not the UK – are included in Japan’s white list of countries recognised as having an adequate level of personal data protection. This recognition enables personal data to be transferred out of Japan and into white-listed countries without the requirement for any further safeguards to be in place.

The PPC’s Announcement

The PPC’s announcement on 28 January 2020 confirms that the UK will continue to maintain its adequacy status in Japan now that it has withdrawn from the EU because it has the relevant legislation in place to maintain its adequacy assessment. The PPC also confirms that this will apply to the UK even after the transition period.

This is a welcome indication that countries outside of the EU recognise the ability of the UK’s data protection laws to uphold international data protection requirements, and that cross-border data transfers with the UK can continue after the transition period.

This announcement follows the recent adoption by the European Commission of its adequacy decision in favour of Japan on 23 January 2020.

As we noted in our 2020 data protection predictions blog, we expect the discussions around the UK’s adequacy decision to be one of the key developments in the year to come for data protection. Despite the GDPR being enacted into UK law, it remains to be seen whether the EU will recognise the UK as providing adequate levels of data protection following the transition period. In this regard, the European Data Protection Supervisor (“EDPS”), Wojciech Wiewiórowski, noted that the UK is “13th in the row” for an adequacy decision. Even though the EDPS does not participate directly in adequacy decisions, his comments may indicate a general reluctance to let the UK skip the queue in terms of an adequacy decision.


Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Angela Chow
Associate, London
+44 20 7466 2853

International Data Privacy Day: Our predictions for 2020

What better day than today, International Data Privacy Day, to explore what 2020 is likely to have in store for data and privacy? Almost two years ago the EU General Data Protection Regulation (GDPR) thrust data and privacy issues firmly into the spotlight, where they remain. With attention having shifted from guidance to enforcement, this article sets out some predictions for further developments in the year to come.

  • Data ethics: The discussion is moving from “what can we do” to “what should we do” with data. Organisations are coming under increased pressure, not just from consumers who are now demanding greater transparency around how their data is collected, used and handled, but also other stakeholders such as government, regulators, industry bodies and shareholders. 2020 is likely to be the year in which we will see an increased focus in the boardroom on how to incorporate ‘ethical practices’ into data strategies, to leverage consumer trust and drive long-term profitability.
  • GDPR fines: In 2020 we expect to see the final enforcement notices for the British Airways and Marriott data breaches issued by the UK’s data protection authority, the Information Commissioner’s Office (ICO). These had originally been expected in early January, but an extension was agreed and the notices finalising the penalties imposed on both organisations are now expected in March 2020. Both cases arose from high-profile data breaches and subsequent ICO investigations.
  • GDPR enforcement activity: Is 2020 also the year in which we see other big data breaches, investigations and fines? 2020 will also likely see a shift in enforcement activity – going beyond data breaches to other areas of non-compliance with the GDPR. For example, the Berlin data protection authority imposed a €14.5 million fine on a real estate company for the over-retention of personal data. Elsewhere in Europe, 2020 should be the year when we see the results of the Irish Data Protection Commissioner’s investigations into some of the biggest tech companies, including WhatsApp and Twitter.
  • Adtech focus: We also expect the GDPR to start becoming real for the adtech sector in 2020. In June 2019, the ICO released its Adtech Update Report, with a clear message to the real-time bidding industry that it had six months to act; the ICO expressed significant concerns about the lawfulness of the processing of special category data and the lack of explicit consent for that processing. That six-month period is now up, and while – to the dismay of privacy advocates – the ICO has announced that the proposals from the industry’s leaders, the Internet Advertising Bureau (IAB) and Google, will result in real improvements to the handling of personal data, it has also stated that “[t]hose who have ignored the window of opportunity to engage and transform must now prepare for the ICO to utilise its wider powers.” So, will 2020 be the year in which we see meaningful enforcement action from the ICO in this area?
  • Adequacy decision for the UK: Yes, a Brexit-related prediction had to feature somewhere on this list. At the time of writing, it looks set that the United Kingdom will leave the European Union on 31 January 2020, with an 11-month transition period in place. The pertinent question now is what will Brexit look like at the end of this transition period, and in particular with respect to how international data transfers will be treated. It may be that 2020 is the year in which the European Commission makes an adequacy decision in favour of the United Kingdom, but concerns remain over the processing of personal data for law enforcement purposes in the UK – and the EU’s data protection supervisor has essentially said that the United Kingdom is at the back of the queue for any such decision. So, will 2020 be the year of a United Kingdom adequacy decision, or will it be the year in which organisations undertake a review of their UK data transfer flow agreements in a scramble to be compliant?
  • Lead supervisory authority no more: From 31 January 2020, the ICO will no longer be a supervisory authority for GDPR purposes and will not participate in the one stop shop mechanism or the consistency and cooperation procedure. The ICO will also lose its power to be the lead supervisory authority for approving binding corporate rules. It is possible that any future deal may change that position, but in the meantime multinational organisations whose activities are caught by the GDPR should ensure that they have an appropriate lead supervisory authority based in an EU Member State.
  • Schrems II and the SCCs: While in the case of Schrems II, the Advocate General (AG) of the Court of Justice of the European Union (CJEU) issued an opinion that upheld the validity of the European Commission standard contractual clauses (SCCs), the AG also raised concerns about the practical use of the SCCs in jurisdictions where national security laws would breach the SCCs, and suggests moving the responsibility for using the SCCs away from the data importer to the individual company exporting data. If the CJEU follows this opinion, which is expected in the first quarter of 2020, it could result in substantial additional burdens before using SCCs. It could also have ramifications for the United Kingdom after Brexit.
  • Fall of the US Privacy Shield: In Schrems II, the AG opinion also expressed concerns over the EU/US Privacy Shield. If the CJEU follows the AG’s opinion then it could influence the case of La Quadrature du Net v Commission – a case concerning the French advocacy group, La Quadrature du Net, which is seeking to invalidate the Privacy Shield on the basis that it fails to uphold fundamental EU rights because of US government mass surveillance practices. Will 2020 be the year we see the Privacy Shield suffer the same fate as its predecessor, the Safe Harbour?
  • Artificial Intelligence regulation: The European Commission’s new president, Ursula von der Leyen, has stated that she will put forward legislation to regulate the use of artificial intelligence, and only this month a draft Commission white paper was leaked which floated a number of options on how to achieve this. The options ranged from imposing mandatory risk-based requirements on developers, to sector-specific requirements, to voluntary labelling. Although regulation would not be a reality for a number of years, 2020 looks likely to be the year in which a firmer picture emerges about the direction the European Commission wishes to take with AI regulation.
  • Data class actions: In November 2019, the Supreme Court heard Morrisons’ appeal against the finding that it was vicariously liable under the Data Protection Act 1998 for a data breach committed by a disgruntled employee, even though Morrisons itself was data protection compliant. While this case involves the law as it stood before the GDPR, given the increase in the rights of data subjects under the GDPR, a Supreme Court decision in favour of the claimants could open the door in 2020 to a wave of class actions from employees, customers, and others whose personal data has been compromised in a data breach.
  • Data-focused commercial disputes: And it is not just collective actions from data subjects that may rise – in 2020 we could also see increased data protection-focused litigation and commercial disputes in the business to business sphere, as the spotlight continues to remain on data. For example, disputes over the allocation of liability where a controller has been fined and is seeking to claim this back from a third party processor. Which leads us on to…
  • Third party risks: Focus in 2020 will also be firmly directed at third party risk management and demands on suppliers and vendors to demonstrate compliance. Gartner research reveals that “compliance programs are focused on third-party risk more than ever before, with more than twice the number of compliance leaders considering it a top risk in 2019 than three years ago.” As the nature of third party relationships continues to evolve, and with the amount of data that third parties host and process for organisations on the rise, processes and procedures also need to evolve to address this risk.
  • Data is a global issue: In the wake of the GDPR and the California Consumer Privacy Act, we are seeing a global trend of other jurisdictions introducing, or seeking to introduce, more robust data protection laws. For example, 2020 will see both the Brazilian General Data Protection Law (which is largely based on the GDPR) and Thailand’s Personal Data Protection Act come into force. Other data protection legislation initiatives are also going through approval stages – for example, the New Zealand Privacy Bill and India’s first major data protection bill.
  • ePrivacy: But will 2020 be the year that finally sees agreement on the new ePrivacy proposals in Europe? The update to the European legislation which regulates cookies and electronic marketing has been plagued by delays and disagreements. Even if 2020 is the year that ePrivacy is finally agreed in Europe, considerations will then move to the UK’s own approach to ePrivacy in a post-Brexit world.

For more information, or if you have any queries, please contact Miriam Everett.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Chloe Kite
Associate, London
+44 20 7466 2540

Protecting your company’s critical resource: Is your company PDPA ready?

Data has been labelled the world’s most valuable resource in the current digital economy. It is the lifeblood of many companies, especially those in the technology, media and telecommunications sector, where data is often used to predict, analyse and respond to consumers’ behaviours, patterns and preferences for services and products. The capability to collect and analyse data at scale is therefore seen as a decisive factor distinguishing the companies that can accurately determine current and future market trends from the rest. But in a regulated society, companies cannot freely process whatever data they choose – a balance must be struck between technological innovation and the protection of individuals’ rights attaching to their personal data.