- The UK privacy regulator has admitted that its own cookie consent process does not comply with the current GDPR and ePrivacy rules.
- According to the regulator, a new process will be implemented during the week beginning 24th June 2019, which could give organisations a valuable insight into how to navigate the complex interaction between the GDPR and ePrivacy rules in a compliant manner.
- The regulator has also promised detailed guidance on cookies “soon”.
The Privacy and Data Protection Journal has published an article by Duc Tran, Senior Associate from our Digital TMT, Sourcing & Data Team, exploring automated decision making under the General Data Protection Regulation (GDPR).
In recent times, forward-thinking organisations have sought to automate and optimise the effectiveness and efficiency of their operations and decision-making processes using new and disruptive technologies such as AI and machine learning. However, whilst the efficiency gains and other benefits may be considerable, it is important for these organisations to be aware of the legal implications of using such technology.
One of these considerations is the restriction on the use of machines and automated systems to make decisions about individuals.
- The long-running challenge to the so-called EU Standard Contractual Clauses and the EU-US Privacy Shield, both used to lawfully transfer personal data outside of Europe, is now going to be heard by the European Court of Justice (“ECJ”) after an attempt to block the referral was rejected by the Irish Supreme Court.
- The ECJ will now assess and opine on whether these methods of international data transfer satisfy the requirements of the GDPR, with the potential for either or both mechanisms to be struck down like the US Safe Harbor was in 2015.
- If the court finds either method to be invalid, it would have a major impact on the cross border transfer of personal data, leaving companies with significant GDPR compliance issues and extremely limited options to be able to lawfully transfer data across national boundaries.
The GDPR came into effect almost a year ago, on 25 May 2018. As the most significant reform of data protection law in Europe for over 20 years, the legislation raised expectations of a cultural shift in attitude to data privacy. A year on from the fanfare of implementation, this bulletin looks at key aspects of what we have seen and learnt since implementation, and what we can expect for the future.
Although we are still waiting for a ‘GDPR mega fine’, we have seen a €50 million fine levied by the CNIL in France and there have also been some interesting enforcement decisions coming out of Europe in the first 12 months. There have been rumours of a fine matrix being developed by the regulators to help assess the level of fine to be imposed but, for now at least, it remains unclear how fines are calculated and when a ‘mega fine’ may be appropriate.
Interesting enforcement action to note so far includes:
UK: ICO finds HMRC to be in “significant” breach of data protection legislation but does not impose a fine
In May 2019, the ICO found HMRC in the UK to be in “significant” breach of the GDPR by processing special category biometric data (voice recognition data) without a lawful basis. However, instead of imposing a monetary penalty, the ICO issued an enforcement notice requiring HMRC to delete the relevant data by early June 2019. For more information on this enforcement action, see our blog post here.
Belgium: Court of Appeal asks CJEU for GDPR guidance on the ‘one stop shop’
In May 2019, the Belgian Court of Appeal asked the European Court of Justice for help interpreting the application of the GDPR’s ‘one stop shop’ and whether the designation by companies of a lead supervisory authority in Europe precludes any other European supervisory authority from taking enforcement action against that company. The results of the case will either open or close the doors for regulators across Europe to cast aside the one stop shop when looking to enforce GDPR compliance in their home jurisdiction. For more information on this enforcement action, see our blog post here.
Poland: When is it a disproportionate effort to provide a privacy notice?
In April 2019, the Personal Data Protection Office in Poland issued a €220,000 fine to a digital marketing company for breaching its obligations under Article 14 of the GDPR (i.e. to provide a privacy notice to individuals). The decision has some important practical implications for organisations, including that: (i) the collection of publicly-available information from the internet does not relieve you of your obligations under the GDPR; (ii) a significant cost (in this case €8 million) involved with providing privacy notices to individuals is not sufficient to be able to rely on the ‘disproportionate effort’ exemption under Article 14; and (iii) the GDPR is not prescriptive about how individuals must be provided with privacy information but the ‘passive’ posting of a notice on a website is unlikely to be sufficient where the individuals are unaware of the collection of their data. For more information on this enforcement action, see our blog post here.
Germany: German competition regulator takes enforcement action against Facebook for data issues
In a slight move away from privacy regulation, the German competition authority, the Federal Cartel Office, announced the results of its investigation into Facebook in February 2019. The decision highlights the ever increasing tension between competition and privacy regulation. The FCO found that Facebook had a dominant position in the German market for social networks, and abused this with its data collection policy. The FCO did not impose a fine on Facebook, but has instead required that, in future, Facebook only use data from non-Facebook sources where users have given voluntary consent, and that withholding such consent cannot be used to deny users access to Facebook. For more information on this enforcement action, please see our blog post here.
UK: First extra-territorial enforcement action commenced by the ICO
In October 2018, the UK data protection regulator, the ICO, issued its first enforcement notice under the GDPR. The notice was particularly noteworthy because it was issued against a company located in Canada with no presence in the EU. Regardless of the breaches alleged, the notice was the first issued by the ICO in reliance on the extra-territorial provisions of the GDPR under Article 3. For more information on this enforcement action, please see our blog post here.
For many companies, a frustrating aspect of GDPR compliance over the last year has been the uncertainty. One year on from GDPR implementation and many questions remain unanswered. But we have now started to see signs that fundamental questions may eventually be answered, and new regulatory guidance is starting to drip through.
Interesting regulatory guidance published over the last year includes:
A global regulation? EDPB guidelines on GDPR’s extra-territoriality provisions
The expansive nature of the GDPR’s extra-territoriality provisions has resulted in many organisations outside of Europe questioning whether or not they are subject to the GDPR regime. The market has eagerly awaited any guidance in respect of how Article 3 of the GDPR should be interpreted, and so the draft EDPB guidance published late last year was welcomed by the data community and the market as a whole. However, whilst the draft guidance answered certain questions about the application of the GDPR, it also left a number of gaps and so we are still awaiting the final version of the guidance in the hope that some of those gaps will be closed. For more information on this guidance, see our blog post here.
EDPB guidance on when processing is “necessary for the performance of a contract”
In April 2019, the EDPB published guidance on the ability of online service providers to rely on the fact that processing is necessary for the performance of a contract in order to legitimise their processing of personal data. Although aimed specifically at online services, the guidance will nonetheless be useful for all controller organisations looking to rely on this processing condition. The guidance adopts a fairly narrow approach to interpretation with an objective assessment of “necessity” being required as opposed to relying on what is permitted under or required by the terms of a contract. For more information on this guidance, please see our blog post here.
EDPB opinion on the interplay between GDPR and ePrivacy
With companies having completed their GDPR compliance programmes, thoughts are now turning to the next major piece of European regulation in the data privacy sphere, the proposed ePrivacy Regulation, and how ePrivacy interacts with the GDPR, particularly with respect to cookie consent and email marketing. In March 2019, the EDPB published an opinion on the interplay between GDPR and ePrivacy which, whilst interesting, also confirmed that the whole ePrivacy regime is currently being renegotiated at a European level and the new ePrivacy Regulation could further change the position outlined in the opinion. As such, the opinion itself appears to be of minimal use for companies. For more information on this guidance, please see our blog post here.
What’s still to come?
One year on from GDPR implementation and we’ve seen limited enforcement action and even less regulatory guidance, meaning that companies are still having to try to find their way through compliance without direction. Much remains unknown and unanswered but what can we expect (or hope) from the next 12 months?
The Brexit issue rumbles on with little, if any, clarity or certainty. We know that an adequacy decision for the UK is extremely unlikely in the short term, but whether or not an interim transition deal is achievable (including with respect to data protection and data transfers) remains unknown at this stage.
Although the results of the EU-US Privacy Shield annual review in 2018 seem to confirm that the Privacy Shield remains intact for the short term, there remain significant uncertainties around the future of other compliant international data transfer mechanisms. In particular, the validity of the so-called Standard Contractual Clauses (“SCCs”) continues to be challenged through the courts which could result in the SCCs being struck down by the CJEU in the same way that the US Safe Harbor was in 2015.
Continuing on the theme of international transfers, we are also still awaiting the publication of updated versions of the SCCs. The current versions still refer to the 1995 Directive instead of the GDPR but cannot be amended for sense without the risk of invalidating them. There are rumours that the EU Commission has started to consider an update, including potentially updating the controller-to-processor SCCs to include Article 28 obligations. However, we have yet to see anything concrete coming out of Europe.
As mentioned above, the proposed ePrivacy Regulation (which will replace the current ePrivacy Directive) is still being negotiated and was originally intended to be ready in time for GDPR implementation. However, the failure of the European institutions to agree on a number of issues has resulted in multiple delays, and it now looks unlikely that a draft will be agreed before late 2019 or early 2020, meaning that the position on cookie consent and email marketing is likely to remain uncertain for a considerable period of time.
As noted above, we are still awaiting a GDPR ‘mega fine’ but we also haven’t yet seen much in the way of significant volumes of enforcement action in order to be able to gain any meaningful insights into enforcement. There are rumours of significant enforcement actions in the pipeline from the ICO and the Irish Data Protection Commissioner, and we also know that there have been a number of material personal data breaches since implementation of the GDPR, but we will have to wait and see what happens in year two of GDPR.
Individual rights and data disputes
Although the GDPR provided for enhanced data subject rights for individuals, we have also started to see it being used innovatively as a mechanism by individuals to assert other rights, including human rights and the right to privacy. We have seen Prince Harry assert that a news company’s photograph of him at home was in breach of GDPR, and a claim against the Police for their use of facial recognition technology has recently started in Wales. Going forward, we are therefore likely to see GDPR used as a tool in disputes. For more information about this, please see our blog post here.
Data breach compensation
Perhaps the elephant in the room sits with data breach compensation. In April 2019, the Supreme Court granted Morrisons permission to appeal against the Court of Appeal ruling that it was vicariously liable for its employee’s misuse of data, in the first successful UK class action for a data breach. Whilst the date for the Supreme Court’s hearing is still to be confirmed, the appeal is likely to take place during the course of 2020. For more information on the case, please see our blog post here.
New emerging technologies
The age-old issue of technological innovation outpacing the ability of legislation to keep up has reared its head only one year into the GDPR’s lifecycle. Organisations are having to apply the text of the GDPR to scenarios including blockchain technology, connected and autonomous vehicles and AI techniques that simply weren’t envisaged at the time of writing. In this rapidly evolving technological landscape, the need for regularly updated, up-to-the-minute official guidance in respect of these types of scenarios has never been greater but this will be an extremely challenging demand for the regulators to satisfy.
To keep up to date with the latest legal developments as they happen, please subscribe to our data blog here.
The Hong Kong Monetary Authority (HKMA) has issued a circular to encourage authorised institutions to adopt the “Ethical Accountability Framework” (EAF) for the collection and use of personal data issued by the Office of the Privacy Commissioner for Personal Data (PCPD). A report on the EAF was published by the PCPD in October 2018 (Report), which explored ethical and fair processing of data through (i) fostering a culture of ethical data governance and (ii) addressing the personal data privacy risks brought by emerging information and communication technologies such as big data analytics, artificial intelligence and machine learning.
The EAF is expressly stated to be non-binding guidance, intended as a first step towards a privacy regime better equipped to address modern challenges. However, the HKMA’s circular arguably elevates the legal status of the EAF for authorised institutions. The HKMA is likely to incorporate the EAF into its broader supervision and inspection of authorised institutions; in particular, the EAF will undoubtedly influence how the principles-based elements of the Supervisory Policy Manual are construed as they apply to FinTech.
Tension between the value of data-processing technology and public trust
Big data has no inherent value in its raw form. Its value lies in the ability to convert that data into useful information for organisations, which can then generate knowledge or insight relating to clients or the market as a whole through data analytics or artificial intelligence. Ultimately, this insight results in competitive advantage. However, a tension exists between (i) developing data-processing technology to gain a competitive advantage; and (ii) addressing public distrust arising from the data-intensive nature of such technology.
As the Report highlights, the existing regulatory regime in Hong Kong does not adequately address the privacy and data protection risks that arise from advanced data processing. Big data analytics and artificial intelligence in particular pose challenges to the existing notification- and consent-based privacy legal framework. These challenges are not limited to Hong Kong: privacy and data protection legislation internationally is similarly ill-equipped to anticipate advances in data-intensive technology.
Data stewardship accountability
The PCPD sees the need to provide guidance on how institutions could act ethically in relation to advanced data processing in order to foster public trust. It reminds institutions to be effective data stewards, not merely data custodians. Data stewards take into account the interests of all parties and consider whether the outcomes of their advanced data processing are not just lawful, but also fair and just.
The PCPD also encourages data stewardship accountability, which calls for institutions to define and translate stewardship values into organisational policies, using an “ethics by design” approach. This approach requires institutions to have data protection in mind at every step and to apply the principles of privacy by default and privacy by design. Privacy by default means that once a product or service has been released to the public, the strictest privacy settings should apply by default. Privacy by design, on the other hand, requires organisations to ensure privacy is built into a system during the entire life cycle of the system. Ultimately, data stewardship should be driven by policies, culture and conduct on an organisational level, instead of technological controls.
Both the privacy by design and privacy by default principles are mandatory requirements under the EU General Data Protection Regulation (GDPR). The trend is for Asia-based privacy regulators, whether by enacting new laws (e.g. India) or by issuing non-mandatory best practice guidance, to encourage data users to meet the higher standards of the GDPR.
The PCPD encourages institutions to adopt the three “Hong Kong Values”, whilst providing the option to modify each value to better reflect their respective cultures. The three Hong Kong Values listed below are in line with the various Data Protection Principles of the Personal Data (Privacy) Ordinance (Cap. 486):
(i) The “Respectful” value requires institutions to:
- be accountable for conducting advanced data processing activities;
- take into consideration all parties that have interests in the data;
- consider the expectations of individuals that are impacted by the data use;
- make decisions in a reasonable and transparent manner; and
- allow individuals to make inquiries, obtain explanations and appeal decisions in relation to the advanced data processing activities.
(ii) The “Beneficial” value specifies that:
- where advanced data-processing activities have a potential impact on individuals, organisations should define the benefits and identify and assess the level of potential risks;
- where the activities do not have a potential impact on individuals, organisations should identify the risks and assess their materiality; and
- once the organisation has identified all potential risks, it should implement appropriate measures to mitigate them.
(iii) The “Fair” value specifies that organisations should:
- avoid actions that are inappropriate, offensive or might constitute unfair treatment or illegal discrimination;
- regularly review and evaluate algorithms and models used in decision-making for any bias and illegal discrimination;
- minimise any data-intensive activities; and
- ensure that the advanced data-processing activities are consistent with the ethical values of the organisation.
The PCPD also encourages institutions to conduct Ethical Data Impact Assessments (EDIAs), allowing them to consider the rights and interests of all parties impacted by the collection, use and disclosure of data. A process oversight model should be in place to ensure the effectiveness of the EDIA. While this oversight could be performed by internal audit, it could also be accomplished by way of an assessment conducted externally.
International Direction of Travel
The approach outlined above is not unique to Hong Kong. In fact, at the time the EAF was announced by the PCPD in October 2018, the 40th International Conference of Data Protection and Privacy Commissioners released a Declaration on Ethics and Data Protection in Artificial Intelligence (Declaration), which proposes a high-level framework for the regulation of artificial intelligence, privacy and data protection. The Declaration endorsed six guiding principles as “core values” to preserve human rights in the development of artificial intelligence and called for common governance principles on artificial intelligence to be established at an international level.
It is clear that there is a global trend toward ethical and fair processing of data in the application of advanced data analytics. For instance, the Monetary Authority of Singapore has formulated similar ethical principles for the use of artificial intelligence and data analytics in the financial sector, announced in November 2018. Another example is the GDPR’s specific safeguards relating to the automated processing of personal data that has, or is likely to have, a significant impact on the data subject, to which the data subject has a right to object. Specifically, where processing uses new technologies and, taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, a data protection impact assessment of the envisaged processing operations must be carried out before such processing is adopted.
Although this may appear to be a relatively minor development in Hong Kong, we see this as a step in a broader movement toward the regulation of AI and a sea change in the approach to data protection and privacy. The HKMA circular and the EAF are in line with the global data protection law developments, which are largely being led by the EU.
- The Belgian Court of Appeal has asked the European Court of Justice for help interpreting the application of the GDPR’s ‘one stop shop’.
- The case will have important implications for all multi-national companies who have chosen a lead supervisory authority in Europe for GDPR purposes.
- The results of the case will either open or close the doors for regulators across Europe to cast aside the one stop shop when looking to enforce GDPR compliance in their home jurisdiction.
You don’t expect to see Prince Harry and the GDPR in the same sentence, but it was reported this week that the Duke of Sussex has settled High Court claims against the paparazzi agency Splash News (Splash), in a case which was based partly on breaches of the GDPR.
Connected autonomous vehicles (CAVs) are increasingly capable of creating, collecting and processing a wealth of data. However, in order for vehicle manufacturers and CAV stakeholders to access and extract the value in such data, they must do so lawfully. This is especially true in relation to personal data which is governed in the EU (and beyond) by the General Data Protection Regulation (GDPR). This post explores at a high level how CAV stakeholders can ensure compliance with the GDPR, particularly in relation to CAVs which process personal data of vehicle drivers, owners and pedestrians.
The UK privacy regulator, the Information Commissioner’s Office (“ICO”) has recently found Her Majesty’s Revenue and Customs (“HMRC”) liable for a “significant” breach of the GDPR relating to the collection of consents with respect to biometric data. The enforcement action is a timely reminder that a higher standard of (explicit) consent is required with respect to so-called special category data (including biometric data). However, the enforcement action is also interesting because the ICO chose not to fine HMRC but to instead require certain action to be taken (namely the deletion of records), demonstrating that GDPR enforcement is not necessarily all about big monetary penalties.
- The EDPB has published guidance on the ability of online service providers to rely on the fact that processing is necessary for the performance of a contract in order to legitimise their processing of personal data.
- Although aimed specifically at online services, the guidance will nonetheless be useful for all controller organisations looking to rely on this processing condition.
- The guidance adopts a fairly narrow approach to interpretation with an objective assessment of “necessity” being required as opposed to relying on what is permitted under or required by the terms of a contract.
Lawful bases for processing under the GDPR
All processing of personal data must satisfy one of the six lawful bases for processing under Article 6(1) of the GDPR. Article 6(1)(b) applies where the processing “is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract”.
What does the guidance say?
The guidance focuses on the application of Article 6(1)(b) to online service providers and is intended to ensure that the “contractual necessity” basis is only relied upon in the context of online services where such reliance is appropriate.
In short, the guidance provides that whether processing is “necessary” for the purposes of Article 6(1)(b) will depend on whether one of the following conditions is met:
- the processing must be objectively necessary for the performance of a contract with a data subject; or
- the processing must be objectively necessary in order to take pre-contractual steps at the request of a data subject.
It is important to note that “necessity” in this context does not simply mean what is permitted under or required by the terms of a contract. In particular, the guidance indicates that where there are “realistic, less intrusive alternatives” to the processing which would achieve the same purpose, such processing will not be deemed necessary for the purposes of Article 6(1)(b), regardless of the terms of the contract. Further, the guidance makes it clear that Article 6(1)(b) will not apply to processing which is “useful but not objectively necessary for performing the contract”, even where the processing is necessary for the data controller’s other business purposes.
Necessary for the performance of a contract
In order to rely on this limb of Article 6(1)(b), a controller will need to demonstrate the existence of a valid contract between it and the data subject, and be able to show that the processing in question is necessary in order for that particular contract to be performed.
As noted above, “necessary” in this context will require something more than a contractual condition: the processing must be in some way essential, or fundamental, such that objectively, the main purpose of the specific contract cannot be performed if the specific processing of the specific personal data does not occur.
For example, it is objectively necessary for an online service provider to process personal details such as credit card information and billing address in the context of taking payment, or for an online retailer to obtain a data subject’s home address for the purposes of delivery. However, where a data subject opts for “click and collect” delivery, it would not be objectively necessary for an online retailer to obtain the data subject’s home address (save, of course, where the home address happens to be the same as the billing address).
Other processing activities are likely to fall within a grey area. For example, the guidance notes that profiling for the purposes of tailoring or personalisation may be deemed objectively necessary in some circumstances, such as where such personalisation is an essential or expected feature of the service, but this will not always be the case.
Necessary for pre-contractual steps
To rely on this limb, the controller must be able to show that the contract in question could not be entered into without the pre-contractual processing having taken place. The controller must also be able to show that the pre-contractual steps are carried out at the request of a data subject – i.e. this limb will not apply to unsolicited marketing activities or processing carried out in the controller’s discretion.
For example, a data subject may enter their postcode on a particular company’s website to check whether a particular service is available in their area. Processing that postcode would be objectively necessary to take pre-contractual steps at the data subject’s request.
In contrast, processing for the purposes of targeted advertising would not be deemed objectively necessary for pre-contractual steps: it would be difficult to argue that no contract could be entered into in the absence of targeted advertising, or that the advertising was carried out at the data subject’s request. In particular, the guidance notes that this is the case even where such advertising funds the services, because such advertising would be separate from the objective of any contract between the controller and the data subject.
Impact for businesses
The guidance confirms a fairly narrow interpretation and objective assessment of necessity. It is helpful in the examples given but acknowledges that there will be many grey areas, for which the guidance provides no practical solution. In light of the narrow approach to interpretation, controllers may however wish to adopt a cautious approach when navigating such grey areas.