ECJ Invalidates Privacy Shield in Schrems II: What Now For Transatlantic Data Transfers?

  • The European Court of Justice (“ECJ”) has today invalidated the EU-US Privacy Shield, meaning that companies can no longer rely on this mechanism for transferring personal data from the EU to the US.
  • Companies transferring data to the US relying on the Privacy Shield (including transfers to a number of the big tech IT service providers registered with the scheme) will now need to scramble to put in place a lawful alternative.
  • In contrast, the ECJ has upheld the Standard Contractual Clauses (“SCCs”) as a valid mechanism for transferring personal data to third countries, albeit subject to a fairly significant sting in the tail.
  • Importantly, the ECJ has pointed out that both data exporter companies and regulators must ensure that there are mechanisms to suspend or prohibit transfers to third countries where there is a conflict between the SCCs and the laws of that third country.
  • In practice, this appears to mean that companies need to undertake a level of due diligence prior to any transfer of personal data to a third country where the SCCs are being used. Recipients of that data, in turn, have an obligation to tell the exporter where their local laws (for example, because of surveillance powers in their jurisdiction) mean that they cannot comply fully with the SCCs.
  • Given the ECJ’s comments on the adequacy of the US regime, it remains to be seen how businesses can undertake such due diligence to reach a conclusion that data is sufficiently protected when being sent to the US, even using the SCCs.

Continue reading

POPI commencement date announced – here’s what you need to know (South Africa)

The President has (finally!) announced that the operative provisions of the Protection of Personal Information Act, 2013 will come into force on 1 July 2020.

What is POPI?

POPI stands for the Protection of Personal Information Act, 2013, and is South Africa’s data privacy law.  It is sometimes also referred to as POPIA.  It governs when and how organisations collect, use, store, delete and otherwise handle personal information.

What is personal information under POPI?

Generally speaking, personal information is any information that can be used to identify a natural person or a juristic person (i.e. an organisation).  This includes names, identity numbers, ages and addresses.

Who does POPI apply to?

POPI applies to all local and foreign organisations processing (i.e. collecting, using or otherwise handling) personal information in South Africa.

What does this announcement mean for your organisation?

You will have 12 months from 1 July 2020 to become compliant.  This means that, although there will be no sanctions for non-compliance during this period, you must work towards compliance.  For most organisations this is no easy feat, as it requires an analysis of all personal information within your organisation, where you get it from and what you do with it.  We recommend that organisations which have not yet started working towards compliance do so as soon as possible, or they could face fines, penalties and other adverse consequences in future.  It is also a good time to commence a data privacy awareness programme within your organisation.

What is POPI compliance?

You will need to establish measures to ensure that you only collect, use, store, delete and otherwise handle personal information in permitted ways and that it is appropriately protected from unauthorised access or loss.  The measures each organisation employs will differ, but in practice compliance will mean more policies and procedures, and you will need to inculcate a culture of data protection in your organisation.

Does POPI provide any benefit to businesses?

POPI provides the opportunity to analyse and have more control over the data handled within your organisation and to better understand its purposes.  As data is an increasingly valuable resource, better data management can increase the efficiency and effectiveness of any business.

What does POPI mean for consumers?

Consumers will benefit from POPI’s requirements that their personal information must be protected and may only be collected or handled where there is a lawful justification for doing so.  POPI gives consumers specific rights in respect of organisations handling their personal information and greater control over that information.  Consumers must be informed about what personal information is collected, by whom and why, so that they are able to make informed decisions.

Who regulates POPI?

POPI is regulated by the Information Regulator.

What are the fines and penalties for non-compliance?

The fines and penalties vary depending on the offence, with a maximum of 10 years in prison or a R10 million fine.

Does POPI add anything to my constitutional right to privacy?

Every person has a constitutional right to privacy, which has many aspects (including privacy in the home, private communications and private information about a person).  POPI gives practical effect to that right insofar as it relates to personal information handled by organisations.  It provides a direct mechanism through which that aspect of the right can be enforced.

Is POPI different from the GDPR?

POPI is similar to the EU’s data privacy law, the General Data Protection Regulation (GDPR), but it differs in some respects.  The main difference is that POPI also protects the personal information of juristic persons (such as companies), where appropriate, whereas the GDPR does not.

For more information contact Rohan Isaacs and Tatum Govender, Technology and Privacy team, Herbert Smith Freehills South Africa LLP.

Rohan Isaacs
Consultant, Johannesburg
+27 10 500 2667

Tatum Govender
Associate, Johannesburg
+27 10 500 2665


Could the UK move away from GDPR to help foster innovation?

Publicly, the UK government is pursuing an adequacy decision from Europe regarding data protection and privacy regulations, but recent comments from ‘Number 10’ could be interpreted as suggesting that the UK may be comfortable pursuing a different privacy path from the EU after the end of December 2020. The issue is about more than making changes to laws which were six years in the making: it has the potential to influence the way the UK develops a number of industries, particularly in the world of artificial intelligence. Why? Because prioritising innovation may be a good move to drive British business, but it could challenge the UK’s longstanding approach to protecting the digital privacy of its citizens.

It’s an issue our global head of data and privacy, Miriam Everett, raised with Finextra. You can read what she had to say here.

For more information, or if you have any queries, please contact Miriam Everett.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378

EDPB Adopts Final Guidelines on GDPR Extra-territoriality

Almost exactly a year after publishing its draft version, the EDPB has adopted its final guidelines on Article 3 of the GDPR and the extra-territorial scope of the legislation. The adopted guidelines don’t differ substantially from the consultation draft but include a number of clarifications and new examples. Some of the key takeaways are:

  • Article 3 aims to determine whether a particular processing activity is within the scope of the GDPR and not whether an entity is within the scope of the GDPR (i.e. a non-EU controller can be caught with respect to some data and processing but that does not necessarily mean the entire organisation and all its data is subject to the GDPR);
  • Article 3(2) only covers processing where the controller or processor is intentionally targeting individuals; inadvertent or incidental contact with data subjects within the European Union is not enough to trigger this Article (i.e. confirmation that the capture of non-EU people’s data whilst they happen to be on holiday in the EU is probably not going to trigger Article 3(2)); and
  • A new section of guidance concludes that, where a controller is considered under Article 3(2) to be “targeting” data subjects in the European Union, any processor engaged by the controller in respect of such processing will also be caught by Article 3(2) and therefore subject to the GDPR (i.e. one of the few examples of a processor being caught by Article 3(2)).

Whilst it is helpful to have the final guidance, it is important to note that further clarity is still required in some areas, in particular on the interplay between international data transfers and the scope of Article 3. Continue reading

Facial Recognition Technology and Data Protection Law: the ICO’s view

The Information Commissioner’s Office in the UK (ICO) has announced an investigation into the use of facial recognition technology following a string of high-profile uses. Pending the outcome of this investigation, companies using facial recognition technology should:

  • undertake a balancing test to ensure proportionality in the use of such technology, acknowledging its intrusiveness;
  • ensure that appropriate documentation, including data protection impact assessments and policy documentation are developed and maintained; and
  • monitor use of the technology to eliminate any potential bias in the algorithms.

The use of Live Facial Recognition Technology (LFR) in public places by the police, other law enforcement agencies and the private sector has increased considerably over the last few years. This increase is causing growing concern amongst regulators, government and ethics committees because of the serious risks LFR poses to privacy, given the sensitive nature of the processing involved, the potential volume of people affected and the level of intrusion it is capable of creating. Moves are now being made to address the use of this technology and to put a legal framework in place in a bid to mitigate those risks.

ICO launches facial recognition investigation

The Information Commissioner, Elizabeth Denham, published a blog on 9 July 2019 entitled “Live facial recognition technology – data protection law applies” (available at: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/07/blog-live-facial-recognition-technology-data-protection-law-applies/), announcing that the ICO is conducting investigations into the use of LFR in the King’s Cross area of London.

The ICO investigation follows a spate of LFR trials at various sites across the country, including Meadowfield Shopping Centre in Liverpool, Liverpool’s World Museum, Manchester’s Trafford Centre and King’s Cross, where the technology has been used primarily by police forces, but also in conjunction with site owners, to identify individuals at risk or linked to criminal activity.

The ICO was also recently called to advise the judge on data protection law in the case of R (Bridges) v Chief Constable of South Wales Police (SWP).

The ICO’s principal concern is that organisations utilising facial recognition technology, including the police, should be able to provide demonstrable evidence when deploying this technology that it is ‘necessary, proportionate and effective considering its invasiveness’.

In addition, the Commissioner emphasises that police forces must comply with data protection law, which currently includes the GDPR and the Data Protection Act 2018, paying particular attention to the compilation of watch lists, the selection of images used and the need to remove inherent bias in the technology to prevent false-positive matches for individuals from certain ethnic groups.

ICO Guidance

The ICO has issued guidance for police forces considering the deployment of LFR, which consists of four basic instructions:

  1. Conduct a Data Protection Impact Assessment (DPIA) before any deployment of LFR and submit it to the ICO for consideration, to ensure timely discussion on mitigation of risks.
  2. Create a separate policy document to cover the use of LFR, which sets out in what circumstances, in what types of places, at what times and in what way the technology will be used.
  3. Monitor the algorithms within the software to ensure that no race or sex bias is created (an illustrative sketch of one way to approach this follows this list).
  4. Read the ICO Guide to Law Enforcement Processing (available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-law-enforcement-processing/) which deals with Part 3 of the DPA and highlights individual rights (including the right to be informed, the right of access, the right to rectification, the right to erasure, the right to restriction and the right not to be subject to automated decision-making) and the importance of accountability and governance.
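
By way of illustration only, and not drawn from the ICO guidance, the short sketch below shows one way an organisation might approach instruction 3: comparing false match rates across demographic groups on a labelled evaluation set and flagging groups that perform materially worse. The group labels, field names and tolerance threshold are hypothetical.

```python
# Minimal, illustrative sketch of demographic bias monitoring for a facial
# recognition system. It assumes a labelled evaluation set where each record
# states the demographic group, whether the system declared a match, and
# whether a match was actually correct. All names here are hypothetical.
from collections import defaultdict

def false_match_rate_by_group(results):
    """Compute the false match rate (false positives / genuine non-matches) per group."""
    counts = defaultdict(lambda: {"false_matches": 0, "non_matches": 0})
    for r in results:
        if not r["true_match"]:  # only genuine non-matches can yield false positives
            counts[r["group"]]["non_matches"] += 1
            if r["predicted_match"]:
                counts[r["group"]]["false_matches"] += 1
    return {group: c["false_matches"] / c["non_matches"]
            for group, c in counts.items() if c["non_matches"]}

def flag_disparities(rates, tolerance=0.01):
    """Flag groups whose false match rate exceeds the best-performing group by more than `tolerance`."""
    baseline = min(rates.values())
    return {group: rate for group, rate in rates.items() if rate - baseline > tolerance}

if __name__ == "__main__":
    evaluation = [
        {"group": "group_a", "predicted_match": True,  "true_match": False},
        {"group": "group_a", "predicted_match": False, "true_match": False},
        {"group": "group_b", "predicted_match": False, "true_match": False},
        {"group": "group_b", "predicted_match": False, "true_match": False},
    ]
    rates = false_match_rate_by_group(evaluation)
    print(rates)                    # e.g. {'group_a': 0.5, 'group_b': 0.0}
    print(flag_disparities(rates))  # groups that may warrant further investigation
```

In practice, any such monitoring would need to be repeated regularly and the results documented alongside the DPIA and policy documents referred to above.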

Any business considering the use of this type of technology should also take this guidance into account.

This is a critical moment for regulators to begin to scrutinise LFR and provide guidance, given the inherent risk it poses to data protection and privacy, and the results of the ICO’s investigation are awaited with great interest. Greater vigilance is likely to be called for in future, especially given the expected rise in the use of this technology and as new uses come into play.

Technology has already been developed, for example, that combines facial recognition with people’s mobile phones and may be used to speed up the immigration process. LFR is evidently a potentially very useful tool for enhancing public safety, but the accuracy of images and the elimination of bias in algorithms will undoubtedly be critical if this technology is to be adopted in the mainstream and in compliance with applicable privacy legislation.

Miriam Everett
Partner, London
+44 20 7466 2378

Shareholder activism focuses on privacy issues

  • With privacy issues these days commonly featuring as an item on board legal agendas, recent shareholder activity at Amazon has shown that privacy is also at the forefront of shareholders’ minds.
  • A group of Amazon shareholders sought to prevent the company from selling its facial recognition technology because of privacy concerns.
  • Although the Amazon motions were defeated, they demonstrate that shareholders are willing to try to hold companies to account over privacy concerns.
  • The action also highlights a growing trend for interesting and innovative uses of privacy rights and regulation as a tool.

Continue reading

The internet: to regulate or not to regulate? House of Lords calls for new digital regulator

The House of Lords Select Committee on Communications has published a report recommending a new approach to, and a comprehensive and holistic strategy for, regulating the digital environment. Unsurprisingly, the report concludes that the “digital world has not kept pace with its role in our lives” and, in particular, it calls for the establishment of a new ‘Digital Authority’ to provide oversight, as well as to instruct and co-ordinate existing regulators. While over a dozen regulators have partial responsibility for regulating the digital market, no one regulator has complete oversight. The Committee argues that this has resulted in a digital environment that is fragmented, with gaps and overlaps, and a regulatory infrastructure that is incapable of responding to the challenges that the modern online world presents. Continue reading