ICO OPENS CONSULTATION ON DATA SUBJECT ACCESS RIGHTS

The ICO (the UK privacy regulator) has published draft guidance on the right of individuals under the GDPR to access their data. Key takeaways include:

  • An acknowledgement that subject access requests can be burdensome, with a requirement to ‘make extensive efforts’ to locate and retrieve information, and confirmation that a significant burden does not make a request ‘excessive’;
  • A warning against companies asking for proof of identity as a matter of course when there is no reason to doubt the requestor’s identity; and
  • Confirmation that the intention or motive behind a subject access request can be considered when assessing whether it is possible to refuse to comply.

Continue reading

EDPB Adopts Final Guidelines on GDPR Extra-territoriality

Almost exactly a year after publishing its draft version, the EDPB has adopted its final guidelines on Article 3 of the GDPR and the extra-territorial scope of the legislation. The adopted guidelines don’t differ substantially from the consultation draft but include a number of clarifications and new examples. Some of the key takeaways are:

  • Article 3 aims to determine whether a particular processing activity is within the scope of the GDPR and not whether an entity is within the scope of the GDPR (i.e. a non-EU controller can be caught with respect to some data and processing but that does not necessarily mean the entire organisation and all its data is subject to the GDPR);
  • Article 3(2) only covers processing where the controller or processor is intentionally targeting individuals; inadvertent or incidental contact with data subjects within the European Union is not enough to trigger this Article (i.e. confirmation that the capture of non-EU people’s data whilst they happen to be on holiday in the EU is probably not going to trigger Article 3(2)); and
  • A new section of guidance concludes that where a controller is considered under Article 3(2) to be “targeting” data subjects in the European Union, any processor engaged by the controller in respect of such processing will also be caught by Article 3(2) and therefore subject to the GDPR (i.e. one of the few examples of when a processor can be caught by Article 3(2)).

Whilst helpful to have the final guidance, it is important to note that further clarity is still required in some areas, in particular the interplay between international data transfers and the scope of Article 3. Continue reading

Facial Recognition Technology and Data Protection Law: the ICO’s view

The Information Commissioner’s Office in the UK (ICO) has announced an investigation into the use of facial recognition technology following a string of high profile uses. Pending the results of this investigation, companies using facial recognition technology should:

  • undertake a balancing test to ensure proportionality in the use of such technology, acknowledging its intrusiveness;
  • ensure that appropriate documentation, including data protection impact assessments and policy documentation, is developed and maintained; and
  • monitor use of the technology to eliminate any potential bias in the algorithms.

The use of Live Facial Recognition technology (LFR) in public places has increased considerably over the last few years, by the police, other law enforcement agencies and the private sector. This increase is causing growing concern amongst regulators, government and ethics committees because of the serious risks LFR poses to privacy: the sensitive nature of the processing involved, the potential volume of people affected and the level of intrusion into privacy it has the capacity to create. Moves are now being made to address the use of this technology and put a legal framework in place in a bid to mitigate the risks it poses.

ICO launches facial recognition investigation

The Information Commissioner, Elizabeth Denham, published a blog on 9th July 2019 entitled “Live facial recognition technology – data protection law applies” (available at: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/07/blog-live-facial-recognition-technology-data-protection-law-applies/) and announced that the Information Commissioner’s Office (ICO) is conducting investigations into the use of LFR in the King’s Cross area of London.

The ICO investigation follows a spate of LFR trials at various sites across the country, including Meadowfield Shopping Centre in Liverpool, Liverpool’s World Museum, Manchester’s Trafford Centre and King’s Cross, where the technology has been used primarily by police forces, but also in conjunction with site owners, to identify individuals at risk or linked to criminal activity.

The ICO was also recently called to advise the judge on data protection law in the case of R (Bridges) v Chief Constable of South Wales Police (SWP).

The ICO’s principal concern is that organisations utilising facial recognition technology, including the police, must be able to provide demonstrable evidence when deploying this technology that it is ‘necessary, proportionate and effective considering its invasiveness.’

In addition, the Commissioner emphasises that police forces must comply with data protection law, which currently includes the GDPR and the Data Protection Act 2018, paying particular attention to the compilation of watch lists, the selection of images used and the need to remove inherent bias in the technology to prevent false-positive matches for certain ethnic groups.

ICO Guidance

The ICO has issued guidance for police forces considering the deployment of LFR which consists of four basic instructions:

  1. Conduct a Data Protection Impact Assessment (DPIA) before any deployment of LFR and submit it to the ICO for consideration, to ensure timely discussion on mitigation of risks.
  2. Create a separate policy document to cover the use of LFR which establishes for what type of circumstances, in what types of places, at what times and in what way the technology will be used.
  3. Monitor algorithms within the software to ensure that no race or sex bias is created.
  4. Read the ICO Guide to Law Enforcement Processing (available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-law-enforcement-processing/), which deals with Part 3 of the Data Protection Act 2018 and highlights individual rights (including the right to be informed, the right of access, the right to rectification, the right to erasure, the right to restriction and the right not to be subject to automated decision-making) and the importance of accountability and governance.

Although addressed to police forces, this guidance should also be considered by any business contemplating the use of this type of technology.

This is a critical moment for regulators to begin to scrutinise LFR and provide guidance, given the inherent risk it poses of abuse of data protection and privacy laws, and the results of the ICO’s investigation are anticipated with great interest. Greater vigilance is likely to be called for in the future, especially given the expected rise in the use of this technology and as new uses come into play.

LFR technology has already been developed, for example, that combines facial recognition with people’s mobile phones and may be used to speed up the immigration process. It is evident that LFR is potentially an extremely useful tool for the enhancement of public safety, but the accuracy of images and the elimination of bias in algorithms will undoubtedly be critical to ensure that this technology can be adopted in the mainstream and in compliance with applicable privacy legislation.

Miriam Everett
Partner, London
+44 20 7466 2378

The Encryption debate is far from ‘going dark’

Shortly after the release of the communiqué from the most recent ministerial meetings of the ‘Five Countries’ security alliance — Australia, Canada, New Zealand, the UK and the US — at the end of July, we warned that the issue of the use of, and access to, encrypted services and technologies ‘remains front of mind for the alliance and further legislative or regulatory action in the Five Countries may follow’.

This week, it became clear that three of the Five Countries planned to follow through. On 4 October 2019, representatives of the Australian, UK and US governments planned to release:

Continue reading

CALCULATION GUIDELINES ON GDPR FINES IN GERMANY

In our latest report, we informed you about new developments regarding sanctions imposed by Data Protection Authorities (“DPAs”) in Germany and Austria, and about a model for calculating fines imposed under the General Data Protection Regulation (“GDPR”) proposed by the Conference of the German Independent Data Protection Supervisory Authorities of the Federal Government and the States (Datenschutzkonferenz – “DSK”). The DSK is the joint coordination body of the German data protection authorities.

Continue reading

‘MEGA-FINES’ AND COMPENSATION – HOW MIGHT COMPANIES BE AFFECTED? DEVELOPMENTS IN DATA PROTECTION LAW SEPTEMBER 2019

In this update, we provide you with a brief summary of two recent developments in relation to sanctions imposed under the General Data Protection Regulation (“GDPR”).

  • Firstly, the Berlin Data Protection Authority (“Berlin DPA”) recently announced its willingness to impose multimillion-euro fines for breaches of the GDPR. This shows that significant fines can no longer be ruled out in Germany either. It appears that the Berlin DPA is following in the footsteps of the French Data Protection Authority (“CNIL”) and the UK Information Commissioner’s Office (“ICO”), which have both previously imposed fines in the millions.
  • Secondly, for the first time, a court in Austria has awarded compensation for non-material damage arising from a GDPR breach.

We take a look at what this means for companies and the developments that have been made since the implementation of the GDPR.

Continue reading

GDPR used to gain access to fiancée’s personal data: Exposing vulnerabilities in Data Subject Access Requests

  • A recent test DSAR has demonstrated companies’ differing approaches to DSAR compliance
  • Despite the DSAR being made by a third party on behalf of the data subject, it is clear that companies are uncertain about when and how they should ask for ID verification
  • ICO guidance urges data controllers to be satisfied that any third party making a DSAR is entitled to act on behalf of the individual data subject

Background

Article 15 of the GDPR gives data subjects the right to obtain a copy of their personal data held by data controllers who process their personal data. Over the course of the past year, we’ve seen increasingly innovative uses of this right, as demonstrated recently by James Pavur, a researcher at the University of Oxford. Continue reading

Storming the Breaches: DCMS releases Cyber Security Breaches Survey 2019

Cyber-attacks are a continuous threat to both businesses and charities. From the Cyber Security Breaches Survey 2019 (available here as a PDF), we can see that fewer businesses are identifying breaches than in previous years, but the ones that are identifying breaches are typically experiencing more of them. Approximately 32% of businesses and 22% of charities report having suffered cyber security breaches or attacks in the last 12 months. The most common types of cyber security breach reported are: Continue reading

A Clearer Roadmap to Recovery: the roles of NCSC and ICO clarified at CYBERUK

The National Cyber Security Centre (NCSC) and the Information Commissioner’s Office (ICO) have clarified their roles in relation to breaches of cyber security. The NCSC manages cyber incidents at a national level to prevent harm being caused to both victims and the UK overall. It helps manage the response at a governmental level and seeks to ensure that lessons are learned to help deter future attacks. The ICO is the independent regulator for enforcing and monitoring data protection legislation and the competent authority for Digital Service Providers under the Network and Information Systems (NIS) Directive. The ICO is the first port of call for organisations who have suffered a breach of cyber security. Continue reading