ASIA DATA PROTECTION UPDATE

New security assessment rules, which are applicable to the transfer of both important data and personal information outside of China, have been issued for public consultation.

The Cyberspace Administration of China (“CAC”) released a draft of the Measures for Security Assessment of Cross-border Transfer of Data (“Draft Measures”) for public consultation on 29 October 2021. The deadline for the public to submit comments on the Draft Measures is 28 November 2021.

Mandatory data breach notification has been introduced in Singapore, with more changes to follow

Some of the key changes to the Personal Data Protection Act 2012 (“PDPA”) took effect on 1 February 2021. These include a mandatory breach notification regime and new consent exceptions, including an exception which may apply if an organisation has legitimate interests in the collection, use or disclosure of the personal data and the legitimate interests of the organisation or other person outweigh any likely adverse effect to the individual.

The Personal Data Protection (Amendment) Bill was passed by the Singapore Parliament on 2 November 2020, with the changes set to take effect in phases. The first phase of these changes took effect from 1 February 2021.

Changes which have already taken effect as of 1 February 2021

1. Mandatory breach notification

One of the key changes which has now taken effect is the introduction of the mandatory data breach notification requirement. If a data breach is notifiable, the Personal Data Protection Commission (“PDPC”) must be notified. If certain reporting thresholds are met, the affected individuals must also be notified. The new provisions require that:

  • once an organisation has grounds to believe that a data breach has occurred, the organisation is to carry out an assessment of the data breach in a reasonable and expeditious manner to determine whether the data breach is a notifiable data breach. Generally, the assessment should be completed within 30 calendar days of when the organisation first became aware that a data breach may have taken place.
  • a data breach is notifiable to the PDPC if the data breach: (a) results in, or is likely to result in, significant harm to an affected individual; or (b) is, or is likely to be, of a significant scale (i.e. affecting 500 or more individuals). The organisation must notify the PDPC of the breach as soon as it is practicable to do so and, in any event, no later than 72 hours after establishing that the data breach is notifiable.
  • the organisation must also notify the affected individuals of the data breach, as soon as it is practicable to do so, once it has determined that the breach is likely to result in significant harm to the individuals to whom the information relates. This will allow the affected individuals the opportunity to take steps to protect themselves from the risks of harm or impact resulting from the data breach (e.g. review suspicious account activities, cancel credit cards, and change passwords). These tests and timelines are summarised in the illustrative sketch below.
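As a purely illustrative summary (and not legal advice), the notification tests and timelines described above can be expressed as a short decision sketch. The Python snippet below is a minimal sketch assuming simplified inputs (whether significant harm is likely, the number of affected individuals and the relevant dates); the function and variable names are our own and are not terms used in the PDPA.

```python
from datetime import datetime, timedelta

SIGNIFICANT_SCALE_THRESHOLD = 500  # "significant scale": 500 or more affected individuals

def assess_breach(likely_significant_harm: bool,
                  affected_individuals: int,
                  aware_of_breach_on: datetime,
                  notifiability_established_on: datetime) -> dict:
    """Illustrative sketch of the PDPA breach notification tests and timelines."""
    # Notifiable to the PDPC if the breach (a) results in, or is likely to result in,
    # significant harm, or (b) is, or is likely to be, of a significant scale.
    notifiable_to_pdpc = (likely_significant_harm
                          or affected_individuals >= SIGNIFICANT_SCALE_THRESHOLD)

    return {
        # The assessment should generally be completed within 30 calendar days of
        # first becoming aware that a breach may have occurred.
        "assessment_due_by": aware_of_breach_on + timedelta(days=30),
        "notify_pdpc": notifiable_to_pdpc,
        # The PDPC must be notified no later than 72 hours after the organisation
        # establishes that the breach is notifiable.
        "pdpc_deadline": (notifiability_established_on + timedelta(hours=72)
                          if notifiable_to_pdpc else None),
        # Affected individuals must be notified (as soon as practicable) only where
        # significant harm is likely, not merely because of the scale of the breach.
        "notify_individuals": likely_significant_harm,
    }

# Hypothetical example: a breach affecting 1,200 individuals with no likelihood of
# significant harm is notifiable to the PDPC on scale grounds alone, so affected
# individuals need not be notified.
print(assess_breach(False, 1200, datetime(2021, 2, 1), datetime(2021, 2, 10)))
```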

2. New deemed consent and consent exceptions

Consent is required for collecting, using or disclosing an individual’s personal data. The individual must also be notified of the purpose(s) for which an organisation is collecting, using or disclosing the individual’s personal data on or before such collection, use or disclosure. Consent may be given expressly or impliedly by individuals. An individual may also be deemed to have given consent under the PDPA in three ways: (a) deemed consent by conduct; (b) deemed consent by contractual necessity; or (c) deemed consent by notification, as the case may be.

In certain circumstances, the amended PDPA also allows an organisation to collect, use and disclose personal data without the individual’s consent. These exceptions may apply when:

  • the organisation or another person has a legitimate interest in the collection, use or disclosure of the personal data (i.e. the legitimate interest exception);
  • the organisation is a party or prospective party to a business asset transaction with another organisation (i.e. the business asset transaction exception);
  • the organisation is using the personal data for the purposes of business improvement (i.e. the business improvement exception); and
  • the organisation is using the personal data for the purposes of research (i.e. the research exception).


Changes which will take effect later

The following changes have not yet taken effect as of 1 February 2021, but are expected to become effective in the near future:

3. Increased financial penalties for contravention of PDPA

The maximum penalty imposed on organisations for breaches of certain key obligations under the PDPA will be increased to S$1 million or 10% of the organisation’s annual turnover in Singapore, whichever is higher. The increased financial penalties are expected to take effect on a future date to be notified, and no earlier than 1 February 2022.
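As a simple illustration, the revised cap is the higher of the two amounts. The sketch below is purely illustrative; the turnover figure used in the example is hypothetical.

```python
def max_pdpa_penalty(annual_sg_turnover_sgd: float) -> float:
    """Higher of S$1 million or 10% of the organisation's annual turnover in Singapore."""
    return max(1_000_000, 0.10 * annual_sg_turnover_sgd)

# Hypothetical example: an organisation with S$50 million annual turnover in Singapore
# would face a maximum penalty of S$5 million rather than S$1 million.
print(max_pdpa_penalty(50_000_000))  # 5000000.0
```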

4. Right to data portability

The recent amendments have also introduced provisions which require an organisation to, at the request of an individual, transmit an individual’s personal data that is in the organisation’s possession or under its control to another organisation in accordance with the prescribed requirements in the PDPA. These provisions, which are found under the new Part VIB[1], have yet to come into effect.

For details on the major changes to the PDPA, please refer to our previous e-bulletin “Singapore data privacy law updates 2020”.

[1] Part VIB does not yet appear in the text of the PDPA because this Part has not come into effect.

Mark Robinson
Partner, Singapore
+65 6868 9808
Peggy Chow
Senior Associate, Singapore
+65 6868 8054
Sandra Tsao
Of Counsel, Singapore
+65 6812 1353

The ICO publishes its Age-Appropriate Design Code of Practice for online services

The Information Commissioner’s Office (ICO) ran a public consultation on its draft code of practice with parents, children, schools, children’s campaign groups, developers, tech and gaming companies and online service providers, which closed on 31 May 2019. The ICO submitted its Age-Appropriate Design Code of Practice on 12 November 2019 but, due to restrictions in the pre-election period, it was not permitted to be published until 23 January 2020.

Facial Recognition Technology and Data Protection Law: the ICO’s view

The Information Commissioner’s Office in the UK (ICO) has announced an investigation into the use of facial recognition technology following a string of high-profile uses. Pending the results of this investigation, companies using facial recognition technology should:

  • undertake a balancing test to ensure proportionality in the use of such technology, acknowledging its intrusiveness;
  • ensure that appropriate documentation, including data protection impact assessments and policy documentation are developed and maintained; and
  • monitor use of the technology to eliminate any potential bias in the algorithms.

The use of Live Facial Recognition Technology (LFR) in public places by the police, other law enforcement agencies and the private sector has increased considerably over the last few years. This increase is causing growing concern amongst regulators, government and ethics committees because of the serious risks LFR poses to privacy: the sensitive nature of the processing involved, the potential volume of people affected and the level of intrusion it has the capacity to create. Moves are now being made to address the use of this technology and put a legal framework in place in a bid to mitigate the risks it poses.

ICO launches facial recognition investigation

The Information Commissioner, Elizabeth Denham, published a blog on 9 July 2019 entitled “Live facial recognition technology – data protection law applies” (available at: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/07/blog-live-facial-recognition-technology-data-protection-law-applies/), announcing that the ICO is conducting investigations into the use of LFR in the King’s Cross area of London.

The ICO investigation follows a spate of LFR trials at various sites across the country, including Meadowfield Shopping Centre in Liverpool, Liverpool’s World Museum, Manchester’s Trafford Centre and King’s Cross, where the technology has been used primarily by police forces, but also in conjunction with site owners, to identify individuals at risk or linked to criminal activity.

The ICO was also recently called to advise the judge on data protection law in the case of R (Bridges) v Chief Constable of South Wales Police (SWP).

The ICO’s principal concern is that organisations utilising facial recognition technology, including the police, should be able to provide demonstrable evidence, when deploying this technology, that its use is ‘necessary, proportionate and effective considering its invasiveness’.

In addition, the Commissioner emphasises that police forces must comply with data protection law, which currently includes the GDPR and the Data Protection Act 2018, paying particular attention to the compilation of watch lists, the selection of images used and the need to remove inherent bias in the technology to prevent false-positive matches for certain ethnic groups.

ICO Guidance

The ICO has issued guidance for police forces considering the deployment of LFR, which consists of four basic instructions:

  1. Conduct a Data Protection Impact Assessment (DPIA) before any deployment of LFR and submit it to the ICO for consideration to ensure timely discussion on mitigation of risks.
  2. Create a separate policy document to cover the use of LFR which establishes in what types of circumstances, in what types of places, at what times and in what way the technology will be used.
  3. Monitor algorithms within the software to ensure that no race or sex bias is created (an illustrative disparity check is sketched after this list).
  4. Read the ICO Guide to Law Enforcement Processing (available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-law-enforcement-processing/) which deals with Part 3 of the DPA and highlights individual rights (including the right to be informed, the right of access, the right to rectification, the right to erasure, the right to restriction and the right not to be subject to automated decision-making) and the importance of accountability and governance.
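By way of illustration of the third instruction, one simple way to monitor for bias is to compare false-positive match rates across demographic groups in test data. The sketch below is a hypothetical example of such a check; the group labels, record format and sample data are assumptions of ours and do not form part of the ICO guidance.

```python
from collections import defaultdict

# Each record: (demographic_group, match_flagged, actually_on_watchlist).
# The sample data below is entirely hypothetical.
results = [
    ("group_a", True, False), ("group_a", False, False), ("group_a", True, True),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, False),
]

def false_positive_rates(records):
    """False-positive rate per group: flagged matches among people not on the watchlist."""
    flagged = defaultdict(int)
    negatives = defaultdict(int)
    for group, match_flagged, on_watchlist in records:
        if not on_watchlist:
            negatives[group] += 1
            if match_flagged:
                flagged[group] += 1
    return {group: flagged[group] / negatives[group] for group in negatives}

rates = false_positive_rates(results)
# A large gap between groups would warrant further investigation of the algorithm.
disparity = max(rates.values()) - min(rates.values())
print(rates, f"disparity={disparity:.2f}")
```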

Although directed at police forces, this guidance should also be given due consideration by any business contemplating the use of this type of technology.

Given the inherent risk of breaches of data protection and privacy laws that LFR poses, it has been a critical moment for regulators to begin to scrutinise the technology and provide guidance, and the results of the ICO’s investigation are anticipated with great interest. It is likely that greater vigilance will be called for in the future, especially given the expected rise in the use of this technology and as new uses come into play.

LFR technology has already been developed, for example, that combines facial recognition with people’s mobile phones and may be used to speed up the immigration process. LFR is evidently a potentially very useful tool for enhancing public safety, but the accuracy of images and the elimination of bias in algorithms will undoubtedly be critical to ensuring that the technology can be adopted in the mainstream and in compliance with applicable privacy legislation.

Miriam Everett
Partner, London
+44 20 7466 2378

A year in the life of GDPR: Statistics and stories from the ICO

The introduction of the GDPR on 25 May 2018 caused a widespread re-think about data protection and privacy rights. From individuals being more aware of their rights, to corporate institutions working hard to ensure compliance and avoid the hefty new penalties the regulation can impose, data protection has undoubtedly been at the forefront of people’s minds since May 2018. At the heart of these changes, from the UK’s perspective, is the Information Commissioner’s Office (the “ICO”), which is the supervisory authority responsible for overseeing all data protection concerns and processing based in the UK. A year after the GDPR came into effect, we have taken a look at the impact it has had on the ICO and its activities, looking at key differences between the years before and after the regulation was introduced.

Draft Data Protection Bill published – no major surprises for businesses

Following its Second Reading in the House of Lords, on 22 November 2017 the draft Data Protection Bill (the “Bill”) passed the Committee Stage and will next be considered at the Report Stage on 11 December 2017. The Bill was initially published on 14 September and once finalised it will repeal the current Data Protection Act 1998 (the “DPA”). The Bill implements various national derogations permitted by the GDPR and also extends the GDPR standards to certain areas of data processing outside EU competence. The Bill also provides for the continuation of the Information Commissioner’s role. Continue reading