How the ICO proposes to regulate and enforce data protection legislation

The Information Commissioner’s Office in the UK (the “ICO”) last week published, for consultation, draft statutory guidance setting out how it will regulate and enforce data protection legislation in the UK. The draft guidance covers all of the ICO’s key powers, including information notices, assessment notices, enforcement notices and penalty notices.

High Court says bank need not comply with numerous and repetitive DSARs which were being used for a collateral purpose

The High Court has dismissed a Part 8 claim against a bank for allegedly failing to provide an adequate response to the claimant’s Data Subject Access Requests (DSARs). This is a noteworthy decision for financial institutions, particularly those with a strong retail customer base, as it highlights the robust approach that the court is willing to take where it suspects the tactical deployment of DSARs against the institution: Lees v Lloyds Bank plc [2020] EWHC 2249 (Ch).


Webinar: A cyber and data security perspective on Operational Resilience in Financial Services

Financial services regulators expect firms to prevent, respond to, recover from and learn from operational disruption. As Christine Lagarde, President of the European Central Bank, has warned, a combined cyber attack on important banks could trigger financial instability.

In this webinar our experts in financial services, cyber and data security, data privacy, outsourcing and digital disruption, together with Deloitte’s Customer Breach Support team, share their experience of operational disruption.


Hong Kong data protection reform: what you need to know

On 20 January 2020, the Constitutional and Mainland Affairs Bureau (CMAB), together with the Privacy Commissioner for Personal Data (Privacy Commissioner), published a consultation paper raising important data protection issues and proposing possible amendments to the Personal Data (Privacy) Ordinance (Cap. 486) (PDPO), following a review of Hong Kong’s existing data protection regime. The proposals include introducing a mandatory data breach notification mechanism, requiring data users to specify a retention period for personal data collected, strengthening the sanctioning powers of the Privacy Commissioner and making data processors more accountable.


HKMA takes first step towards regulating the use of big data analytics and artificial intelligence in FinTech

Authors: Hannah Cassidy, Jeremy Birch, Sheena Loi and Peggy Chow

The Hong Kong Monetary Authority (HKMA) has issued a circular to encourage authorised institutions to adopt the “Ethical Accountability Framework” (EAF) for the collection and use of personal data issued by the Office of the Privacy Commissioner for Personal Data (PCPD). A report on the EAF was published by the PCPD in October 2018 (Report), which explored ethical and fair processing of data through (i) fostering a culture of ethical data governance and (ii) addressing the personal data privacy risks brought by emerging information and communication technologies such as big data analytics, artificial intelligence and machine learning.

The EAF is expressly stated to be non-binding guidance, intended as a first step towards a privacy regime better equipped to address modern challenges. However, the HKMA’s circular arguably elevates the legal status of the EAF for authorised institutions. The HKMA is likely to incorporate the EAF into its broader supervision and inspection of authorised institutions, and the EAF will undoubtedly influence how the principles-based elements of the Supervisory Policy Manual are construed as they apply to FinTech.

Tension between the value of data-processing technology and public trust

Big data has no inherent value in its raw form. Its value lies in the ability to convert that data into useful information for organisations, which can then generate knowledge or insight relating to clients or the market as a whole through data analytics or artificial intelligence. Ultimately, this insight results in competitive advantage. However, a tension exists between (i) developing data-processing technology to gain a competitive advantage; and (ii) addressing public distrust arising from the data-intensive nature of such technology.

As the Report highlights, the existing regulatory regime in Hong Kong does not adequately address the privacy and data protection risks that arise from advanced data processing. Big data analytics and artificial intelligence in particular pose challenges to the existing notification- and consent-based privacy legal framework. These challenges are not limited to Hong Kong: privacy and data protection legislation internationally is equally ill-equipped to anticipate advances in data-intensive technology.


Data stewardship accountability

The PCPD sees a need to provide guidance on how institutions can act ethically in relation to advanced data processing in order to foster public trust. It reminds institutions to be effective data stewards, not merely data custodians. Data stewards take into account the interests of all parties and consider whether the outcomes of their advanced data processing are not just legal, but also fair and just.

The PCPD also encourages data stewardship accountability, which calls for institutions to define and translate stewardship values into organisational policies, using an “ethics by design” approach. This approach requires institutions to have data protection in mind at every step and to apply the principles of privacy by default and privacy by design. Privacy by default means that once a product or service has been released to the public, the strictest privacy settings should apply by default. Privacy by design, on the other hand, requires organisations to ensure privacy is built into a system during the entire life cycle of the system. Ultimately, data stewardship should be driven by policies, culture and conduct on an organisational level, instead of technological controls.
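To make the distinction concrete, the following is a minimal, hypothetical sketch (the class and field names are invented for illustration and are not drawn from the EAF or the PDPO): privacy by default means the most restrictive options are the starting point for every user, while privacy by design means those settings are part of the system’s structure from the outset rather than a control added after release.

```python
# Illustrative sketch only: names are hypothetical, not taken from any regulation.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Privacy by default: a user who never opens the settings page shares nothing extra,
    # because the strictest options are the defaults.
    share_profile_publicly: bool = False
    allow_marketing_emails: bool = False
    allow_analytics_tracking: bool = False

@dataclass
class UserAccount:
    user_id: str
    # Privacy by design: restrictive settings are part of the account model itself
    # for the whole life cycle of the system, not bolted on later.
    privacy: PrivacySettings = field(default_factory=PrivacySettings)

account = UserAccount(user_id="example-user")
print(account.privacy)  # all sharing options start switched off
```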

Both the privacy by design and the privacy by default principles are mandatory requirements under the EU General Data Protection Regulation (GDPR). The trend is for Asia-based privacy regulators to encourage data users to meet the higher standards of the GDPR, whether by enacting new laws (e.g. India) or by issuing non-mandatory best practice guidance.


Data stewardship values

The PCPD encourages institutions to adopt the three “Hong Kong Values”, whilst providing the option to modify each value to better reflect their respective cultures. The three Hong Kong Values listed below are in line with the various Data Protection Principles of the Personal Data (Privacy) Ordinance (Cap. 486):

(i)   The “Respectful” value requires institutions to:

  • be accountable for conducting advanced data processing activities;
  • take into consideration all parties that have interests in the data;
  • consider the expectations of individuals that are impacted by the data use;
  • make decisions in a reasonable and transparent manner; and
  • allow individuals to make inquiries, obtain explanations and appeal decisions in relation to the advanced data processing activities.

(ii)   The “Beneficial” value specifies that:

  • where advanced data-processing activities have a potential impact on individuals, organisations should define the benefits, identify and assess the level of potential risks;
  • where the activities do not have a potential impact on individuals, organisations should identify the risks and assess the materiality of such risks; and
  • once the organisation has identified all potential risks, it should implement appropriate ways to mitigate such risks.

(iii)   The “Fair” value specifies that organisations should:

  • avoid actions that are inappropriate, offensive or might constitute unfair treatment or illegal discrimination;
  • regularly review and evaluate the algorithms and models used in decision-making for any bias or illegal discrimination (a simple illustrative check follows this list);
  • minimise any data-intensive activities; and
  • ensure that the advanced data-processing activities are consistent with the ethical values of the organisation.
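By way of illustration only, one simple check an organisation might run as part of such a periodic review is to compare outcome rates across groups of affected individuals. The column names, toy data and 20% threshold below are assumptions made for the sketch, not anything prescribed by the EAF or the PDPO.

```python
# Hypothetical bias check: compare approval rates across groups in a decision
# model's output. Column names and the 20% threshold are illustrative assumptions.
import pandas as pd

def approval_rate_gap(decisions: pd.DataFrame,
                      group_col: str = "group",
                      outcome_col: str = "approved") -> float:
    """Gap between the highest and lowest approval rates across groups."""
    rates = decisions.groupby(group_col)[outcome_col].mean()
    return float(rates.max() - rates.min())

# Toy data: group A is approved far more often than group B.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B"],
    "approved": [1,   1,   1,   1,   0,   0],
})

gap = approval_rate_gap(decisions)
if gap > 0.20:  # review threshold chosen purely for illustration
    print(f"Approval-rate gap of {gap:.0%} exceeds threshold; escalate for human review")
else:
    print(f"Approval-rate gap of {gap:.0%} is within tolerance")
```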

The PCPD also encourages institutions to conduct Ethical Data Impact Assessments (EDIAs), allowing them to consider the rights and interests of all parties impacted by the collection, use and disclosure of data. A process oversight model should be in place to ensure the effectiveness of the EDIA. While this oversight could be performed by internal audit, it could also be accomplished by way of an assessment conducted externally.


International direction of travel

The approach outlined above is not unique to Hong Kong. At the time the EAF was announced by the PCPD in October 2018, the 40th International Conference of Data Protection and Privacy Commissioners released a Declaration on Ethics and Data Protection in Artificial Intelligence (Declaration), which proposes a high-level framework for the regulation of artificial intelligence, privacy and data protection. The Declaration endorsed six guiding principles as “core values” to preserve human rights in the development of artificial intelligence and called for common governance principles on artificial intelligence to be established at an international level.

It is clear that there is a global trend toward ethical and fair processing of data in the application of advanced data analytics. For instance, the Monetary Authority of Singapore announced similar ethical principles for the use of artificial intelligence and data analytics in the financial sector in November 2018. Another example is the GDPR’s specific safeguards for automated processing of personal data that has, or is likely to have, a significant impact on the data subject, to which the data subject has a right to object. Specifically, where processing uses new technologies and, taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, a data protection impact assessment of the envisaged processing operations must be carried out before the processing begins.

Although this may appear to be a relatively minor development in Hong Kong, we see this as a step in a broader movement toward the regulation of AI and a sea change in the approach to data protection and privacy. The HKMA circular and the EAF are in line with the global data protection law developments, which are largely being led by the EU.


Hannah Cassidy
Partner, Hong Kong
+852 2101 4133

Jeremy Birch
Partner, Hong Kong
+852 2101 4195

Sheena Loi
Senior Consultant, Hong Kong
+852 2101 4146

Peggy Chow
Senior Associate TMT/Data Protection, Singapore
+65 6868 8054

Stock Connect Northbound Investor ID Model set to launch on 17 September 2018

Late last month, the Securities and Futures Commission (SFC) announced that agreement had been reached with the China Securities Regulatory Commission (CSRC) to implement the investor identification model for Stock Connect Northbound trading (NB Investor ID Model) on 17 September 2018. This will apply to Northbound trading under both the Shanghai-Hong Kong Stock Connect and the Shenzhen-Hong Kong Stock Connect.

EU General Data Protection Regulation: Practical Steps for Employers

The General Data Protection Regulation ("GDPR") aims to harmonise data protection procedures and enforcement across the European Union. It will apply to all EEA countries and the companies that conduct business in them from 25 May 2018. New standards for consent, enhanced information rights and greater sanctions for data processors and controllers indicate a potentially significant impact for employers; companies should take steps now to prepare for the changes. But what steps should they take in light of the referendum result and the potential UK exit from the European Union?

This briefing from our Employment team focuses on the implications of the GDPR in the employment sphere and the practical steps that employers should take in relation to data protection at recruitment, during employment and on termination of employment.

Financial firms: protecting customer personal data

A recent case provides a rare example of the criminal prosecution of an individual (in this case the former employee of an insurer) for breach of the Data Protection Act 1998 (DPA).

David Barlow Lewis, a former employee of the insurer LV, offered an ex-colleague £3,000 a month to send him the details of customers involved in road accidents. She refused, and Lewis was subsequently prosecuted at Bournemouth Magistrates’ Court for attempting to commit an offence under section 55 of the Data Protection Act 1998, having knowingly or recklessly attempted to obtain personal data without the data controller’s consent.


EU reaches agreement on the General Data Protection Regulation

After almost four years of debate, the European Commission, Parliament and Council have finally reached political agreement on the proposed General Data Protection Regulation (the "GDPR"). The final text of the GDPR will now need to be formally approved by the European Parliament and the Council at the beginning of 2016. There will then be a two-year implementation period before the GDPR comes into effect, meaning that organisations should expect the new rules to apply from some time in 2018.