China Cybersecurity and Data Protection: Monthly Update – July 2021 Issue

Key highlights – our comments on the cybersecurity probe into DiDi and the draft of the revised Measures on Cybersecurity Review

In early July, the Cyberspace Administration of China (CAC) announced that it had initiated cybersecurity reviews of three companies, namely DiDi, Boss Zhipin and Full Truck Alliance, and that during the review the three companies would not be permitted to register new users in order to “prevent the spreading of risks”. In addition, the CAC ordered application stores to remove DiDi’s application due to “serious violations in collecting and using personal information”. Notably, all three companies were listed in the United States in June 2021.

There are very few details available to the public about the proposed cybersecurity review beyond the fact that it has been initiated. The cybersecurity review is one of the measures contemplated under the Cybersecurity Law (CSL) to ensure the supply chain security of critical information infrastructure (CII) through a review of the procurement of network products and services that may impact national security. One of the reasons why it had not been invoked until recently is that the scope of CII has not been identified. Although the CSL requires the State Council to publish regulations on the protection of CII, the CAC has only released a draft regulation, in July 2017, and the guidance on identifying CII contemplated in that draft has never been published. Without knowing whether particular information facilities are considered CII, it is almost impossible to put the security review and the other relevant CII protection measures into practice. The State Council seems to be aware of this and has included the regulation on CII protection in its legislative agenda for 2019, 2020 and 2021. We hope this regulation will finally be published this year.

In the absence of CII identification guidance, the first question here is how DiDi was identified as an operator of CII. Although it might meet the criteria set out in the general definition of CII under the CSL, we would expect at least an identification procedure to be followed to justify the decision, and it is unclear whether DiDi was aware that it was considered a CII operator before the decision to initiate a cybersecurity review was made.

Another question is which network product or service procured by DiDi has impacted national security. There is no indication in the CAC’s announcement, and it remains to be seen how the CAC will interpret and assess the impact of the procurement on national security.

There are also questions about the enforcement measures. The regulation on cybersecurity review does not empower the CAC to take any enforcement measures alongside the initiation of a review. In terms of penalties, the CSL only permits the authority to order a CII operator to cease using the relevant network products or services, and to impose a fine of up to 10 times the purchase amount on the CII operator and a fine of up to RMB 100,000 on the persons responsible, if the CII operator uses unauthorised products or services. The CSL also allows the authorities to require network operators to take technical or other necessary measures to prevent and contain harm in the event of a cybersecurity incident. In this case, DiDi has been ordered to stop registering new users, and the CAC may be relying on this provision, although the announcement does not mention that a cybersecurity incident has occurred.

As the Data Security Law (DSL) is not yet enforceable, the CAC is not able to invoke any measures provided thereunder in respect of any allegation concerning DiDi’s data (especially important data) processing activities. The national security review regime proposed under the DSL is even further from becoming enforceable. The CAC did not specify the data protection laws and regulations pursuant to which it ordered the removal of the DiDi application from application stores. Considering that the Personal Information Protection Law (PIPL) is yet to be enacted, it is likely that the decision is based on the CSL and the relevant regulations on the processing of personal information by mobile applications.

As discussed, the factual basis for the CAC’s decisions remains unclear. It is worth pointing out that, at this point, there is still only a very limited number of enforceable cybersecurity and data protection laws and regulations that the authority can actually rely on for its enforcement actions. The CSL and the cybersecurity review regulation are the most readily available from an enforcement perspective.

The CAC seems to have realized the inadequacy of the current regulations. On 10 July 2021, it released a draft of the revised Measures on Cybersecurity Review for public consultation. Notably, the revised draft extends the scope of cybersecurity review to cover data processing activities of data processors that may impact national security. The extension is apparently intended to implement the national security review of data processing activities contemplated under the DSL.

The revised draft places a special focus on listings outside China by companies that process core data, important data and large amounts of personal information. Any operator that processes the personal information of over 1 million users must apply to the CAC for a cybersecurity review before listing outside China. The CAC will assess the risk of CII, core data, important data or personal information being “influenced, controlled or maliciously used” by foreign governments if an operator is listed outside China. The revised draft also includes the China Securities Regulatory Commission (CSRC) as one of the ministries responsible for the review.

The revised draft is apparently targeted at companies, such as DiDi, that have launched or are going to launch their initial public offerings outside China and at the same time process large amounts of personal information, important data or even core data within China. One critical question is what actions the CAC and CSRC will take if, after review, a listing is considered to have impacted national security, e.g. whether the company will be ordered to delist. For companies that plan to list outside China, the cybersecurity review will bring great uncertainty to the listing process and could potentially affect their choice of listing venue.

An interesting point is whether a listing in Hong Kong will be subject to the cybersecurity review. The revised draft uses the term “listing outside China”, instead of the traditional expression “overseas listing” used in the context of securities laws, which usually includes Hong Kong listings. It is unclear whether this indicates that Hong Kong listings are excluded from the scope of review, and the CAC should clarify this point in the final version.

Data processing and cybersecurity compliance are now under closer government scrutiny. Although there are still questions surrounding the decisions on DiDi, and the revised Measures on Cybersecurity Review are still in draft form, there is no doubt that companies, especially technology companies, should pay more attention to their compliance with data and cybersecurity laws, in anticipation of the upcoming DSL, PIPL and implementing regulations. Companies that process important data or sizable amounts of personal information, or that operate CII, should pay particular heed to the regulations and actions of the CAC.

Our views

If you would like to know more about the cybersecurity review, please click the link below to read our previous article on the Measures on Cybersecurity Review published in June 2020.

New regulation strengthens cyber supply chain security in China

If you would like to know more about the newly-enacted Data Security Law, please click the link below to read our comments.

What to know about China’s Data Security Law

Regulatory developments

1. Data Security Law promulgated, and will come into effect on 1 September

On 10 June 2021, the Standing Committee of the 13th National People’s Congress passed the Data Security Law after a third reading; it will become enforceable from 1 September 2021. Compared with the second draft, the key changes in the final version are: (i) the commission will establish a coordination mechanism for national data security (the “Mechanism”), which will coordinate the relevant ministries in drafting the catalogue of important data and strengthen the acquisition, analysis, research and early warning of risk information; (ii) a new concept of “core data” of the state is introduced, defined as data relevant to national security, the lifeblood of the national economy, important aspects of people’s livelihoods and significant public interests, and core data will be subject to an even more rigorous protection regime; and (iii) personal information processing activities shall comply with the Data Security Law as well as the relevant laws and regulations.

2. Notice on Strengthening the Network Security of Vehicle Networking was released for public consultation

On 22 June, the Ministry of Industry and Information Technology (“MIIT”) issued the Notice on Strengthening the Network Security of Vehicle Networking for public consultation. The Notice covers four aspects: strengthening network security protection for vehicle networking, strengthening the security protection of platforms, ensuring data security, and strengthening security vulnerability management. It aims to guide basic telecommunications enterprises and intelligent connected vehicle operation and production enterprises to strengthen the network security management of vehicle networking (intelligent connected vehicles), accelerate improvements in their cybersecurity capabilities, and promote the healthy development of the vehicle networking industry.

3. Notice on Strengthening the Management of Name Registration of Vehicle Networking Cards was released for public consultation

On 11 June, MIIT issued the Notice on Strengthening the Management of Name Registration of Vehicle Networking Cards for public consultation. According to the Notice, MIIT is responsible for the organization, management and overall promotion of nationwide name registration of vehicle networking cards. Vehicle enterprises are responsible for the name registration of the vehicle networking cards in the vehicles they produce and sell, pursuant to the relevant requirements of the competent authorities. Vehicle enterprises shall establish strict management systems for the purchase, use and name registration of vehicle networking cards, build name registration management platforms, and improve their user information protection systems. The Notice also provides that telecom enterprises should strengthen the management of the basic resources of vehicle networking cards.

4. Vehicle Networking Security Standard System Construction Guide was released for public consultation

On 21 June, MIIT issued the Vehicle Networking Security Standard System Construction Guide for public consultation. The Guide points out that efforts should be made to build a cybersecurity standard system for vehicle networking, so as to support the safe and sustainable development of the vehicle networking industry. By the end of 2023, a basic network security standard system for vehicle networking shall have been built, and by 2025 a relatively complete system shall have been formed. The Guide elaborates on the construction ideas, contents and implementation scheme of the network security standard system for vehicle networking.

5. “Internet Medical and Health Information Security Management Specification (Draft for Comments)” was released for public consultation

On 4 June, the Statistics and Information Centre of the National Health Commission issued the “Internet Medical and Health Information Security Management Specification (Draft for Comments)”, an industry standard, for public consultation. The Specification sets out security requirements for the overall framework of Internet medical and health information security management, as well as for the management of related parties, processes, data, technology and organizations, and it applies to information security management in Internet medical and health activities carried out by organizations and individuals in China.

6. Basic Requirements for Multi-level Protection of Cybersecurity in Radio and Television was released

On 21 June, the Science and Technology Department of the National Radio and Television Administration released the Basic Requirements for Multi-level Protection of Cybersecurity in Radio and Television, which had been reviewed and approved by the National Radio, Film and Television Standardization Technical Committee. The document was formulated by relevant entities organized by the National Radio and Television Administration, and stipulates the general and extended security requirements for classified protection objects from the first to the fourth level in radio and television network security.

7. Ministries published Opinions on Several Issues Concerning the Application of Law in Handling Criminal Cases Such as Telecommunications and Network Fraud

On 17 June, the Supreme People’s Court, the Supreme People’s Procuratorate, and the Ministry of Public Security issued the Opinions on Several Issues Concerning the Application of Law in Handling Criminal Cases Such as Telecommunications and Network Fraud (“Opinions”). The Opinions clearly address the identification of the location of telecommunications and network fraud crimes, the circumstances for trying cases together, the illegal possession of credit cards, the crime of fraud, the crime of infringing on citizens’ personal information, and the crime of forging identity documents. They are conducive to further clarifying legal standards and severely punishing telecommunications and network fraud crimes in accordance with the law.

Enforcement developments

1. Cyberspace Administration of China reported that 129 apps collected and used personal information illegally

On 11 June, the Cyberspace Administration of China (“CAC”) reported on 129 apps widely used by the public, including Keep, Joyrun, Xiaomi Sports, Jinri Toutiao, Tencent News and Nike, covering the fields of sports, news information, webcast, app stores and women’s health. These apps collected personal information unrelated to their services, in violation of the necessity principle and the Regulations on the Scope of Necessary Personal Information for Common Types of Mobile Internet Applications, collected and used personal information without users’ consent, and did not provide functions for deleting and correcting personal information or channels for filing complaints.

2. MIIT notified apps that infringe users’ rights and interests

On June 8, MIIT issued a notification of apps that violate users’ rights and interests (the fifth batch in 2021, the 14th batch in total). Previously, MIIT had organized a third-party inspection agency to inspect mobile apps and required the relevant companies to make rectifications. As of June 8, 291 apps had not completed rectification and exhibited problems such as the illegal collection of personal information. These apps were required to complete their rectification work before June 16; if rectification is not finished within the time limit, MIIT will take disposal measures in accordance with laws and regulations.

3. Four authorities in Beijing jointly launched special governance of app cybersecurity in 2021

On 12 June, the Beijing branch of the CAC, the Beijing Public Security Bureau, the Beijing Market Supervision Bureau, and the Beijing Communications Administration issued a notice on the launch of special governance of app cybersecurity in Beijing in 2021. They decided to carry out a special governance action against the illegal collection and use of personal information by apps in the city, running from the date of the notice until November, and requiring app operators to collect and use personal information in accordance with laws and regulations, take responsibility for the security of the personal information they collect, and take effective measures to strengthen personal information protection. The special action is based on the Cyber Security Law, the Notice on Issuing the Methods for Identifying the Collection and Use of Personal Information in Violation of Laws and Regulations by Apps, the Notice on Issuing the Regulations on the Scope of Necessary Personal Information for Common Types of Mobile Apps, and other instruments, and provides for in-depth control of the illegal collection and use of personal information by app operators.

4. Four authorities in Zhejiang Province jointly launched a special campaign against the illegal collection and use of personal information by apps in 2021

On 1 June, the Zhejiang Cyberspace Administration Office, the Zhejiang Public Security Department, the Zhejiang Market Supervision Bureau, and the Zhejiang Communications Administration issued an announcement on the joint launch of a special campaign against the illegal collection and use of personal information by apps, running from June to December 2021. The campaign focuses on apps that have a large number of users, are closely related to people’s lives, or are the subject of complaints by citizens. The relevant departments will carry out targeted rectification of apps with hidden security hazards, such as those illegally collecting and using personal information or causing personal information leakage.

5. MIIT and the Ministry of Public Security issued a notice on the rectification of fraudulent phone cards, Internet of Things cards, and associated Internet accounts

On 2 June, MIIT and the Ministry of Public Security issued a notice on clearing up and rectifying phone cards, Internet of Things cards, and associated Internet accounts used for fraud. The notice requires persons who illegally handle, rent, sell, buy or hoard phone cards and Internet of Things cards, and the relevant holders of associated Internet accounts, to stop such activities as of the date of the notice and to cancel the related phone cards, Internet of Things cards and Internet accounts before the end of June 2021. The relevant departments will crack down on the illegal handling, renting, selling, buying and hoarding of phone cards, Internet of Things cards, and associated Internet accounts in accordance with the law.

Industry developments

1. The pilot work for identity authentication and security trust in vehicles networking was launched

On 8 June, the General Office of MIIT issued a notice on launching a pilot program for identity authentication and security trust in vehicle networking. The pilot covers four aspects: secure vehicle-to-cloud, vehicle-to-vehicle, vehicle-to-road and vehicle-to-device communication. Basic telecommunications companies, Internet companies, automobile manufacturers, electronic parts companies and other entities may apply for pilot projects on vehicle networking identity authentication and security trust, and MIIT will select the projects that fulfil the requirements. The pilot entities should take key responsibility for cybersecurity, improve their corporate cybersecurity management systems, and implement cybersecurity protection requirements.

2. E-commerce platform enterprises undertake to strictly implement relevant requirements for spam message governance

On 11 June, the Information and Communications Administration Bureau of MIIT held an administrative guidance meeting, at which MIIT warned e-commerce platform enterprises to standardize the sending of marketing short messages and strengthen industry self-discipline. Alibaba, JD, PDD and other major e-commerce platform enterprises solemnly committed to strictly implement the relevant requirements on spam message governance, conduct comprehensive self-inspection and self-correction, improve their management systems, optimize user services, ensure tangible results within a short time, and constantly enhance users’ sense of gain, happiness and security.

International developments

1. Biden Signed Executive Order on Protecting Americans’ Sensitive Data from Foreign Adversaries

On 9 June, President Biden signed the Executive Order on Protecting Americans’ Sensitive Data from Foreign Adversaries, repealing and superseding three executive orders aimed at prohibiting transactions with TikTok, WeChat, and eight other communications and financial-technology software applications. The order mainly covers three aspects: enabling the United States to take strong measures to protect its sensitive data; developing standards for identifying software applications that may pose unacceptable risks; and further developing plans to protect sensitive personal data against potential threats posed by certain connected software applications.

2. EDPB and EDPS issued a Joint Opinion on EU AI Regulation

On 18 June, the EDPB and EDPS issued a Joint Opinion on the proposal for a Regulation of the European Parliament and of the Council laying down harmonized rules on artificial intelligence (the Artificial Intelligence Act). In the Joint Opinion, the EDPB and EDPS called for a total ban on the use of AI in public places to automatically recognize human features such as faces, as well as other biometric or behavioural signals such as gait, fingerprints, DNA and voice.

3. EDPB issued Version 2.0 of its “supplementary measures” guidance accompanying the standard contractual clauses

On 18 June, the EDPB updated its guide on supplementary measures accompanying the standard contractual clauses for the transfer of personal data to third countries published on June 4, as a follow-up to the Schrems II decision of the European Court of Justice. The guide sets out a number of steps to be followed, potential sources of information, and examples of supplementary measures that could be implemented, to assist data exporters in the complex task of assessing third countries.

4. Recognition by the EU of the UK Data Protection Rules

On 28 June, the EU recognized that Britain’s privacy rules are commensurate with EU rules, a key step that will allow data to continue to flow between the EU and the UK after Brexit. Meanwhile, the EU added a “sunset clause” setting a four-year term for the decision. If during this period the UK diverges significantly from the EU on data standards, the European Commission may intervene.

5. US Supreme Court sent case on hiQ’s scraping of LinkedIn user information back for review

On 14 June, the US Supreme Court asked the lower courts to review the case regarding hiQ’s scraping of LinkedIn user information. An earlier decision had held that LinkedIn could not prohibit its competitor hiQ Labs from collecting LinkedIn users’ publicly available personal information. LinkedIn believes that the use of “robots” for the large-scale scraping of personal information poses a serious threat to user privacy. hiQ Labs argues that it does not sell user information and that LinkedIn’s lawsuit is aimed at monopolizing public data, hurting the openness and innovation of the Internet. Although hiQ Labs does not sell the user information it scrapes, for LinkedIn the “user privacy risk” associated with data being scraped by various crawler tools does exist: in April, it was reported that an archive of data scraped from 500 million LinkedIn profiles was being sold on a hacker forum.

6. Google plans to remove third-party cookies by the end of 2023

On 25 June (Thursday, local US time), Google announced that its Chrome browser would stop supporting the user-tracking technology known as third-party cookies by the end of 2023, nearly two years later than its original time frame of early 2022. Under pressure from privacy regulators and advocates, Google had previously announced that it would remove the cookies, which many companies in the advertising industry use to track individuals and target ads. Google said the delay would give publishers, the advertising industry and regulators more time to familiarize themselves with the new technology it is developing and testing to continue targeting ads after cookies are phased out.

Mark Robinson
Partner, Singapore
+65 6868 9808
Nanda Lau
Partner, Mainland China
+86 21 2322 2117
James Gong
Of Counsel, Mainland China
+86 10 6535 5106

European Commission publishes final Article 28 clauses

Simultaneously with the European Commission publishing its final standard contractual clauses for the international transfer of personal data (see our blog post here for further information) (the “New SCCs“), it has now published a final set of standalone Article 28 clauses for use between controllers and processors in the EU, also termed ‘standard contractual clauses’ (the “Final Article 28 Clauses“) (available here).



On 28 May 2021, the Information Commissioner’s Office (“ICO“) published a call for views on the first draft chapter of its anonymisation, pseudonymisation and privacy enhancing technologies draft guidance. This first chapter is part of a series of chapters that the ICO will be publishing on anonymisation and pseudonymisation and their role in enabling safe and lawful data sharing. Addressed to organisations seeking to anonymise personal data, it seeks to define anonymisation and pseudonymisation and provides such organisations with practical advice on how to manage their obligations.

The guidance supplements the ICO’s Data Sharing Code of Practice (the “Code“), which we discussed in our blog post here. The Code contained guidance on the aspects organisations need to consider while sharing personal data. While the Code briefly touched upon anonymisation and pseudonymisation, it did not address some of the key issues that arise time and again when organisations seek to anonymise and pseudonymise data. This new series of guidance, with its specific focus on anonymisation and pseudonymisation, will hopefully address these issues.

In this blog post, we discuss our key takeaways from the first chapter of the guidance and the impact that it is likely to have.

What is Anonymisation?

The first chapter of the guidance explains that anonymous information is data which does not relate to an identified or identifiable individual. Data protection law does not apply to truly anonymous information. According to the guidance, anonymisation is the way in which personal data is turned into anonymous information, and includes the techniques and approaches which can be used to this end.

The guidance also clarifies that anonymisation need not be entirely free of risk, and emphasises that the risk of re-identification should be mitigated to the extent that it becomes sufficiently remote and the information becomes ‘effectively anonymised’. In this respect, the guidance states that anonymisation is ‘effective’ when the information: (a) does not relate to an identified or identifiable individual; or (b) is rendered anonymous in such a way that individuals are not (or are no longer) identifiable.

Importantly, the guidance clarifies that applying anonymisation techniques to render personal data anonymous is a processing activity in and of itself, and data protection requirements have to be adhered to while undertaking such processing, which includes informing data subjects that it is to take place.

What is Pseudonymisation?

The first chapter of the guidance confirms that pseudonymisation is a method used to remove or replace information that identifies an individual, for example, by replacing names or other identifiers with codes or numbers. However, the guidance cautions that organisations must take care to maintain the additional information (i.e. the identifiers) separately and protect it using appropriate technical and organisational measures, as individuals can be identified by reference to this additional information.
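By way of illustration only, the kind of pseudonymisation the guidance describes (replacing names with codes, while keeping the re-identifying information separate) might be sketched in Python as follows. The function, key and field names are hypothetical, and a keyed hash is just one possible technique, not one the guidance mandates:

```python
import hmac
import hashlib

# Hypothetical secret key. This is the "additional information" that must be
# held separately from the pseudonymised dataset and protected with
# appropriate technical and organisational measures.
SECRET_KEY = b"store-me-separately-and-securely"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. a name) with a keyed hash code."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"name": "Jane Doe", "visits": 12}

# The dataset to be shared carries only the code, not the name.
pseudonymised_record = {
    "user_code": pseudonymise(record["name"]),
    "visits": record["visits"],
}

# Anyone holding SECRET_KEY can recompute the code for a known name and
# re-link the record, so the data remains personal data in the hands of
# any party with access to the key.
```

This also illustrates the transfer point discussed below: a recipient without the key (and without any other means reasonably likely to re-identify individuals) may be holding what amounts to anonymous information, while the disclosing organisation, which retains the key, still holds personal data.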

Crucially, the guidance seeks to address one of the long-debated questions surrounding pseudonymised data – can pseudonymised data be considered anonymised data in the hands of a third party who has no means to re-identify that data?

In this respect, the guidance clarifies that when transferring pseudonymous data to a third party, an organisation needs to consider the context and circumstances of the transfer: if the third party has no means reasonably likely to re-identify the individuals in the transferred dataset, the dataset may be considered ‘anonymous information’ in the hands of the third party. However, should the transferring organisation still have access to the additional information which can identify individuals, the dataset will continue to be personal data in that organisation’s hands. Whilst many organisations have been operating under the assumption that pseudonymised data can be considered anonymised data in the hands of a recipient without the means to re-identify that data, this is a welcome and important clarification.

Accordingly, both disclosing and recipient organisations will need to consider carefully whether the data is anonymous or pseudonymous in their hands in order to determine their data protection obligations.

The guidance also sets out that pseudonymous data is still personal data and data protection law applies to such data. However, it does not specify if there is any degree of difference in how data protection law will apply to conventional personal data and pseudonymous data. We expect that the ICO will address this issue in the remaining chapters of its anonymisation and pseudonymisation guidance. It remains to be seen what the obligations of a recipient third party will be in context of a pseudonymised dataset it receives, when it does not have the additional information which can re-identify individuals from that dataset.

The Way Forward

The ICO will be publishing further chapters of its anonymisation and pseudonymisation guidance on identifiability, pseudonymisation techniques, and accountability and governance requirements, amongst other topics. These upcoming chapters will hopefully provide further guidance and clarity on the obligations of organisations when sharing pseudonymised data and on the best practice to be followed in order to ensure compliance with data protection requirements.

Duc Tran
Of Counsel, Data Protection and Privacy, London
+44 20 7466 2954
Ananya Bajpai
Trainee Solicitor, London
+44 20 7466 2952

China Cybersecurity and Data Protection: Monthly Update – May 2021 Issue

This e-bulletin summarises the latest developments in cybersecurity and data protection in China with a focus on the regulatory, enforcement, industry and international developments in this area.

Our highlights

In late April, we saw the second reading of the proposed Personal Information Protection Law (PIPL) and Data Security Law (DSL) by the Standing Committee of the National People’s Congress, which marks a step closer to the enactment of these two milestone pieces of legislation. We have prepared an e-bulletin on the key changes in the second drafts. Please click the link below for further reading.

Our views

Key changes in the second draft of Personal Information Protection Law and Data Security Law

Regulatory developments

1. The second review draft of the PIPL was released for public consultation

On 29 April 2021, the China National People’s Congress released the Personal Information Protection Law (Second Review Draft) for public consultation. The key changes in the second draft are as follows: (1) an additional legal basis for processing personal information, namely processing “within a reasonable scope in accordance with the provisions of this law”; (2) a requirement that personal information processors provide individuals with convenient methods to withdraw their consent, with withdrawal not affecting the validity of processing activities carried out before the withdrawal; (3) new obligations on large Internet platforms to protect personal information; (4) consolidated provisions on the cross-border transfer of personal information, which must be either for business needs or for judicial or law enforcement purposes; (5) new provisions on the protection of the personal information of deceased individuals; and (6) a shift of the burden of proof from the individual to the personal information processor in civil infringement cases.

2. The second review draft of the DSL was released for public consultation

On 29 April 2021, the China National People’s Congress released the second review draft of the DSL for public consultation. The draft provides that the state will establish a data classification and grading protection system and determine catalogues of important data in order to strengthen its protection. All regions and ministries will determine specific catalogues of important data for their own regions, departments and related industries and fields in accordance with relevant regulations. Among other things, the draft DSL requires those carrying out data processing activities to establish and improve a security management system built on the multi-level protection scheme for networks, so as to strengthen data security protection. The export and security management measures for important data collected and generated by critical information infrastructure operators shall be subject to the Cybersecurity Law, while those for important data collected and generated by other data processors will be formulated by the Cyberspace Administration of China in conjunction with the relevant ministries of the State Council. For cross-border transfers, data processors may only provide data stored in China to a foreign judicial or law enforcement body upon approval by the competent authority; failure to obtain such approval will result in penalties for the data processor.

3. The national standard of Safety Requirements for Collecting Data of Connected Vehicles was released for public consultation

On 28 April 2021, the Secretariat of the National Information Security Standardization Technical Committee released the national standard of Safety Requirements for Collecting Data of Connected Vehicles for public consultation. In terms of data transmission, except for specific data, connected vehicles shall not transmit data containing personal information outside the vehicle through a network or physical interface without the consent of the person whose data is collected. Connected vehicles shall not transmit audio, video, image and other data collected in the cabin, or data derived from processing them, outside the vehicle through a network or physical interface. In terms of data storage, vehicle location and trajectory-related data collected by connected vehicles shall not be stored for more than seven days in the in-vehicle storage device or on the telematics service platform (TSP). In terms of data export, data on roads, buildings, terrain, traffic participants and other aspects of the vehicle’s external environment collected through cameras, radars and other sensors, as well as data related to vehicle location and trajectory, may not be transferred out of the country. If data such as driving status parameters and abnormal warning information needs to be exported, the exporter shall comply with relevant national regulations. If a connected vehicle transmits data overseas in encrypted form, it should provide information such as the data format and encryption method used, and provide the relevant data content as required when the regulatory authority conducts spot checks and verification.
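The seven-day storage cap is one of the few precise, mechanically checkable rules in the draft standard. A TSP could enforce it with a scheduled purge along the following lines (an illustrative sketch only; the record structure and function name are our assumptions, not taken from the standard):

```python
from datetime import datetime, timedelta

# Retention cap for location/trajectory data under the draft standard
RETENTION = timedelta(days=7)

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only location/trajectory records collected within the last 7 days."""
    return [r for r in records if now - r["collected_at"] <= RETENTION]

now = datetime(2021, 5, 1, 12, 0)
records = [
    {"vehicle": "V1", "collected_at": now - timedelta(days=2)},  # within 7 days: retained
    {"vehicle": "V1", "collected_at": now - timedelta(days=8)},  # older than 7 days: purged
]
kept = purge_expired(records, now)
assert len(kept) == 1
```

In practice such a purge would need to run on both the in-vehicle storage device and the TSP, since the draft applies the cap to both.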

4. The People’s Bank of China issued the Financial Data Security: Data Life Cycle Security Specification

On 8 April 2021, the People’s Bank of China issued the Financial Data Security: Data Life Cycle Security Specification (the Specification), which stipulates the security principles for the financial data life cycle, together with requirements on protection, organizational safeguards, and information system operation and maintenance. The Specification establishes a security framework covering the process of data collection, transmission, storage, use, deletion and destruction, and stipulates that the internal transmission of data classified at “level three” or above shall use data encryption, secure transmission channels or secure transmission protocols. In principle, data at “level three” or above should not be transmitted outside the financial institution; where transmission is genuinely necessary, it should be approved and authorized in advance, and technical measures should be taken to ensure confidentiality. The Specification provides guidance for financial institutions carrying out electronic data security protection work, and a reference for third-party evaluation institutions conducting data security inspections and evaluations.
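The transmission rules above reduce to a simple policy check on the data's classification level. The sketch below illustrates the logic as we read it (a simplified model under our own assumptions; the channel names, function and threshold encoding are illustrative, not taken from the Specification):

```python
# Channels treated as "secure transmission channels/protocols" for this sketch
SECURE_CHANNELS = {"tls", "sftp", "ipsec"}

def may_transmit(data_level: int, channel: str, external: bool, approved: bool = False) -> bool:
    """Apply the Specification's transmission rules (simplified reading).

    - Data below level three is not restricted by these particular rules.
    - Level-three-and-above data requires an encrypted/secure channel internally.
    - Level-three-and-above data additionally requires prior approval to leave
      the financial institution.
    """
    if data_level < 3:
        return True
    if channel not in SECURE_CHANNELS:
        return False
    return approved if external else True

assert may_transmit(2, "http", external=False)                  # low-level data: allowed
assert not may_transmit(3, "http", external=False)              # plaintext channel refused
assert may_transmit(3, "tls", external=False)                   # secure internal transfer
assert not may_transmit(3, "tls", external=True)                # external needs approval
assert may_transmit(3, "tls", external=True, approved=True)     # approved external transfer
```

Real deployments would also need the confidentiality measures (encryption of the payload itself) that the Specification requires for approved external transmissions.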

5. The Ministry of Industry and Information Technology issued the Interim Provisions on the Protection and Management of Personal Information in Mobile Internet Applications (Draft for Public Consultation)

On 26 April 2021, the Ministry of Industry and Information Technology (MIIT) issued the Interim Provisions on the Protection and Management of Personal Information in Mobile Internet Applications (Draft for Public Consultation) (the Provisions) for public consultation. The deadline for comments was 26 May 2021. The Provisions aim to strengthen the protection of personal information in mobile Internet applications (APPs), regulate personal information processing activities in APPs, and promote the reasonable use of personal information. Personal information processing activities carried out through APPs within the territory of the People’s Republic of China must comply with the Provisions. The key highlights are that the Provisions: (1) define the scope of application and the subjects of supervision; (2) establish the two important principles of “informed consent” and “minimum necessity”; (3) detail the main responsibilities and obligations of APP developers and operators, distribution platforms, third-party service providers, terminal enterprises and network access service providers in personal information processing activities in APPs; and (4) put forward four requirements for APPs covering complaints and reports, supervision and inspection, handling measures and risk warnings.

6. Administrative Measures for Online Live Marketing (on Trial)

On 23 April 2021, seven national ministries jointly issued the Administrative Measures for Online Live Marketing (on Trial) (the Measures), which came into force on 25 May 2021. The Measures require live marketing platforms to establish and improve mechanisms and measures on the following: (1) registration and cancellation of accounts and live marketing functions; (2) information security management; (3) marketing behaviour regulation; (4) protection of minors; (5) consumer rights protection; (6) personal information protection; and (7) network and data security management. Live marketing platforms shall take the necessary measures to ensure the security of the personal information they process, and shall strengthen the information security management of links, QR codes and other jump services in live streams to prevent information security risks.

7. The National Healthcare Security Administration issued the Notice on Issuing Guiding Opinions on Strengthening Network Security and Data Protection

On 9 April 2021, the National Healthcare Security Administration issued the Notice on Issuing Guiding Opinions on Strengthening Network Security and Data Protection (the Notice), which sets out the guiding principles for strengthening network security and data protection. The Notice puts forward six major measures to strengthen network security management: (1) clarifying the main responsibilities of entities in charge of network security; (2) improving the network security supervision and management mechanism; (3) strengthening the security protection of critical information infrastructure; (4) consolidating the capability to protect network security through technology; (5) improving network security situational awareness, early warning and coordination capabilities; and (6) improving emergency response capabilities for network security incidents. On strengthening data protection, the measures are: (1) implementing security management throughout the data life cycle; (2) implementing hierarchical and classified management; (3) strengthening the protection of important data and sensitive fields; (4) strengthening data security approval management; (5) implementing jurisdiction over data security; (6) promoting the safe sharing and use of data; and (7) establishing a sound data security risk assessment mechanism.

8. The Ministry of Transport issued the Administrative Measures for the Sharing of Transport Data in Government Affairs

On 6 April 2021, the Ministry of Transport issued the Administrative Measures for the Sharing of Transport Data in Government Affairs (the Administrative Measures) to standardize the sharing of transport data in government affairs. Government transport data is defined as the various non-confidential data, documents, materials, charts, etc. that are collected, generated, obtained, recorded and preserved in electronic form in accordance with the law by government departments in the course of performing their duties, whether directly or through a third party. The Administrative Measures comprise six chapters and 26 articles, covering the scope of application, the sharing management system and division of responsibilities for government transport data, the requirements and procedures for compiling, releasing, updating and managing catalogues, and the methods of providing and obtaining government data.

9. The National Information Security Standardization Technical Committee released ten national standards and information security technology drafts for public consultation

In April 2021, the National Information Security Standardization Technical Committee issued the “Information Security Technology: Security Evaluation Specification for Mobile Internet Application (APP) Personal Information”, “Information Security Technology: SDK Security Guide for Mobile Internet Application (APP)”, “Information Security Technology: Technical Specifications for Government Network Security Monitoring Platform”, “Information Security Technology: Evaluation Requirements for Information System Password Application”, “Information Security Technology: Evaluation Specification for Personal Information De-identification Effect Classification”, “Information Security Technology: Technical Requirements for Edge Computing Security”, “Information Security Technology: Basic Requirements and Guidelines for IPSec VPN Security Access”, “Information Security Technology: Security Requirements for Voiceprint Recognition Data”, “Information Security Technology: Security Requirements for Gait Recognition Data” and “Information Security Technology: Security Requirements for Face Recognition Data”. These ten drafts provide guidance and reference for third-party evaluation agencies, competent regulatory authorities and related operators in the relevant fields.

Enforcement developments

1. The Ministry of Industry and Information Technology issued a notice of APPs that infringed users’ rights and interests (the fourth batch of 2021, the thirteenth batch overall)

On 23 April 2021, the MIIT issued a notice of APPs that infringed users’ rights and interests. The MIIT authorised a third-party testing agency to inspect mobile phone application software, focusing on urging games and tools companies with problems to make rectifications. To date, 93 APPs have not completed the rectification exercise. In the first quarter of 2021, issues were found in the Tencent App Store, Mi App Store, OPPO App Store, Huawei App Store and Vivo App Store. These included lax shelf reviews, inventory issues not being thoroughly cleaned up, and inaccurate registration and verification of the information of APP developers and operators, which misled users into downloading the wrong APPs. The MIIT has urged the relevant platform companies to carry out comprehensive rectification and is strictly monitoring compliance.

2. The Ministry of Industry and Information Technology removed 60 APPs

On 6 April 2021, the MIIT notified 60 APPs that had not yet completed rectification in accordance with the Cybersecurity Law, the Interim Provisions on the Management of Pre-setting and Distribution of Mobile Smart Terminal Application Software (MIIT Xinguan [2016] No. 407) and other laws and normative documents. Pursuant to the authority provided under these laws and regulations, the MIIT removed those APPs for failure to rectify their issues. The relevant application stores were required to remove the infringing application software from their stores immediately upon publication of the MIIT’s notice.

3. Zhejiang Communications Administration issued a notice of APPs that infringed users’ rights and interests (the third batch in 2021)

On 16 April 2021, the Zhejiang Communications Administration issued a notice of APPs that infringed users’ rights and interests. The 51 APPs that had not completed rectification were required to do so before 25 April 2021; if rectification was not completed within that time limit, the Zhejiang Communications Administration would deal with the APPs in accordance with laws and regulations. Most of the issues identified related to failures to state the purpose, method and scope of the collection and use of personal information, collection of personal information beyond the stated scope, and mandatory, frequent and excessive requests for permissions.

4. Three real estate companies were fined RMB 250,000 for using face recognition systems

From 14 to 19 April 2021, the Ningbo Market Supervision Administration imposed a fine of RMB 250,000 on three real estate companies (the Parties). The Parties had installed face recognition systems of various brands at their sales offices. Distributors would report the information of customers they introduced to the Parties in advance, and the reported information would be uploaded to the face capture system, which automatically stored the facial biometric information of all customers who visited the sales offices. The Parties used the face recognition systems as follows: (1) when a customer introduced by a distributor formally signed a sale and purchase agreement, the Parties collected the customer’s facial biometric information and ID card information through the face authentication machine, and the system automatically matched the previously reported information under the customer’s name with the facial biometric information collected during the customer’s visits to the sales office; and (2) if the time of the customer’s first visit to the sales office corresponded to the time reported by the distributor, the Parties settled the commission with the distributor accordingly.

5. The judgement of the second instance in the first case of face recognition

On 9 April 2021, the Hangzhou Intermediate People’s Court issued its second-instance judgement in the service contract dispute between Guo Bing and Hangzhou Wildlife World Co., Ltd (Wildlife World). In addition to upholding the original judgment, the second-instance judgement ordered Wildlife World to delete the fingerprint identification information that Guo Bing submitted when he applied for his fingerprint-based annual card. By way of background, on 27 April 2019 Guo Bing purchased a Wildlife World double annual card, providing relevant personal identification information and submitting his fingerprints and photos. Wildlife World later changed the entry method for annual card holders from fingerprint recognition to face recognition, and sent Guo Bing a text message notifying him of the change and requesting that he activate the face recognition method. Negotiations between the two parties failed, which led to this dispute. The Hangzhou Intermediate People’s Court found that Wildlife World intended to process the photos it had collected into face recognition information, which exceeded the purpose of the original collection and violated the principle of legitimacy; the facial feature information, including the photos submitted by Guo Bing when applying for the card, should therefore be deleted. Further, in view of the fact that Wildlife World had stopped using fingerprint recognition gates, making it impossible to enter the park by the originally agreed method, Guo Bing’s fingerprint recognition information should also be deleted.

6. The Supreme People’s Procuratorate released 11 typical cases of public interest litigation on the protection of personal information by procuratorial organs

On 22 April 2021, the Supreme People’s Procuratorate issued 11 typical cases of public interest litigation on the protection of personal information by procuratorial organs. Among them, the administrative public interest litigation cases concerned the supervision of personal information and the disclosure of government information by administrative organs in the fields of education, market supervision, public security, cyberspace, and agriculture and rural affairs, and involved personal information leakage in express delivery, medical institutions, off-campus training institutions, etc. The civil public interest litigation cases concerned Internet companies’ illegal collection of personal information and consumption fraud. The civil public interest cases collateral to criminal proceedings related to the illegal acquisition of and trading in personal information through various means such as technical software and property services. In addition to combating criminal acts that infringe citizens’ personal information, the procuratorial organs also filed claims against the network operators as co-defendants, demanding that they bear responsibility for the damage to the public interest.

7. China’s first case of a telecom operator’s refusal to perform cybersecurity obligations was decided

On 26 April 2021, Xinhuanet reported on China’s first case of a telecom operator’s refusal to perform cybersecurity obligations. The virtual operator Yuante (Beijing) Communication Technology Co., Ltd. (Yuante), knowing that Ya Feida Company illegally sold large numbers of phone cards and used them to engage in illegal and criminal activities, still supplied it with large numbers of phone cards and did not comply with the relevant regulatory requirements. This facilitated various illegal and criminal activities, and Yuante was suspected of refusing to perform its information network security management obligations. Its chairman and certain senior executives were sentenced by the court of first instance to fixed-term imprisonment or detention ranging from one year and four months to one year and ten months. This is the first case in China in which a telecommunications operator has been sentenced for inadequate supervision of the real-name system for mobile phone cards resulting in serious consequences for telecommunications network fraud.

Industry developments

1. Meituan and other platforms participated in the special survey of “Taking Advantage of Users Using Acquired Big Data” to report usage and management of users’ data

On 8 April 2021, the Guangzhou Municipal Market Supervision Bureau of Guangdong Province, in conjunction with the Guangzhou Municipal Commerce Bureau, held a special survey on “Taking Advantage of Users Using Acquired Big Data” and an administrative guidance meeting on regulating fair competition in the market. Ten Internet platform companies, including Vipshop, Meituan, Missfresh, Fresh Hema, Ctrip, Qunar, ON TIME and DiDi, reported on their usage and management of user data and put forward suggestions on the supervision of data use. The representatives of the platform companies signed the Platform Enterprises’ Commitment to Maintain a Fair Competitive Market Order, publicly promising not to illegally collect and use customers’ personal information and not to take advantage of users by using acquired big data.

2. Tencent released the “Tencent Privacy Computing White Paper 2021”

On 21 April 2021, Tencent released the “Tencent Privacy Computing White Paper 2021”, which describes the basic concepts, technical systems, roles and limitations of privacy computing in data security and compliance. Privacy computing refers to technologies and systems under which two or more participants compute jointly: the participants cooperate to perform joint machine learning and joint analysis on their data without disclosing their own underlying data to each other. Privacy computing applications are conducive to protecting personal information security and help enterprises fulfil their data protection obligations in the course of data cooperation. Privacy computing is expected to become a technical tool for data compliance and privacy protection in data collaboration, but the user authorization mechanism still needs to be clarified and attention must be paid to data security risks.
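To illustrate the idea of joint computation without disclosure, here is a minimal sketch of one well-known privacy-computing primitive, additive secret sharing (our own toy example, not drawn from the white paper; the field size and party count are arbitrary): three participants learn the sum of their private values without any party revealing its input.

```python
import random

MODULUS = 2**61 - 1  # large prime field, illustrative

def share(value: int, n_parties: int, rng: random.Random) -> list[int]:
    """Split a value into n random shares that sum to it mod MODULUS."""
    shares = [rng.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

rng = random.Random(42)
inputs = [120, 45, 300]  # each party's private value
all_shares = [share(v, 3, rng) for v in inputs]

# Party i only ever sees the i-th share of each input, and publishes their sum.
partial_sums = [sum(s[i] for s in all_shares) % MODULUS for i in range(3)]
joint_total = sum(partial_sums) % MODULUS

assert joint_total == sum(inputs)  # the sum is recovered; no raw input was exchanged
```

Production privacy-computing systems (federated learning, multi-party computation platforms) layer authentication, secure channels and robustness on top of primitives like this, which is where the compliance questions the white paper raises, such as user authorization, come in.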

3. Internet platform companies disclose the Promises to Operate in Compliance with Laws and Regulations to the public

On 13 April 2021, the State Administration for Market Regulation, together with the Cyberspace Administration of China and the State Taxation Administration, held an administrative guidance meeting with Internet platform companies. In response to prominent problems in the platform economy, such as the forced implementation of “picking one from two” exclusivity, the meeting put forward “five strict preventions” and “five guarantees”, and required all Internet platform companies to conduct a comprehensive self-inspection and self-examination within one month and to complete rectification. From 14 to 16 April, the State Administration for Market Regulation published the Promises to Operate in Compliance with Laws and Regulations of the Internet platform companies that participated in the meeting. The promises include collecting and using personal information in accordance with the law, protecting personal information security, and strengthening the review of advertising information.

International developments

1. was fined 475,000 euros for delay in reporting data breach

On 2 April 2021, the Dutch Data Protection Authority (DPA) imposed a fine of 475,000 euros on because it violated the GDPR’s requirement to report data breaches within 72 hours. In December 2018, criminals obtained personal information such as the names, phone numbers and addresses of more than 4,000 people who had booked hotel rooms through, as well as the details of more than 300 credit cards. was informed of the data breach on 13 January 2019, but did not report it to the DPA until 7 February 2019, which was 22 days too late.
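The "22 days too late" figure follows directly from the GDPR Article 33 clock, which can be checked with simple date arithmetic (the dates below are those reported in the decision; treating the clock as starting at the moment of awareness is a simplification):

```python
from datetime import datetime, timedelta

aware = datetime(2019, 1, 13)                 # became aware of the breach
deadline = aware + timedelta(hours=72)        # GDPR Art. 33: report within 72 hours
reported = datetime(2019, 2, 7)               # actual report to the DPA

days_late = (reported - deadline).days
assert deadline == datetime(2019, 1, 16)
assert days_late == 22  # matches the delay cited by the Dutch DPA
```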

2. Apple’s app tracking transparency framework officially came into effect

On 26 April 2021, Apple’s App Tracking Transparency framework came into effect. Applications must ask for the user’s permission in order to track users or access their device’s Identifier for Advertisers (IDFA). When an app wants to follow a user’s activities in order to share information with third parties such as advertisers, a prompt appears on the user’s Apple device asking for permission to do so. If the user declines, the app must stop tracking and sharing the user’s data. The app tracking transparency mechanism has been questioned: the opposing view is that it may harm the interests of other companies, particularly advertising companies, and developers and advertising technology companies may still track users through other techniques.

3. The European Commission released Proposal for a Regulation laying down harmonised rules on artificial intelligence

On 21 April 2021, the European Commission issued the Proposal for a Regulation laying down harmonised rules on artificial intelligence (the Proposal). This first-of-its-kind legal framework on AI is intended to guarantee the safety and fundamental rights of people and businesses. The Proposal divides AI systems into four categories of risk: (1) unacceptable risk, (2) high risk, (3) limited risk and (4) minimal risk. Among them, AI systems intended to be used for the ‘real-time’ and ‘post’ remote biometric identification of natural persons are deemed high-risk. High-risk AI systems will be subject to strict obligations before they can be put on the market, including establishing adequate risk assessment systems and providing detailed documentation containing all the information on the system and its purpose necessary for authorities to assess its compliance.

Mark Robinson
Partner, Singapore
+65 6868 9808
Nanda Lau
Partner, Mainland China
+86 21 2322 2117
James Gong
Of Counsel, Mainland China
+86 10 6535 5106



On 18 May 2021, the Department for Digital, Culture, Media and Sport (“DCMS”) released the government’s response (the “Response”) to the consultation on the National Data Strategy (the “Strategy”). The Strategy was released in September 2020 as an attempt to pave the way for ‘unlocking the value’ of data across the economy.

The Strategy has five missions:

  • Unlocking the value of data across the economy;
  • Maintaining a pro-growth and trusted data regime;
  • Transforming the government’s use of data;
  • Ensuring the security and resilience of data infrastructure; and
  • Championing the international flow of data.

The Strategy is centred around improving data use and availability across the economy in order to enhance innovation and growth. Opening up government datasets is also a key priority and encouraging the free flow of data internationally has been identified as an important objective.

The Response takes stock of the stakeholder responses to the consultation and addresses the steps that the government has already taken to achieve the above missions, and the way forward in implementing the Strategy.

Below, we have outlined the key themes and issues arising from the Strategy and the Response.

Data in scope: Implications for personal data

The Strategy limits its application to digital information about people, things, and systems. This includes personal data, biometrics, demographics, systems and infrastructure data, geospatial data and sensor data relating to the Internet of Things. As a result, the Strategy covers both personal and non-personal data, and the government will have to tread a fine line while developing regulation to enable data use and access given that personal data is subject to much higher standards of protection. The challenges of sharing personal data have been recognised by the government in the Response, which has endeavoured to maintain these high standards of data protection while ensuring that ‘unnecessary barriers’ are not created to responsible data use (although it is not entirely clear as yet how this balancing act will be achieved). It is also worth noting that the Information Commissioner’s Office (“ICO”) has recently released a new Data Sharing Code of Practice which provides helpful guidance to organisations sharing personal data. (See HSF blog here.)

Data is the new sunlight: Encouraging widespread data use

The Strategy emphasises the increasingly common understanding that data is a resource to be harnessed as opposed to a threat to be managed. The government is seeking to optimise the opportunities that arise from data use to power innovation and better service delivery, highlighting that smaller organisations do not have the same access to data as larger technology companies, which potentially limits their ability to innovate and participate in the market. This rationale for data sharing can also be found in the European Data Strategy, released in February 2020, which stated that data should be available to all, be it start-ups or giants. India has also been considering regulation on the sharing of non-personal data on the basis that it would improve innovation.

The government’s vision to share data is not limited only to sharing in the private sector, but also emphasises the importance of the public sector having access to data to improve decision-making and services, for example, infrastructure and housing.

In improving data access, the government seeks to take an evidence-based approach while striking a careful balance in terms of the degree of government intervention, recognising that intellectual property rights sometimes vest in data, and these ought to be protected.

In Europe, the issue of data access has been addressed in the report of the European Commission titled ‘Competition policy in the digital era‘ which looks at the question of when data access is indispensable for a business. This often hinges on whether access to that data is essential for the business to compete. The Strategy and the Response do not look at data access through this kind of competition law lens, however, the Response does state that the government is looking into the importance of data access in enabling market competition and delivering public benefit. The Digital Markets Unit may become a key player in this sphere, as the Response indicates that powers to promote competition and address market power will be devolved to it. (See HSF Digital Regulation Timeline entry on the Digital Markets Unit here.)

In response to the Strategy, respondents were in agreement about the benefits of better data availability but had divergent views on the degree of government intervention required to achieve it. As part of its efforts, the government has been conducting research into the measures to counter the barriers to data availability, such as improving the understanding of data sharing, supporting data foundations (i.e. data that is up to date, recorded in standardised formats, easily accessible and retrievable and protected against unauthorised use), improving incentives for and tackling the risks associated with data sharing, and mandating data sharing in the public interest. The last of these measures may sound alarm bells for organisations concerned about protecting their competitive advantage – data is a valuable resource that organisations invest in maintaining as better data leads to better insights. The DCMS report on Increasing Access to Data Across the Economy (from which the measures to improve data availability are sourced) suggests that mandatory data sharing may, however, be required where the goal is to increase competition.

It will be interesting to see if and how the government mandates data sharing and how this will be balanced with protecting organisations’ intellectual property rights, which has been a stated objective of the Strategy.

International flows of data: A post-EU outlook

Emphasising the UK’s intention to be a world leader in data flows, the Strategy announced the UK’s ambitions to encourage greater flows of data internationally, and to ensure that there are no unnecessary constraints caused by fragmented national regimes. To this end, the Strategy had the following objectives:

  • Securing positive adequacy decisions from the EU to maintain free flows of personal data from the European Economic Area. In February 2021, the European Commission published draft adequacy decisions for transfers of personal data to the UK. This is an important step forward in the UK’s mission to improve international data flows, and the UK has now urged the EU to complete the process for adopting and formalising these decisions (although this process could possibly be delayed after the EU Parliament voted in May 2021 to ask the European Commission to modify its draft decisions on whether or not UK data protection is adequate).
  • Developing the UK government’s capability to conduct its own data adequacy decisions. In its Response, the government has stated that it will announce its priority countries for data adequacy shortly. Respondents suggested that the US and EU should be prioritised for UK adequacy assessments, and highlighted opportunities for the UK in the Middle East, Africa, the Indian subcontinent and Brazil. The government will also explore alternative transfer mechanisms to provide some flexibility and we could see UK standard contractual clauses being developed and new binding corporate rules being approved. New guidance on international transfers may also be published by the ICO. This would all be in stark contrast to the EU approach, pursuant to which only 12 adequacy decisions have been issued since the Data Protection Directive (the predecessor to the General Data Protection Regulation) in 1995.
  • Agreeing ambitious data provisions in trade agreements. The UK intends to use its new independent seat in the World Trade Organisation to influence trade rules on data, and also agree provisions in trade agreements which prevent unjustified data localisation measures and maintain high data protection standards. In the Response, the government announced that it had agreed data flow provisions in trade agreements with the EU and Japan to this end. The government has also secured reciprocal free flows of personal data with the non-EU countries that are recognised by the UK as adequate, such as Japan, Canada, Israel and the Crown Dependencies.
  • Driving UK values globally. The Response sets out the government’s intention to “champion the secure, trusted and interoperable exchange of data across borders” and to use diplomacy to influence the global position on rules and standards relating to data.

The UK’s opposition to data localisation measures is consistent with the EU position captured in the Regulation on a framework for the free flow of non-personal data in the European Union, which prohibits data localisation measures and enables processing of data in multiple locations throughout the EU. However, some of the countries (such as India and the United Arab Emirates) suggested by respondents as possible priority countries for data adequacy do still incorporate data localisation requirements for certain categories of data. The government’s approach towards the priority countries will be one to watch and data localisation provisions (or the lack thereof) in trade agreements could possibly be hotly contested between parties.

Security and Resilience of Infrastructure: The key to data availability?

In the Strategy, the government sets out the importance of secure and resilient data infrastructure (i.e. systems and services that store, transfer and process data, for example, data centres and cloud computing), characterising it as a “vital national asset” and crystallising its intention to ensure that data in transit and data stored in external data centres is sufficiently protected.

As part of its bid to improve security, the Response discussed the National Security and Investment Act (the “NSI Act”), which creates a mandatory notification regime for acquisitions in certain sectors, one of which is data infrastructure. (See HSF blog here.) The NSI Act recently received Royal Assent. The operation of the NSI Act is likely to give the government greater oversight of the players in the data infrastructure space and of whether acquisitions in the sector are likely to give rise to national security concerns. However, it remains to be seen whether it will create any barriers to investment and innovation in the sector.

In any event, the government’s focus on security of data infrastructure is likely to give its trade partners (as discussed above) some comfort while agreeing to the free flow of data.

Importantly, the government has also flagged the environmental impact of data use, stating in the Response that it will embed sustainability as a key decision point while designing and approving government-owned digital systems and services and use its COP26 presidency to begin an international conversation on the role of data and digital in countering climate change.

Improving the government’s use of data

The experience of managing the pandemic has been a helpful indicator of the usefulness of data sharing between different parts of government in developing a crisis response. Accordingly, another key priority of the Strategy is to improve the way the government uses data across the board. To this end, the government seeks to improve the quality, access and interoperability of data by prioritising the use of the Digital Economy Act (which contains provisions for data sharing between government departments) and the work of the Data Standards Authority on standards for data access. For the latter objective, the government will assess how the FAIR Principles (findable, accessible, interoperable and reusable) can support data management and stewardship, and the TRUST Principles (transparency, responsibility, user focus, sustainability and technology) can be applied to digital repositories.

In the Response, the government has also pledged to increase transparency in algorithmic decision making in government and embed the Data Ethics Framework across government processes.


The government has identified several key priorities in the Strategy and the subsequent Response. Some of these are likely to have a positive impact on the data economy, such as the emphasis on free flows of data internationally, countering the environmental impact of increased data use and improving government use of data. However, it remains to be seen how the government will maintain high data protection standards in the face of widespread data use. Organisations should also keep an eye out for new policy frameworks from the government on mandatory data sharing.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378
Ananya Bajpai
Trainee Solicitor
+44 20 7466 2952


Executive Summary

  • On 17 December 2020, the Information Commissioner’s Office (“ICO”) published a new Data Sharing Code of Practice (the “Code”). As nearly ten years have passed since the implementation of the previous data sharing code published by the ICO, the new Code has been updated to reflect key changes in data protection laws and the ways in which organisations share and use personal data.
  • The Code was then laid before Parliament on 18 May 2021 and will come into force after 40 sitting days.
  • The Code serves to compile all of the practical considerations that organisations need to take into account when sharing personal data with other parties, bringing together existing items of ICO guidance (e.g. in relation to ensuring a legal basis has been satisfied) and supplementing this with new guidance (e.g. in relation to data sharing issues that arise when conducting due diligence in M&A transactions).
  • Whilst the Code makes reference to data sharing in the context of new technologies and concepts that were not in existence at the time of the previous data sharing code (e.g. automated decision-making), the Code does not address certain perennial issues in detail, such as the distinction between anonymisation and pseudonymisation and the impact on data sharing.
  • Even though the Code does not represent a huge departure from the previous data sharing code and has been somewhat lost in amongst the glut of guidance that has been released by the ICO and EDPB in recent months, it is still a largely helpful piece of guidance that organisations should carefully consider to ensure that they are adhering to its recommendations.

In this article we pull out aspects of the Code that we deem to be noteworthy, such as the practical steps that organisations need to consider when sharing personal data with other parties and our views in relation to whether the Code is fit for a digital age.


Given that the previous data sharing code was published by the ICO almost ten years ago in May 2011, one of the ICO’s key objectives when preparing the new Code was to bring its guidance up to date to reflect the current regulatory landscape following the implementation of the General Data Protection Regulation (“GDPR”) and the Data Protection Act 2018 (the “DPA”), together with various technologies commonly used by organisations involving personal data, e.g. automated decision-making. (The Code also makes reference to the UK’s exit from the European Union and the EU GDPR being written into UK law through the European Union (Withdrawal) Act 2018, clarifying that references to the GDPR in the Code should be read as references to the UK GDPR.)

Whilst the Code was published on 17 December 2020 and is due to come into force at the end of June 2021, the Information Commissioner has described the publication of the Code as “not a conclusion but as a milestone in this ongoing work” and has already announced plans to update the ICO’s guidance on anonymisation under the Code.

Scope of the Code

Section 121 of the DPA defines data sharing as “the disclosure of personal data by transmission, dissemination or otherwise making it available”, covering data sharing between either separate or joint data controllers. It is this type of controller-to-controller data sharing that the Code focuses on, as opposed to data sharing between controllers and processors.

The Code covers two main types of data sharing, namely:

  1. routine (also called systematic) data sharing which is conducted on a regular basis; and
  2. exceptional data sharing, which occurs on a one-off basis, either ad hoc or in emergency situations.

Whilst the previous data sharing code addressed exceptional data sharing to a degree, the Code dedicates a new section to data sharing in urgent and emergency situations, emphasising the benefits of data sharing by referring to recent tragedies such as the fire at Grenfell Tower and terrorist attacks in London. The Code also singles out how proportionate, targeted data sharing (e.g. through the NHS Test and Trace system) can make a positive difference in unprecedented emergencies such as the coronavirus pandemic.

The Code addresses the sharing of personal data, including pseudonymised data (distinct from truly anonymised data), defined by Article 4 of the GDPR as “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person”. An example of pseudonymised data is where an organisation shares seemingly anonymous data, but the individual can be re-identified with the help of additional information such as internal identifiers (e.g. account numbers) or publicly available information (e.g. social media or voter registration records). In such circumstances, the pseudonymised data needs to be treated as personal data in accordance with the Code and the GDPR.

Practical steps for organisations

While much of the Code serves to bring together existing items of ICO guidance that organisations need to comply with when conducting data sharing (e.g. in relation to ensuring a legal basis has been satisfied in relation to the sharing), the Code offers a number of items of new guidance, which we have summarised below:

1. Data Protection Impact Assessments

Organisations should conduct a Data Protection Impact Assessment (“DPIA”) when considering sharing personal data. The aim of a DPIA is to assess the risk of the data sharing and identify where additional safeguards are needed. Carrying out a DPIA is mandatory for data sharing that is likely to result in a high risk to individuals.

However, due to the accountability principle under the GDPR, organisations will still need to demonstrate their compliance with data protection laws in relation to data sharing. As such, even where it is not mandatory to carry out a DPIA in relation to proposed data sharing, the Code recommends that organisations maintain a record of the reasoning as to why a DPIA has not been undertaken and details of the level of expected risk associated with the data sharing.

When faced with an emergency situation which requires data to be shared in a way that is likely to involve a high risk to individuals, the Code recognises that it can often be difficult for organisations to conduct DPIAs in advance of such sharing. Instead, the ICO recommends that organisations that are likely to be responding to emergency situations should consider conducting pre-emptive DPIAs in advance where possible.

2. Data Sharing Agreements

The aim of a Data Sharing Agreement (“DSA”) is to establish the particulars regarding a proposed instance of data sharing between two or more controllers, such as the roles of the parties, the purpose for which data is being shared and compliance standards that each of the parties need to meet in relation to the sharing and any subsequent processing of the personal data in clear and concise language.

The Code provides practical examples of what organisations should include in a DSA, e.g. a model form for seeking individuals’ consent for data sharing (if appropriate), a decision flow diagram to assist with deciding whether or not it is appropriate to share data and a process to be followed by the parties when an individual exercises their rights against either or both of the parties.

Although a DSA is only compulsory under the GDPR where a joint controller relationship is established between two or more parties, the Code recommends that organisations also put a DSA in place to address data sharing between separate or independent controllers, especially given that the ICO will take into account the existence of a DSA when assessing any issues or complaints arising out of an instance of data sharing between controllers, irrespective of whether they share data jointly or independently.

3. Responsibility of disclosing party for recipient’s processing of personal data

For a long time, the extent to which an independent controller which discloses personal data to another controller bears responsibility for the recipient’s processing of that personal data has been somewhat unclear. The Code attempts to provide clarity in relation to this issue, stating that an organisation cannot provide personal data to another when it has no visibility over the measures that the recipient has implemented to ensure that the personal data is consistently protected at all stages of the data sharing.

This indicates that an independent controller that discloses personal data to another needs to ensure that the recipient is subject to sufficiently robust contractual obligations and standards in relation to its handling of personal data upon receipt and undertake a degree of due diligence in relation to the underlying arrangements, for example in relation to the security measures that the recipient has in place.

4. M&A transactions

When one or more parties are involved in an M&A transaction or restructuring which involves the sharing of personal data and/or a change in the identity of the controller, the parties involved need to ensure that due diligence extends to examining issues pertaining to the transfer/sharing of personal data in connection with the transaction. This should include conducting an analysis of:

    1. the purposes for which the personal data was originally obtained;
    2. the lawful bases for the processing of such data;
    3. the lawful basis for sharing such data with a third party (for example, whether privacy notices made available to individuals at the time their data was collected stated that their data would be shared/sold to a purchasing organisation in case of an acquisition);
    4. whether, following the acquisition, the purposes for processing is to change (for example, if the selling organisation collected personal data from customers purely to set up an account with them, the buying organisation cannot use this personal data for a different purpose (e.g. research) without carrying out appropriate compliance steps to legitimise this new purpose); and
    5. whether technical advice is required before sharing data, especially when different systems are involved as there is a potential security risk that the data could be lost, corrupted or degraded.

5. Sharing personal data in databases and lists

The transfer or sharing of any database or list of individuals is also addressed in the Code, which places the onus on any recipient of a database or list of personal data from another party to establish the provenance or integrity of this data and to ensure that all compliance obligations have been met prior to exploiting or otherwise using the data.

The Code makes various recommendations in relation to confirming the source of the data, identifying the lawful basis on which it was obtained, checking what the individuals were told when their data was collected and that the data is accurate and up-to-date.

The Code also refers to the ICO’s detailed guidance on direct marketing, which indicates that a recipient of a database or list of personal data cannot rely on marketing consents obtained by another party to justify its own use of this personal data for direct marketing purposes unless the original consent specifically named the recipient who wishes to rely on it.

A code fit for a digital age

As previously mentioned, the ICO’s intention is for the Code to be “up-to-date on current cyber-related privacy issues and to provide a roadmap in anticipating future technological developments”.

The Code seeks to address items such as automated decision-making, which has become more prevalent since the previous code was published and touches on the difference between anonymised data and pseudonymised data.

Automated decision-making

Article 22 of the GDPR sets out the rules which apply to organisations which carry out automated decision-making, i.e. a decision made with no human influence on the outcome.

The Code makes it clear that a number of steps need to be taken in relation to any data sharing arrangement involving automated decision-making (e.g. if an organisation were to use an algorithm to determine whether or not an individual’s personal data is shared with a third party recipient): a DPIA must be carried out; all requirements set out in Article 22 need to be met in relation to the processing (e.g. individuals should receive an explanation of their rights to challenge a decision and request human intervention); and measures need to be put in place to prevent errors, bias and discrimination in the system.

The difference between anonymised data and pseudonymised data

The Code distinguishes between anonymised and pseudonymised data, specifying that the Code applies to pseudonymised data (where the individual can be re-identified from data with the use of additional information) but does not extend to truly anonymised data (where the information cannot identify an individual in any way). An example of pseudonymised data would be a spreadsheet containing travel data with the names and addresses of relevant individuals redacted but which could be combined with other data available to the organisation to re-identify the individuals e.g. publicly available information such as social media account details or even an un-redacted version of the spreadsheet stored separately to the redacted version. Conversely, an example of anonymised information would be the publication of data at an aggregated level, which means that the data is stripped of any element that would identify any individuals.
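For readers who work with data pipelines, the distinction above can be made concrete with a minimal sketch. This is purely illustrative and not drawn from the Code: the records, tokens and field names are invented, and real pseudonymisation would use more robust techniques (e.g. keyed hashing) than a simple lookup table.

```python
# Illustrative sketch of the Code's distinction: pseudonymisation retains a
# separately held re-identification key, so the output remains personal data;
# anonymisation by aggregation destroys the link to individuals entirely.
from collections import Counter

travel_records = [
    {"name": "Alice Smith", "city": "London"},
    {"name": "Bob Jones", "city": "Leeds"},
    {"name": "Alice Smith", "city": "Leeds"},
]

# Pseudonymisation: replace names with tokens, keeping a lookup table.
# Anyone holding `lookup` can re-identify individuals, so this dataset
# must still be treated as personal data under the GDPR and the Code.
lookup: dict[str, str] = {}
pseudonymised = []
for rec in travel_records:
    token = lookup.setdefault(rec["name"], f"user-{len(lookup) + 1}")
    pseudonymised.append({"user": token, "city": rec["city"]})

# Anonymisation by aggregation: publish only counts per city, with no
# element left that could identify any individual.
aggregated = Counter(rec["city"] for rec in travel_records)

print(pseudonymised)
print(dict(aggregated))  # e.g. {'London': 1, 'Leeds': 2}
```

The open question flagged by the Code is visible here: if `pseudonymised` is shared with a third party that never receives `lookup`, the data may be effectively anonymous in that party's hands, yet the disclosing party can still re-identify everyone.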

The ICO has recently published a blog post stating that they are gathering insight and feedback over the coming months before publishing further guidance on anonymisation and pseudonymisation.

It will be interesting to see whether or not the updated guidance addresses a number of perennial questions, namely:

  1. Does the same level of protection apply to pseudonymised data as traditional personal data under the GDPR?
  2. What steps need to be taken to change the status of pseudonymised data to anonymised data e.g. if an organisation destroys what they consider to be all additional information that would allow them to re-identify individuals in the pseudonymised data before sharing the data with a third party, does this render the data truly anonymised?
  3. If pseudonymised data is shared with a third party which has no access to the information to re-identify the individuals (which is kept confidentially by the disclosing party only), does the third party still need to treat the data as pseudonymised data if the data is effectively anonymised in the hands of this third party? Further, is the third party responsible for ensuring that the data is kept accurate and up to date when the third party does not have the information to identify the individuals without assistance from the disclosing party?
  4. What provisions need to be included in a DSA governing the sharing of pseudonymised data?


Whilst the Code is far from revolutionary and does not set out any guidance that deviates substantially from what has gone before, organisations should carefully review their data sharing practices against the Code and keep track of upcoming guidance and resources published by the ICO which relates to data sharing, as well as deadlines for enforcement.

Duc Tran
Senior Associate, Data Protection and Privacy, London
+44 20 7466 2954

China – Cyber security and data protection April round up

The financial regulators have continued to increase their efforts to develop and protect financial data. The People’s Bank of China released new standards on enhancing the data capability of financial institutions. Further, several banks were penalized for violating data protection rules in relation to processing of personal information.

MIIT has maintained its focus on data protection in mobile apps. In addition to drafting a dedicated regulation on data protection for mobile apps, the MIIT and its local branches have run continuous enforcement campaigns against data privacy violations by mobile app operators.

Regulatory developments

1. New guidelines issued for financial industry data capacity building

On 9 February, the People’s Bank of China (PBC) issued the Guidelines for Financial Industry Data Capability Building. The Guidelines divide data capability into data strategy, data governance, data architecture, data specification, data protection, data quality, data application and data life cycle management. They aim to provide a basis for financial institutions to carry out data work, and to guide financial institutions in strengthening data strategic planning, focusing on data governance and strengthening data security protection.

2. General Requirements for the Safety of Critical Cyber Equipment

On 20 February, the State Administration for Market Regulation and the Standardization Administration approved seven mandatory national standards (including a telecommunications mandatory national standard) and made one amendment to the General Requirements for the Safety of Critical Cyber Equipment, which will come into force on 1 August 2021. These requirements (including security function and security protection requirements) serve as important standards for implementing the provisions of the Cybersecurity Law relating to the security of critical cyber equipment. There are 10 parts to the security function requirements, which focus on ensuring and improving the security technology capabilities of devices: device identification security; redundant backup, recovery and abnormal detection; vulnerability and malicious program prevention; pre-installed software start-up and update security; user identification and authentication; access control security; log audit security; communication security; data security; and password requirements. Separately, the security protection requirements focus on standardizing the security capability of critical cyber equipment providers throughout the equipment life cycle.

3. Five draft standards on national information security technology released for public comments

On 3 February, the Secretariat of the National Information Security Standardization Technical Committee (NISSTC) issued two draft standards, on instant messaging services and express logistics services, for public comments. Further, on 24 February, NISSTC issued three draft standards, on online shopping services, internet payment services and online audio and video services, for public comments. This series of standards sets requirements for the type, scope, methods, conditions and data security protection applicable to the collection, storage, use, transfer and deletion of data. The standards also provide examples of data classification, along with guidance for operators to regulate their data activities and for supervision authorities and third-party assessment agencies to carry out supervision, management and assessments.

4. New rules on app governance to strengthen personal information protection to be published

On 7 February, the Ministry of Industry and Information Technology (MIIT) announced that it has been drafting interim provisions on the personal information protection of apps. The provisions will define the basic principles of informed consent and minimum necessary personal information protection. The principle of informed consent requires that, for app-related personal information processing activities, the processing entities inform users of the rules of personal information processing in a clear and easy-to-understand manner, and that users voluntarily give their express consent. The minimum necessary principle requires that personal information processing be based on clear and reasonable consent, and that it neither go beyond the scope of users’ consent nor be unrelated to the relevant service scenarios.

Enforcement developments

1. Second group of apps in 2021 declared to be infringing users’ rights released, 11th group in total

On 5 February, MIIT published a notification on apps which violated user rights through the misuse of microphones, address books and photo albums. It noted that 26 apps had failed to take the necessary rectification measures, with the deadline for doing so being 10 February. If rectification is not made within the time limit, MIIT will organize and carry out relevant disposal work in accordance with laws and regulations. The issues with the apps included the illegal collection of personal information, frequent and excessive requests for permissions, making the targeted push notification function mandatory for users, and inadequate disclosure to users of app information on the application distribution platform.

2. 37 apps in violation of user rights were removed from the app store

On 3 February, MIIT announced that it had removed from app stores 37 apps that violated user rights and failed to take the necessary rectification measures. The removed apps collected personal information beyond the necessary scope and were involved in other issues that violated user rights. To recap, MIIT has carried out special rectification actions for two consecutive years against apps that illegally handle users’ personal information. In addition, MIIT announced that it will strengthen rectification efforts by promoting the development of relevant standards and actively applying new technologies such as artificial intelligence and big data to promote the construction of a national app technology testing platform.

3. Guangdong Communications Administration ordered 215 apps infringing users’ rights to rectify

On 22 February, the Guangdong Communications Administration notified 215 apps that required rectification. The apps can be divided into 13 categories, including games, shopping, social networking and financial management. Of the 215 apps, 116 have cybersecurity issues. The infringements of user rights and interests include: (1) failing to specify in the privacy policy the purpose, method and scope of personal information collected and used by third-party SDKs integrated into the apps; (2) applying for terminal permissions in advance, before the user has read and agreed to the privacy policy; (3) applying in advance for access to the address book, location, SMS, recording and camera when users are not using the relevant functions or services; and (4) providing no effective account cancellation function and no cancellation guidance, whether in the privacy policy or on the platform itself.

4. Two financial institutions fined for illegal processing of personal information

On 2 February, according to the administrative penalty information form released by the business management department of PBC, Beijing Guoxu Small Loan Co., Ltd. was fined 160,000 yuan for disclosing personal information without notifying the data subject. Further, Xinhan Bank (China) Co., Ltd. was fined 570,000 yuan for inquiring about personal credit information without consent, and the relevant person in charge was also fined 114,000 yuan.

5. ICBC Liaocheng branch was fined 36,000 yuan for data breach

On 18 February, according to the announcement of the PBC Liaocheng branch, the Liaocheng branch of ICBC was fined 36,000 yuan for inquiring about personal information without the consent of the data subject. Wang Hongqing, general manager of the bank card center and the person in charge, was also fined 8,000 yuan.

6. Liaoning Branch of Bank of China was fined for failing to collect and use consumers’ personal financial information as required

On 3 February, the administrative penalty information published by the Shenyang Branch of PBC showed that the Liaoning Branch of the Bank of China, which had five counts of data protection violations, was fined 1.147 million yuan. The violations included, among other things, failure to collect and use consumer personal financial information as required.

7. Qianbao Pay was punished for failing to keep customer identity information as required

On 24 February, according to the administrative penalty information publicity form published by the Chongqing Business Management Department of PBC, Chongqing Qianbao Technology Service Co., Ltd., which had 10 counts of data protection violations, was fined 8.68 million yuan. These violations included failure to keep customer identity information as required. The company’s deputy general manager and chief compliance officer, along with five other relevant persons, were also fined, with penalties ranging from a warning to fines of between 50,000 and 135,000 yuan. The company’s violations of personal information protection and data security requirements arose in the course of ensuring the consistency of transaction information throughout the payment process: it had failed to perform its customer identification obligations and to retain customer identity information as required.

8. Maimai was convicted of infringement for sending text messages to unregistered users

On 7 February, the Beijing Haidian District Court announced its judgment on Maimai's infringement of data privacy. In brief, the court found that the Maimai website, operated by Beijing Taoyou Tianxia Technology Development Co., Ltd., had sent text messages to users in the name of a friend without the users' permission. The messages disclosed the user's real name and stated that certain former colleagues had identified the user and that many friends were waiting for the user to join via a link. When the user clicked the link, the webpage directed them to the registration page of the Maimai website. The user subsequently sued Maimai, claiming specific performance including that the website cease the infringement of his privacy, permanently delete his personal information, and publish an apology statement in China Consumer News. The Beijing Haidian District Court found that the defendant had illegally obtained and retained the plaintiff's personal information, such as his mobile phone contact information, his friends' personal information and his resume. Further, Maimai had sent unsolicited messages to the plaintiff for commercial gain without consent, which disturbed the plaintiff's rights of peace and privacy. The court awarded all of the plaintiff's claims.

Industry developments

1. The National Information Security Standardization Technical Committee released the key action points for 2021

On 25 February, the National Information Security Standardization Technical Committee released its key action points for 2021, covering seven categories including focusing on the urgent needs of national network security work and improving the effective supply of standards. The document states that the committee will further develop national standards for network security in the fields of the industrial Internet, blockchain, artificial intelligence and algorithms, the Internet of Things and digital currency; prepare white papers or research reports on network security standardization topics such as 5G security, face recognition security and network security talent; and produce practical guidelines on data classification and grading and data sharing security.

2. The National Equity Exchange and Quotations Company participated in the 11th joint emergency drill on network security

On 27 February, according to the Circular of the China Securities Regulatory Commission on the 11th joint emergency drill on network security of securities and futures industry, the National Equity Exchange and Quotations Company participated in the joint emergency drill on network security. Other participants included China Securities Depository and Clearing Corporation Limited, Shenzhen Securities Communication Co., Ltd., China Securities Index Co., Ltd. and other host securities companies.

International developments

1. EDPB held the 45th plenary session and adopted a wide range of documents

On 2 February, the European Data Protection Board held its 45th plenary session. It adopted a statement on the draft provisions on a protocol to the Cybercrime Convention, recommendations on the adequacy referential under the Law Enforcement Directive (LED), an opinion on the draft Administrative Arrangement (AA) for transfers of personal data between the Haut Conseil du Commissariat aux Comptes (H3C) and the Public Company Accounting Oversight Board (PCAOB), and a response to the European Commission questionnaire on processing personal data for scientific research, focusing on health-related research. The EDPB also had an exchange of views on WhatsApp's recent Privacy Policy update.

2. EDPS published Opinions on the Digital Services Act and the Digital Markets Act

On 10 February, the European Data Protection Supervisor (EDPS) published Opinions on the Digital Services Act and the Digital Markets Act, which aim to protect individuals' fundamental rights, including data protection. For the Digital Services Act, the EDPS recommended additional measures to better protect individuals in relation to content moderation, online targeted advertising and recommender systems used by online platforms, such as social media and marketplaces. For the Digital Markets Act, it recommended regulating large online platforms to promote fair and open digital markets and the fair processing of personal data, and to foster competitive digital markets that provide individuals with additional choices.

3. Germany adopted a draft law on data protection in telecommunications and telemedia

On 10 February, the German Federal Cabinet adopted a draft law on data protection and privacy in telecommunications and telemedia. The law is intended to replace the existing provisions of the Telecommunications Act 2004 and the Telemedia Act 2007, and to implement the Directive on Privacy and Electronic Communications (2002/58/EC). The draft includes provisions on the confidentiality of communications, location data, caller ID display and suppression, end-user directories, technical and organisational precautions, consent for the storage of information in terminal equipment, and penalties.

4. Vietnam released the Draft Decree on Personal Data Protection for public comments

On 9 February, the Ministry of Public Security (MPS) of Vietnam released the second version of the Draft Decree on Personal Data Protection. The draft sets out more robust rules, with provisions on data subjects' specific rights, cross-border data transfers, and the processing of sensitive personal data. Violations may result in temporary suspension of operations, revocation of permission for cross-border data transfers, and monetary fines.

5. Virginia passed the Consumer Data Protection Act

On 2 March, the Virginia Consumer Data Protection Act (CDPA) was signed by the governor and will come into effect on 1 January 2023. The CDPA establishes rights for Virginia consumers to control how companies use individuals’ personal data. It stipulates that companies shall protect personal data in their possession and respond to consumers exercising their rights.

6. Danish Data Protection Authorities published Quickguide for setting cookies

On 12 February, the Council for Digital Security, the Danish Business Authority and the Danish Data Protection Agency published a Quickguide for the use of cookies. The Quickguide can be used as a checklist for organizations that set cookies, guiding them on how to comply with both the e-Privacy Directive's rules on the placement of cookies and the Data Protection Regulation's rules on the associated processing of personal data.

7. UK ICO published Toolkit for data analytics

On 17 February, the UK Information Commissioner's Office (ICO) published a Toolkit for organisations considering using data analytics. It aims to help organisations recognise the risks to individuals' rights and freedoms created by the use of data analytics, from the beginning of the data analytics project lifecycle. The Toolkit begins by asking questions to determine the applicable legal regime, covering lawfulness, accountability and governance, the data protection principles, and data subject rights. It then produces a report containing tailored advice for the specific data analytics project.

Mark Robinson
Partner, Singapore
+65 6868 9808
Nanda Lau
Partner, Mainland China
+86 21 2322 2117
James Gong
Of Counsel, Mainland China
+86 10 6535 5106

Mandatory data breach notification has been introduced in Singapore, with more changes to follow

Some of the key changes to the Personal Data Protection Act 2012 (“PDPA”) took effect on 1 February 2021. These include a mandatory breach notification regime and new consent exceptions, including an exception which may apply if an organisation has legitimate interests in the collection, use or disclosure of the personal data and the legitimate interests of the organisation or other person outweigh any likely adverse effect to the individual.

The Personal Data Protection (Amendment) Bill was passed by the Singapore Parliament on 2 November 2020, with the changes set to take effect in phases. The first phase of these changes took effect from 1 February 2021.

Changes which have already taken effect as of 1 February 2021

1. Mandatory breach notification

One of the key changes which has now taken effect is the introduction of the mandatory data breach notification requirement.  If a data breach is notifiable, the Personal Data Protection Commission (“PDPC”) must be notified. If certain reporting thresholds are met, the affected individuals must also be notified. The new provisions require that:

  • once an organisation has grounds to believe that a data breach has occurred, the organisation is to carry out an assessment of the data breach in a reasonable and expeditious manner to determine whether the data breach is a notifiable data breach. Generally, the assessment should be completed within 30 calendar days of when the organisation first became aware that a data breach may have taken place.
  • a data breach is notifiable to the PDPC if the data breach: (a) results in, or is likely to result in, significant harm to an affected individual; or (b) is, or is likely to be, of a significant scale (i.e. affecting 500 or more individuals). The organisation must notify the PDPC of the breach as soon as it is practicable to do so and, in any event, no later than 72 hours after establishing that the data breach is notifiable.
  • the organisation must also notify affected individuals of the data breach once the organisation has determined that the data breach is likely to result in significant harm to any individuals to whom the information relates, as soon as it is practicable to provide the individuals with the notification. This will allow the affected individuals the opportunity to take steps to protect themselves from the risks of harm or impact resulting from the data breach (e.g. review suspicious account activities, cancel credit cards, and change passwords).

2. New deemed consent and consent exceptions

Consent is required for collecting, using or disclosing an individual’s personal data. The individual must also be notified of the purpose(s) for which an organisation is collecting, using or disclosing the individual’s personal data on or before such collection, use or disclosure of the personal data. Consent may be given expressly or impliedly by individuals. An individual may also be deemed to have given consent under the PDPA in 3 ways: (a) deemed consent by conduct; (b) deemed consent by contractual necessity; or (c) deemed consent by notification, (as the case may be).

In certain circumstances, the amended PDPA also allows an organisation to collect, use and disclose personal data without the individual’s consent. These exceptions may apply when:

  • the organisation or another person has a legitimate interest in the collection, use or disclosure of the personal data (i.e. the legitimate interest exception);
  • the organisation is a party or prospective party to a business asset transaction with another organisation (i.e. the business asset transaction exception);
  • the organisation is using the personal data for the purposes of business improvement (i.e. the business improvement exception); and
  • the organisation is using the personal data for the purposes of research (i.e. the research exception).

Changes which will take effect later

The following changes have not yet taken effect as of 1 February 2021, but are expected to become effective in the near future:

3. Increased financial penalties for contravention of PDPA

The maximum penalty imposed on organisations for breaches of certain key obligations under the PDPA will be increased to S$1 million or 10% of the organisation’s annual turnover in Singapore, whichever is higher. The increased financial penalties are expected to take effect on a future date to be notified, and no earlier than 1 February 2022.

4. Right to data portability

The recent amendments have also introduced provisions which require an organisation to, at the request of an individual, transmit an individual’s personal data that is in the organisation’s possession or under its control to another organisation in accordance with the prescribed requirements in the PDPA. These provisions, which are found under the new Part VIB[1], have yet to come into effect.

For details on the major changes to the PDPA, please refer to our previous e-bulletin “Singapore data privacy law updates 2020” (click here).

[1] Part VIB has not yet been incorporated into the PDPA as these provisions have not come into effect.

Mark Robinson
Partner, Singapore
+65 6868 9808
Peggy Chow
Senior Associate, Singapore
+65 6868 8054
Sandra Tsao
Of Counsel, Singapore
+65 6812 1353


More than two years after the GDPR came into force, the European Data Protection Board (the “EDPB”) finally published its long-awaited draft guidelines on the concepts of controller and processor on 7 September 2020.

Prior to this date, UK organisations only had the relatively limited guidance set out on the ICO website and the old Article 29 Working Party guidance, which predated the implementation of the GDPR, to go on when attempting to apply these fundamental concepts to real-world scenarios.

The new draft guidelines, which are open for public consultation until 19 October 2020, are split into two parts:

  • Part I addresses the concepts of controller, joint controller, processor and third party/recipient and the scenarios in which these roles should be allocated to parties that are involved in the processing of personal data; and
  • Part II sets out details of the measures that need to be put in place when controller-processor and joint controller relationships arise, providing detailed commentary in relation to the contents of a valid data processing agreement entered into between a controller and processor (“DPA”) and joint controller arrangement.

While the contents of the new draft guidelines largely confirm our existing understanding of these concepts and measures, they do contain some helpful sections which serve to offer clarification in relation to a number of issues that have arisen since the implementation of the GDPR. Other sections, however, arguably serve to complicate certain issues further, and it is fair to say that many practical questions that organisations and practitioners have are likely to remain unanswered.

Taking the positives from the draft guidelines, however, we set out below seven practical takeaways for organisations looking to navigate the challenges of these concepts.

1. An organisation does not need to have access to or receive personal data to be deemed a controller

If an organisation instructs another party to carry out processing of personal data, or otherwise has processing carried out on its behalf, the organisation can be deemed a controller without ever having access to or receiving personal data.

This guidance confirms that an organisation that provides detailed instructions to a service provider to process personal data on its behalf (e.g. to conduct market research), but only ever receives statistical output information from that service provider in return will not be excused from having to comply with its obligations under the GDPR as a controller simply because it never sees any personal data.

Although not explicitly addressed, it would seem unlikely when this situation arises that contractual provisions would be sufficient to rebut this assumption. For example, we consider it unlikely that a provision in the contract between an organisation and a processor service provider, expressly prohibiting the service provider from providing the personal data to the counterparty organisation, would be sufficient to avoid the organisation being deemed a controller.

2. A service provider can be a processor even if the main object of the service is not the processing of personal data (but not if it only processes personal data on an incidental basis)

If a service provider provides a service where the main object of that service is not the processing of personal data, but has routine or systematic access to personal data, it will be deemed to be a processor. Conversely, a service provider will not be deemed to be a processor if it only comes across very limited quantities of personal data on an incidental basis.

This guidance clarifies that a service provider such as an IT helpdesk service that routinely accesses personal data (e.g. by liaising directly with a customer's employees or by screen sharing) will be deemed to be a processor even if this is not the main object of its role. By contrast, another service provider which has, for example, been instructed to fix a specific software bug and will not have the same level of access to personal data (but might see some inadvertently) will not be deemed to be a processor.

3. A service provider that processes personal data for its own purposes will be deemed a controller in respect of those activities

If a service provider carries out processing of personal data for and on behalf of a customer in accordance with the customer’s instructions, it will be deemed a processor in respect of these processing activities. However, if the service provider also processes personal data for its own purposes in the course of carrying out these services (e.g. to conduct data analytics to assist with improving its services for the benefit of its entire customer base), it will be deemed to be a controller in relation to these processing activities, even if it remains a processor for the majority of the processing activities that it carries out for its customer.

This means that the service provider will need to find a way of complying with its obligations under the GDPR as a controller in respect of these processing activities, including the transparency requirements, and it should also make the extent of these activities clear to its customer in any services agreement.

This guidance also reinforces the idea that a service provider is unlikely to solely act as a processor in relation to all processing activities that it carries out in the context of providing services to a customer and is instead likely to act as a mixture of processor, controller and potentially joint controller in respect of the different processing activities that it carries out under these arrangements. This is something that we regularly see reflected in commercial agreements, although the defining lines between the roles that a party may have are often more difficult to discern.

4. Controllers and processors are equally responsible for putting a DPA in place which meets the requirements of Article 28 of the GDPR

Though the wording of Article 28 does not make it entirely clear as to whether it is the responsibility of: (i) the controller; or (ii) both the controller and processor, to put a DPA in place containing Article 28 compliant provisions, it has traditionally been the controller rather than the processor which has taken it upon itself to ensure that the provisions in the DPA are sufficiently robust and detailed so as to meet this requirement. This is possibly a hangover from the Directive and the Data Protection Act 1998.

The guidelines confirm, however, that fulfilling this obligation is the responsibility of both controller and processor and emphasise that processors are also open to receiving administrative fines under the GDPR, which means that processors need to be equally as proactive and engaged as controllers in relation to ensuring these requirements are met.

5. It is not sufficient for a DPA to merely restate the provisions of Article 28

In the absence of a standard set of regulator-sanctioned DPA clauses, controllers and processors have had to exercise their discretion when determining what to set out in a DPA in order to meet the requirements of Article 28 of the GDPR. Typically, parties tend to set out detailed provisions in a DPA if the processing activities to be undertaken are extensive and/or high-risk, whereas if the processing activities are to be minimal or routine, it is not uncommon to see “light touch” DPA wording which simply cross refers to or incorporates by reference certain elements of Article 28 without any additional detail (e.g. in relation to security, “the processor shall take all measures required under Article 32 of the GDPR”).

The guidelines now make clear that merely restating the requirements of Article 28 of the GDPR is never sufficient or appropriate when drafting a DPA: details of the procedures that the processor will follow to assist the controller with meeting the listed obligations under Article 28 of the GDPR (e.g. in relation to personal data breach reporting and adopting adequate technical and organisational measures to ensure the security of processing) will need to be set out, potentially in annexes to the DPA. For many organisations that have spent considerable time and resources repapering their commercial agreements to include Article 28 wording, this push for additional detail is unlikely to be welcomed, given the time already required to negotiate the provisions of a DPA with counterparties.

6. A controller-processor relationship will only arise where a processor is a separate legal entity in relation to the controller

The guidelines clarify that a department within a company cannot generally be a processor to another department within the same entity and so it will not be necessary to put a DPA in place when this situation arises.

Although the guidelines do not explicitly address whether this principle also applies to a branch and a head office, it follows that it may also not be necessary to put a DPA in place if one were to process personal data for the other.

7. Attributing the roles of controller, processor and joint controller to parties involved in less straightforward processing relationships will remain a challenging exercise

The guidelines set out a number of new tests to help with applying the concepts of controller, processor and joint controller in practice.

For example, the guidelines state that a party will be deemed to be a controller if it exercises a "determinative influence" in respect of the processing and determines the "essential means" of processing, such as making fundamental decisions with regard to the type of data to be processed, the duration of the processing, the categories of recipients and the categories of data subjects. Conversely, if a party only determines the "non-essential means" of the processing, which might include considerations such as the choice of hardware or software to be used, it will be deemed to be a processor.

The guidelines also provide that a joint controller relationship will arise where more than one party holds “decisive influence” in respect of the processing either by making a “common decision” or “converging decisions”, where the processing would not be possible without both parties’ participation and where both parties’ processing activities are inseparable or inextricably linked.

While these new tests are welcome insofar as they serve to flesh out the existing guidance available, they do not make the task of attributing the roles of controller, processor and joint controller to parties involved in complex processing arrangements any easier. In particular, the guidelines do not appear to add much clarity with respect to the concept of joint controllers and when such a relationship will arise. Market practice since the implementation of the GDPR has seemed to shy away from parties considering themselves to be joint controllers, and the draft guidelines do little to clarify whether such practice is sustainable. Arguably, these tests will only serve to complicate matters further by requiring additional layers of analysis to be carried out at the outset of every matter involving the processing of personal data. They also offer no guidance on what to do in circumstances where the contractual parties disagree on the analysis – a situation which is only likely to become more common.

Duc Tran
Senior Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2954
Julia Ostendorf
Trainee Solicitor, London
+44 20 7466 2154

High Court says bank need not comply with numerous and repetitive DSARs which were being used for a collateral purpose

The High Court has dismissed a Part 8 claim against a bank for allegedly failing to provide an adequate response to the claimant’s Data Subject Access Requests (DSARs). This is a noteworthy decision for financial institutions, particularly those with a strong retail customer base, as it highlights the robust approach that the court is willing to take where it suspects the tactical deployment of DSARs against the institution: Lees v Lloyds Bank plc [2020] EWHC 2249 (Ch).

The claimant alleged, among other things, that the bank had failed to provide adequate responses to various DSARs, contrary to the Data Protection Act 2018 (DPA 2018) and the General Data Protection Regulation (EU) 2016/679 (GDPR). The court found that the bank had adequately responded, but gave some strongly-worded obiter commentary on the court’s discretion to refuse an order, even where the claimant can demonstrate that the bank has failed to provide data in accordance with the legislation. In the court’s view, there were good reasons for declining to exercise its discretion in favour of the claimant in this case (even if the bank had failed to provide a proper response), including that: the DSARs issued were numerous and repetitive (which was abusive), the real purpose of the DSARs was to obtain documents rather than personal data, and there was a collateral purpose underpinning the requests (namely, to use the documents in separate litigation with the bank).

In financial mis-selling cases, DSARs are often used by claimants as a tool to obtain documents from a financial institution in advance of the issue of proceedings or during litigation to build their case. DSARs can be made in addition to pre-action and standard disclosure under the CPR, and will often seek to widen the scope of documents that could be obtained via traditional disclosure routes. This can create significant workstreams for the bank, which are time-consuming and costly. The present decision provides some helpful guidance as to when it may be appropriate for banks to resist “nuisance” DSARs. It is unclear whether the conclusion in this case would take precedence over the UK privacy regulator’s guidance with respect to DSARs, which has previously been that they should be “motive blind”, but has more recently suggested that there is no obligation to comply with DSARs that are “manifestly unfounded”.

Finally, a significant practical difficulty for financial institutions, is that DSARs can be received by a number of internal teams within the financial institution, either at intervals or all at once. This decision is an important reminder of the need for centralised monitoring of DSARs.


Background

The claimant individual entered into buy to let mortgages in respect of three properties with the defendant bank between 2010 and 2015. The claimant submitted a number of DSARs to the bank between 2017 and 2019 alongside claims in the County Court and High Court concerning the alleged securitisation of the relevant mortgages in an attempt to prevent possession proceedings by the bank in relation to the properties (which were all held to be totally without merit). The bank responded to all the DSARs it received from the claimant.

The claimant subsequently issued a claim alleging, amongst other things, that the bank had failed to provide data contrary to the DPA 2018 and GDPR.


Decision

The court held that the bank had provided adequate responses to the claimant's DSARs and was not in breach of its obligation to provide data. Given the timing of the DSARs under consideration, the court concluded that the DPA 1998 was the legislation in force at the relevant time, which provided data subjects with rights of access to personal data similar to those under the GDPR. Given that the subject access rights under the DPA 1998 were essentially the same as those now provided for under the GDPR (and the DPA 2018), it seems likely that the court's conclusion would have been similar if the case had been considered under the current legislation.

The court commented that even if the claimant could show there was a failure by the bank to provide a proper response to one or more of the DSARs, the court had a discretion as to whether or not to make an order.

In this case, in the court’s view, there were good reasons for declining to exercise the discretion to make an order in favour of the claimant in light of:

  • The issue of numerous and repetitive DSARs which were abusive.
  • The real purpose of the DSARs being to obtain documents rather than personal data.
  • There being a collateral purpose that lay behind the requests which was to obtain assistance in preventing the bank bringing claims for possession.
  • The fact that the data sought would be of no benefit to the claimant.
  • The fact that the possession claims had been the subject of final determinations in the County Court from which all available avenues of appeal had been exhausted. It was improper for the claimant to mount a collateral attack on these orders by issuing this claim.

The court therefore dismissed the claim as in its view it was totally without merit.

Interaction with Information Commissioner’s Office (ICO) guidance

It is worth noting here that the UK privacy regulator’s guidance with respect to DSARs has previously been that they should be “motive blind” and any collateral purpose should not impact whether or not a controller is required to comply.

The latest draft guidance from the ICO refers to DSARs potentially being “manifestly unfounded” (with therefore no obligation to comply) when: (i) the individual clearly has no intention to exercise their right of access (for example an individual makes a request, but then offers to withdraw it in return for some form of benefit from the organisation); or (ii) the request is malicious in intent and is being used to harass an organisation with no real purposes other than to cause disruption.

However, the court’s comments seem to extend this position and it is unclear whether the decision in this case would therefore take precedence over the regulatory guidance – something which would undoubtedly be welcomed by controller organisations.

Julian Copeman
+44 20 7466 2168
Miriam Everett
+44 20 7466 2378
Ceri Morgan
Professional Support Consultant
+44 20 7466 2948
Nihar Lovell
Senior Associate
+44 20 7374 8000