Following the publication of the new EU Standard Contractual Clauses (“SCCs“) last year and their UK equivalent at the beginning of this year, any current arrangements for transferring personal data outside of Europe or the UK (e.g. international data transfer agreements involving a European or UK party) should be revisited and updated in the coming months.
Happy International Data Privacy Day! And what better day than today to explore what 2022 is likely to have in store for data and privacy?
One year on from the introduction of the UK GDPR in a post-Brexit Britain. Two years on from the start of a global pandemic which forced a discussion around the tension between public health and data privacy. And over three years on from the GDPR coming into force across Europe, and by extension the world. But the passing of time does not appear to have diminished the worldwide focus on data and privacy issues.
In this post, we set out some predictions for UK and EU data protection and privacy developments in the year to come.
UK Data Protection Reform
2021 was the year that the UK Government hinted that it might think outside of the box in terms of data protection regulation. In September 2021, the UK Department for Digital, Culture, Media and Sport (“DCMS“) published its wide-ranging consultation on data protection reform. The DCMS Consultation is the first step in the Government’s plan to deliver on ‘Mission 2’ of the National Data Strategy, underpinned by a desire to boost innovation and economic growth for UK businesses while strengthening public trust in the use of data. The proposals were expansive, seeking to create an adaptable and dynamic set of data protection rules that underpin the trustworthy use of data. They mark a move away from a rigid set of rules, towards a more outcome-focussed regime, in order to reduce burdens on business. The consultation closed in November 2021 and the results are expected in Spring 2022. For further detail about the reform proposals, please see our blog post, available here.
A new regulator for the UK
On 4 January 2022, John Edwards began his new role as UK Information Commissioner, on a five-year term. The new regulator spent the past eight years as New Zealand Privacy Commissioner, and before that worked as a barrister. He succeeds Elizabeth Denham CBE, whose term as UK Information Commissioner ended last year. The new Information Commissioner’s agenda and priorities will become clearer during his first full year in the role. However, it seems likely that one of his top priorities for 2022 will be the introduction of the Age Appropriate Design Code to better protect children online, together with the Online Safety Bill.
The fallout from enforcement – privacy notices and cookies
2021 saw some significant enforcement action – including fines of EUR 746 million, EUR 225 million and EUR 150 million. Interestingly, these significant fines have not resulted from big data security breaches; rather, we have seen a regulatory focus on data protection principles – particularly transparency – and on cookies. Whilst in the UK at least, it is possible that current rules around cookie consents may be ‘relaxed’ as a result of the data reform proposals described above, it seems likely that this kind of significant enforcement could result in widespread updates to privacy notices and cookie practices in 2022. For further details regarding the likely impact on privacy notices in particular, please see our summary, available here.
Testing the EU cooperation mechanism
Although 2021 saw significant EU GDPR enforcement action as described above, it also shone a spotlight on the apparent differences of opinion between Member State regulators when it comes to enforcement. In the 2021 WhatsApp enforcement action, objections raised by EU regulators to the Irish Commissioner’s proposed enforcement resulted in a referral to the EDPB for resolution. In December 2021, concerned MEPs also sent a letter to EU Commissioner Reynders to raise concerns about how the Irish Commissioner enforces the GDPR and applies the GDPR’s cooperation mechanism. The MEPs reportedly asked Commissioner Reynders to initiate infringement proceedings against the Irish Commissioner. What is clear is that there is a significant discrepancy between EU supervisory authorities regarding enforcement and the appropriate approach to it. Could 2022 be the year that the GDPR’s cooperation mechanism is tested to its limits? Or could we see individual Member State regulators forging their own path?
International data transfers – Volume 1 (EU SCC re-papering)
On 27 September 2021, the new EU standard contractual clauses (“New EU SCCs“) came into force for the transfer of personal data from the EEA to third countries under the EU GDPR. From that date, the New EU SCCs have been used for any new agreements that rely on model EU data transfer clauses to legitimise the transfer of personal data from the EEA to third countries under the EU GDPR. Existing agreements incorporating the old EU SCCs remain valid and provide appropriate safeguards until 27 December 2022, meaning that for many organisations 2022 is likely to involve the not insignificant task of “re-papering” agreements relying on the old EU SCCs and replacing them with the New EU SCCs. For further details regarding the New EU SCCs, please see our blog posts, available here and here.
International data transfers – Volume 2 (the UK position)
In August 2021, the UK Information Commissioner published a consultation on international data transfers. The regulator published a draft international data transfer agreement to address transfers of personal data outside of the UK; a draft international transfer risk assessment guidance note and tool; and a draft UK addendum for inclusion in the European Commission’s standard contractual clauses. The consultation closed on 7 October 2021 and we expect to see legislative proposals in 2022, which will finally give organisations certainty on the approach that the UK is taking to international data transfers. However, depending upon the results of the DCMS data protection reform consultation described above, this is unlikely to be the end of the data transfer saga. For further details regarding the ICO’s international data transfer proposals, please see our blog post, available here.
International data transfers – Volume 3 (Safe Harbor 3.0?)
Shortly after the Schrems II judgment, the US Department of Commerce and the European Commission initiated discussions to evaluate the potential for an enhanced EU-US Privacy Shield framework to comply with the ruling. However, discussions do not appear to have progressed significantly during 2021 and, without root and branch reform of US surveillance law, it remains unclear how any such framework would avoid the fate of its predecessors, the Privacy Shield and the US Safe Harbor. Could 2022 be the year that governments in multiple jurisdictions manage to find a way through the legal complexities raised by the Schrems II judgment in order to allow the international transfer of data on a practical level?
ePrivacy and cookies
We have covered the proposed ePrivacy Regulation in our previous data protection predictions, and yet the question remains as to whether 2022 is going to be the year that this legislation makes it through the process. Even without the proposed new EU Regulation, some EU regulators have made their focus on cookies very clear – the CNIL has recently taken significant enforcement action against both Google and Facebook for breaches of the cookie rules. The recent DCMS data protection reform consultation also focussed in part on cookies and questioned the appropriateness of the current rules relating to cookie consents. As a result, whether via legislative reform or regulator action, it seems clear that cookies will be a special dish in 2022.
Tech vs data regulation – the race continues
2021 has seen a continued focus from organisations and regulators alike on innovative technologies and, in particular, AI. Uptake of AI by organisations appears to have increased alongside attempts by data protection regulators to keep pace, protect the privacy of individuals, and ensure fairness in an increasingly AI-driven world. An example of this was the UK Information Commissioner’s 2021 consultation in relation to the use of the beta version of its AI and data protection risk mitigation and management toolkit. We expect to see even more focus in 2022 on the use of AI and innovative technologies against the backdrop of data privacy legislation. For further details on the ICO AI consultation, please see our blog post, available here.
Class actions reborn?
In November 2021, the Supreme Court overturned the Court of Appeal’s decision in the high profile Lloyd v Google case, which could have opened the floodgates for class actions for compensation for loss of control of personal data to be brought on behalf of very large numbers of individuals without identifying class members. The case was brought under the DPA 1998, rather than the GDPR which superseded it. Whilst there may be read across to the current UK GDPR regime, Lord Leggatt specifically stated that he was not considering the later legislation and this could potentially leave the door open for future loss of control claims under the current law. After Morrisons and now Lloyd v Google, could 2022 be the year that we see another attempted data class action reach the courts? For further details regarding the Supreme Court judgment in the Lloyd v Google case, please see our blog post available here.
 First published by LexisNexis in October 2021
New security assessment rules, which are applicable to the transfer of both important data and personal information outside of China, have been issued for public consultation.
The Cybersecurity Administration of China (“CAC“) released a draft of the Measures for Security Assessment of Cross-border Transfer of Data (“Draft Measures“) for public consultation on 29 October 2021. The deadline for the public to submit comments on the Draft Measures is 28 November 2021.
Key highlights – our comments on the cybersecurity probe into DiDi and the draft of the revised Measures on Cybersecurity Review
In early July, the Cyberspace Administration of China (CAC) announced that it had initiated a cybersecurity review of three companies, namely DiDi, Boss Zhipin and Full Truck Alliance, and that during the review the three companies would not be permitted to register new users in order to “prevent spreading of risks”. In addition, the CAC also ordered application stores to remove DiDi’s application due to “serious violations in collecting and using personal information”. Notably, all three companies were listed in the United States in June 2021.
There are very few details available to the public about the proposed cybersecurity review except for the fact that it has been initiated. The cybersecurity review is one of the measures contemplated under the Cybersecurity Law (CSL) to ensure the supply chain security of critical information infrastructure (CII) through a review of the procurement of network products and services that may impact national security. One of the reasons why it had not been invoked until recently is that the scope of CII has not been identified. Although the CSL requires the State Council to publish regulations on the protection of CII, the CAC only released a draft regulation in July 2017, and the guidance on identifying CII contemplated in that draft regulation has never been published. Without knowing whether information facilities are considered CII, it is almost impossible to put the security review and all the other relevant CII protection measures into practice. The State Council seems to have been aware of this, and has included the regulation on CII protection in its legislative agenda for 2019, 2020 and 2021. We hope this regulation will finally be published this year.
In the absence of CII identification guidance, the first question here is how DiDi was identified as an operator of CII. Although it might meet the criteria set out in the general definition of CII under the CSL, we expect that at least an identification procedure should be followed to justify the decision, and it is unclear whether DiDi was aware that it was considered a CII operator before the decision for cybersecurity review was made.
Another question is which network product or service procured by DiDi has impacted national security. There is no indication in the announcement by the CAC, and it remains to be seen how the CAC will interpret and assess the procurement’s impact on national security.
There are also questions about the enforcement measures. The regulation on cybersecurity review does not empower the CAC to take any enforcement measures alongside the initiation of the review. In terms of penalties, the CSL only permits the authority to order a CII operator to cease using the relevant network products or services, and to impose a fine of up to 10 times the purchase amount on the CII operator and a fine of up to RMB 100,000 on the persons responsible, if the CII operator uses unauthorised products or services. The CSL also allows the authorities to require network operators to take technical or other necessary measures to prevent or contain harm in the event of a cybersecurity incident. In this case, DiDi has been ordered to stop registering new users, and the CAC may be relying on such a provision to take the measures, although the announcement does not mention that a cybersecurity incident has occurred.
As the Data Security Law (DSL) is not enforceable yet, the CAC is not able to invoke any measures provided thereunder if there is any allegation concerning DiDi’s data (especially important data) processing activities. The national security review regime proposed under the DSL is even further from becoming enforceable. The CAC does not specify the data protection laws and regulations pursuant to which it ordered the removal of the DiDi application from application stores. Considering that the Personal Information Protection Law (PIPL) is yet to be enacted, it is likely that the decision is based on the CSL and the relevant regulations on the processing of personal information by mobile applications.
As discussed, the factual basis for CAC’s decisions remains unclear. It is worth pointing out that at this point, there is still a very limited number of enforceable laws and regulations in cybersecurity and data protection that the authority can actually rely on for their enforcement actions. The CSL and the cybersecurity review regulation are the most readily available from an enforcement perspective at this point.
CAC seems to have realized the inadequacy of the current regulations. On 10 July 2021, the CAC released a draft of the revised Measures on Cybersecurity Review for public consultation. Notably, the revised draft has extended the scope of cybersecurity review to cover data processing activities of data processors that may impact national security. The extension is apparently intended to implement the national security review on data processing activities as contemplated under the DSL.
The revised draft has a special focus on the listing outside China of companies that process core data, important data and large amounts of personal information. Any operator that processes the personal information of over 1 million users must apply to the CAC for a cybersecurity review before the operator is listed outside China. The CAC will assess the risks of CII, core data, important data or personal data being “influenced, controlled or maliciously used” by foreign governments if an operator is listed outside of China. The revised draft also includes the China Securities Regulatory Commission (CSRC) as one of the ministries responsible for the review.
The revised draft is apparently targeted at companies, such as DiDi, which have launched or are going to launch their initial public offerings outside China and at the same time process a large amount of personal information, important data or even core data within China. One critical question is what actions the CAC and CSRC will take if, after review, the listing of a company is considered to have impacted national security, e.g. whether the company will be ordered to delist. For companies that plan to be listed outside China, the cybersecurity review will bring great uncertainty to their listing process and could potentially affect their decision as to the place of listing.
An interesting point is whether a listing in Hong Kong will be subject to the cybersecurity review. The revised draft uses the term “listing outside China”, instead of the traditional expression of “overseas listing” used in the context of securities laws which usually includes Hong Kong listings. It is unclear whether this indicates that Hong Kong listings are excluded from the scope of review, and CAC should clarify this point in their final draft.
Data processing and cybersecurity compliance are now under closer scrutiny by the government. Although there are still questions surrounding the decisions on DiDi, and the revised Measures on Cybersecurity Review are still in draft form, no doubt companies, especially technology companies, should pay more attention to their compliance with data and cybersecurity laws, in anticipation of the upcoming DSL, PIPL and the implementing regulations. Companies that process important data or a sizable amount of personal information, or that operate CII, should pay particular heed to the regulations and actions of the CAC.
If you would like to know more about the cybersecurity review, please click the link below to read our previous article on the Measures on Cybersecurity Review, published in June 2020.
If you would like to know more about the newly-enacted Data Security Law, please click the link below to read our comments.
On 10 June 2021, the Standing Committee of the 13th National People’s Congress voted through the Data Security Law after a third reading; it will become enforceable from 1 September 2021. Compared with the second draft, the key changes in the final version are: (i) the commission will establish a coordination mechanism for national data security (Mechanism), and the Mechanism will coordinate the relevant ministries to draft the catalogue of important data and strengthen the acquisition, analysis, research and early warning of risk information; (ii) a new concept of “core data” of the state is introduced, defined as data relevant to national security, the national economic lifeblood, important aspects of people’s livelihoods and significant public interest – core data will be subject to an even more rigorous protection regime; and (iii) personal information processing activities shall comply with the Data Security Law as well as relevant laws and regulations.
On 22 June, the Ministry of Industry and Information Technology (“MIIT”) issued the Notice on Strengthening the Network Security of Vehicle Networking for public consultation. This Notice covers four aspects: strengthening the network security protection of vehicle networking, strengthening the security protection of platforms, ensuring data security, and strengthening security vulnerability management. It aims to guide basic telecommunications enterprises, intelligent connected vehicle operation enterprises, and intelligent connected vehicle production enterprises to strengthen the network security management of vehicle networking (intelligent connected vehicles), accelerate the improvement of their ability to guarantee cybersecurity, and promote the healthy development of the vehicle networking industry.
On 11 June, MIIT issued the Notice on Strengthening the Management of Name Registration of Vehicle Networking Cards for public consultation. According to this Notice, MIIT is responsible for the organization, management and overall promotion of the nationwide name registration of vehicle networking cards. Vehicle enterprises are responsible for the name registration of the vehicle networking cards of vehicles produced and sold by them, pursuant to the relevant requirements of the competent authorities. Vehicle enterprises shall establish strict management systems for the purchase, use and name registration of vehicle networking cards, build name registration management platforms for vehicle networking cards, and improve the user information protection system. The Notice also provides that telecom enterprises should strengthen the management of the basic resources of vehicle networking cards.
On 21 June, MIIT issued the Vehicle Networking Security Standard System Construction Guide for public consultation. The Guide points out that efforts should be made to build a cybersecurity standard system for vehicle networking, so as to support the safe and sustainable development of the vehicle networking industry. By the end of 2023, a basic network security standard system for vehicle networking shall be built, and by 2025, a relatively complete network security standard system for vehicle networking shall be formed. The Guide elaborates the construction ideas, construction contents and implementation scheme of the network security standard system for vehicle networking.
On 4 June, the Statistics and Information Centre of the National Health Commission issued the “Internet Medical and Health Information Security Management Specification (Draft for Comments)”, an industry standard, for public consultation. It specifies the regulations and security requirements for the overall framework of Internet medical and health information security management, covering the management of information security related parties, processes, data, technology and organization, and it is applicable to information security management in Internet medical and health activities carried out by organizations and individuals in China.
On 21 June, the Science and Technology Department of the National Radio and Television Administration released the Basic Requirements for Multi-level Protection of Cybersecurity in Radio and Television, which has been reviewed and approved by the National Radio, Film and Television Standardization Technical Committee. The standard was formulated by the relevant entities organized by the National Radio and Television Administration, and stipulates the general requirements and extended security requirements for classified protection objects from the first level to the fourth level in radio and television network security.
On 17 June, the Supreme People’s Court, the Supreme People’s Procuratorate, and the Ministry of Public Security issued the Opinions on Several Issues Concerning the Application of Law in Handling Criminal Cases Such as Telecommunications and Network Fraud (“Opinions”). The Opinions clearly stipulate the identification of the crime location of telecommunications and network fraud, the circumstances for trying cases together, the illegal possession of credit cards, the crime of fraud, the crime of infringing on citizens’ personal information, and the crime of forging identity documents. They are conducive to further clarifying legal standards and to severely punishing telecommunications and network fraud crimes in accordance with the law.
On 11 June, the Cyberspace Administration of China (“CAC”) reported 129 apps widely used by the public, including Keep, Joyrun, Xiaomi Sports, Jinri Toutiao, Tencent News, Nike, Zhenai.com, etc., covering the fields of sports, news information, webcasts, app stores, and women’s health. These apps collected personal information unrelated to their services, violating the necessity principle and the Regulations on the Scope of Necessary Personal Information for Common Types of Mobile Internet Applications; collected and used personal information without the users’ consent; and did not provide functions for deleting and correcting personal information or channels for filing complaints.
On 8 June, MIIT issued a notification of apps that violate users’ rights and interests (the fifth batch in 2021, the 14th batch in total). Previously, MIIT had organized a third-party inspection agency to inspect mobile apps, requiring the relevant companies to make rectifications. As of 8 June, 291 apps had not finished their rectification, with problems such as the illegal collection of personal information. These apps were required to complete the rectification and implementation work before 16 June; if rectification is not finished within the time limit, MIIT will take disposal measures in accordance with laws and regulations.
On 12 June, the Beijing branch of the CAC, the Beijing Public Security Bureau, the Beijing Market Supervision Bureau, and the Beijing Communications Administration issued a notice on the launch of special governance of app cybersecurity in Beijing in 2021, and decided to carry out a special governance action on the illegal collection and use of personal information by apps in the city, from the date of issuing the notice to November. The action requires app operators to collect and use personal information in accordance with laws and regulations, to be responsible for the security of the personal information collected, and to take effective measures to strengthen personal information protection. This special action is based on the Cyber Security Law, the Notice on Issuing the Methods for Identifying the Collection and Use of Personal Information in Violation of App Laws and Regulations, the Notice on Issuing the Regulations on the Scope of Necessary Personal Information for Common Types of Mobile Apps, etc., and provides for in-depth control of the illegal collection and use of personal information by app operators.
On 1 June, the Zhejiang Cyberspace Administration Office, the Zhejiang Public Security Department, the Zhejiang Market Supervision Bureau, and the Zhejiang Communications Administration issued an announcement on the joint launch of a special treatment of the illegal collection and use of personal information by apps, from June to December 2021. This special treatment focuses on apps that have a large number of users, are closely related to people’s lives, or are the subject of complaints from citizens. The relevant departments will carry out specific rectification of apps that have hidden security hazards, such as illegally collecting and using personal information or causing personal information leakage.
On 2 June, MIIT and the Ministry of Public Security issued a notice on clearing up and rectifying fraudulent phone cards, Internet of Things cards, and associated Internet accounts. The notice requires people who illegally handle, rent, sell, buy or hoard phone cards and Internet of Things cards, and the relevant persons behind associated Internet accounts, to stop the related activities as of the date of the notice, and to cancel the related phone cards, Internet of Things cards and associated Internet accounts before the end of June 2021. The relevant departments will fight against the illegal handling, renting, selling, buying and hoarding of phone cards, Internet of Things cards and associated Internet accounts in accordance with the law.
On 8 June, the General Office of MIIT issued a notice on launching a pilot program for identity authentication and security trust in vehicle networking. The pilot covers four aspects: vehicle-to-cloud safe communication, vehicle-to-vehicle safe communication, vehicle-to-road safe communication, and vehicle-to-device safe communication. Basic telecommunications companies, Internet companies, automobile manufacturers, electronic parts companies and other entities can apply for pilot projects for vehicle networking identity authentication and security trust, and MIIT will select the projects that fulfil the requirements for carrying out the pilot work. The pilot entities should take on the key responsibilities of cybersecurity, improve their corporate cybersecurity management systems, and implement cybersecurity protection requirements.
On 11 June, the Information and Communications Administration Bureau of MIIT held an administrative guidance meeting, at which MIIT warned e-commerce platform enterprises to standardize the sending of marketing text messages and to strengthen industry self-discipline. Alibaba, JD, PDD and other major e-commerce platform enterprises made a solemn commitment to strictly implement the relevant requirements on spam control, conduct comprehensive self-inspection and self-correction, improve their management systems, optimize user services, ensure the achievement of tangible results in a short time, and constantly enhance users’ sense of gain, happiness and security.
On 9 June, President Biden signed the Executive Order on Protecting Americans’ Sensitive Data from Foreign Adversaries, repealing and superseding three executive orders aimed at prohibiting transactions with TikTok, WeChat, and eight other communications and financial-technology software applications. The order mainly covers three aspects: enabling the United States to take strong measures to protect its sensitive data; developing standards for identifying software applications that may pose unacceptable risks; and further developing plans to protect sensitive personal data against potential threats posed by certain connected software applications.
On 18 June, the EDPB and EDPS issued a Joint Opinion on the proposal for a Regulation of the European Parliament and of the Council laying down harmonized rules on artificial intelligence (the Artificial Intelligence Act). In the Joint Opinion, the EDPB and EDPS called for a total ban on the use of AI in public places to automatically recognize human features – not only faces, but also other biological or behavioral signals such as gait, fingerprints, DNA, and voice.
On 18 June, the EDPB updated its guide on supplementary measures for the transfer of personal data to third countries, as a follow-up to the Schrems II decision of the European Court of Justice and to the standard contractual clauses published on 4 June. The guide provides a number of steps to be followed, potential sources of information, and examples of supplementary measures that could be implemented to assist data exporters in the complex task of assessing third countries.
On 28 June, the EU recognized that Britain’s privacy rules are commensurate with EU rules – a key step that will allow data to continue flowing between the EU and the UK after Brexit. Meanwhile, the EU added a “sunset clause” setting a four-year term for the decision. If during this period the UK diverges significantly from the EU on data standards, the European Commission may intervene.
HiQ’s Scraping of LinkedIn User Information Case Sent Back for a New Review
On 14 June, the US Supreme Court asked the lower courts to review the case regarding the scraping of LinkedIn user information by hiQ. An earlier decision had held that LinkedIn could not prohibit its competitor hiQ Labs from collecting personal information from LinkedIn users’ public profiles. LinkedIn believes that the use of “robots” for the large-scale scraping of personal information poses a serious threat to user privacy. Rival hiQ Labs argues that it does not sell user information and that LinkedIn’s lawsuit is aimed at monopolizing public data, hurting the openness and innovation of the Internet. Although hiQ Labs does not sell the user information it captures, for LinkedIn the “user privacy risk” associated with data being captured by various crawler tools does exist. In April, it was reported that an archive of data captured from 500 million LinkedIn profiles was being sold on a hacker forum.
On 25 June (Thursday, local US time), Google announced that its Chrome browser would stop supporting a user-tracking technology called third-party cookies by the end of 2023, nearly two years after its original time frame of early 2022. Under pressure from privacy regulators and advocates, Google had previously announced that it would remove cookies, which many companies in the advertising industry use to track individuals and target ads. Google said the delay would give publishers, the advertising industry and regulators more time to familiarize themselves with the new technology it is developing and testing to continue targeting ads once cookies are phased out.
Simultaneously with publishing its final standard contractual clauses for the international transfer of personal data (see our blog post here for further information) (the “New SCCs“), the European Commission has now published a final set of standalone Article 28 clauses for use between controllers and processors in the EU, also termed ‘standard contractual clauses’ (the “Final Article 28 Clauses“) (available here).
On 28 May 2021, the Information Commissioner’s Office (“ICO“) published a call for views on the first draft chapter of its anonymisation, pseudonymisation and privacy enhancing technologies draft guidance. This first chapter is part of a series of chapters of guidance that the ICO will be publishing on anonymisation and pseudonymisation and their role in enabling safe and lawful data sharing. Addressed to organisations seeking to anonymise personal data, it seeks to define anonymisation and pseudonymisation and provides some practical advice to such organisations on how to manage their obligations.
The guidance supplements the ICO’s Data Sharing Code of Practice (the “Code“), which we discussed in our blog post here. The Code contains guidance on the aspects organisations need to consider when sharing personal data. While the Code briefly touched upon anonymisation and pseudonymisation, it did not address some of the key issues that arise time and again when organisations seek to anonymise and pseudonymise data. This new series of guidance, with its specific focus on anonymisation and pseudonymisation, will hopefully address these issues.
In this blog post, we discuss our key takeaways from the first chapter of the guidance and the impact that it is likely to have.
What is Anonymisation?
The first chapter of the guidance explains that anonymous information is data which does not relate to an identified or identifiable individual. Data protection law does not apply to truly anonymous information. According to the guidance, anonymisation is the way in which personal data is turned into anonymous information, and includes the techniques and approaches which can be used to this end.
The guidance also clarifies that anonymisation need not be entirely free of risk, and emphasises that the risk of re-identification should be mitigated to the extent that it becomes sufficiently remote and the information becomes ‘effectively anonymised’. In this respect, the guidance states that anonymisation is ‘effective’ when the resulting information: (a) does not relate to an identified or identifiable individual; or (b) has been rendered anonymous in such a way that individuals are not (or are no longer) identifiable.
The guidance also makes the important clarification that applying anonymisation techniques to render personal data anonymous is a processing activity in and of itself, so data protection requirements must be adhered to when undertaking such processing, which includes informing data subjects that it is to take place.
What is Pseudonymisation?
The first chapter of the guidance confirms that pseudonymisation is a method used to remove or replace information that identifies an individual, for example, by replacing names or other identifiers with codes or numbers. However, the guidance cautions that organisations must take care to maintain the additional information (i.e. the identifiers) separately and protect it using appropriate technical and organisational measures, as individuals can be identified by reference to this additional information.
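To illustrate the technique the guidance describes, the sketch below (a minimal, hypothetical example with invented field names, not drawn from the ICO guidance) replaces a direct identifier with a random code and keeps the code-to-identity mapping separate from the dataset that would be shared:

```python
import secrets

def pseudonymise(records, identifier_field):
    """Replace the identifying field in each record with a random code.

    Returns the pseudonymised records plus the code-to-identifier
    mapping ("additional information"), which must be stored separately
    and protected with appropriate technical and organisational measures.
    """
    mapping = {}
    pseudonymised = []
    for record in records:
        code = secrets.token_hex(8)  # random code, meaningless without the mapping
        mapping[code] = record[identifier_field]
        pseudo_record = dict(record)  # copy, so the original record is untouched
        pseudo_record[identifier_field] = code
        pseudonymised.append(pseudo_record)
    return pseudonymised, mapping

# Example: the dataset disclosed to a third party contains only codes;
# the mapping stays with the disclosing organisation.
patients = [{"name": "Alice Smith", "condition": "asthma"}]
shared, key_store = pseudonymise(patients, "name")
```

The `mapping` returned here is the additional information the guidance refers to: whoever holds it can re-identify individuals, which is why it must be held and secured separately from the pseudonymised dataset.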
Crucially, the guidance seeks to address one of the long-debated questions surrounding pseudonymised data – can pseudonymised data be considered anonymised data in the hands of a third party who has no means to re-identify that data?
In this respect, the guidance clarifies that when transferring pseudonymised data to a third party, an organisation needs to consider the context and circumstances of the transfer: if the third party has no means reasonably likely to be used to re-identify the individuals in the transferred dataset, the dataset may be considered ‘anonymous information’ in the hands of that third party. However, where the transferring organisation still has access to the additional information which can identify individuals, the dataset will continue to be personal data in that organisation’s hands. Whilst many organisations have been operating under the assumption that pseudonymised data can be considered anonymised data in the hands of a recipient without the means to re-identify that data, this is a welcome and important clarification.
Accordingly, both disclosing and recipient organisations will need to carefully consider whether the data is anonymous or pseudonymous in their hands, in order to determine their data protection obligations.
The guidance also sets out that pseudonymous data is still personal data and that data protection law applies to such data. However, it does not specify whether there is any difference in how data protection law applies to conventional personal data as opposed to pseudonymous data. We expect that the ICO will address this issue in the remaining chapters of its anonymisation and pseudonymisation guidance. It also remains to be seen what the obligations of a recipient third party will be in the context of a pseudonymised dataset it receives when it does not have the additional information which can re-identify individuals from that dataset.
The Way Forward
The ICO will be publishing further chapters of its anonymisation and pseudonymisation guidance on identifiability, pseudonymisation techniques, and accountability and governance requirements, amongst other topics. These upcoming chapters will hopefully provide further guidance and clarity on the obligations of organisations when sharing pseudonymised data and the best practice to be followed in order to ensure compliance with data protection requirements.
This e-bulletin summarises the latest developments in cybersecurity and data protection in China with a focus on the regulatory, enforcement, industry and international developments in this area.
In late April, we saw the second reading of the proposed Personal Information Protection Law (PIPL) and Data Security Law (DSL) by the Standing Committee of the National People’s Congress, which marks a step closer to the enactment of these two milestone pieces of legislation. We have prepared an e-bulletin on the key changes in the second drafts. Please click the link below for further reading.
1. Second draft of the PIPL released for public consultation
On 29 April 2021, the China National People’s Congress released the second draft of the PIPL for public consultation. The key changes in the Personal Information Protection Law (Second Review Draft) are as follows. The first is the inclusion of an additional legal basis for processing personal data, namely where processing is “within a reasonable scope in accordance with the provisions of this law”. Next, the revised draft PIPL stipulates that personal information processors shall provide individuals with convenient methods to withdraw their consent, and that withdrawal of consent shall not affect the validity of personal information processing activities carried out before the withdrawal. Further, there are new provisions on the obligations of large Internet platforms to protect personal information. The draft PIPL also consolidates the provisions on cross-border transfers of personal information, which must be for either business needs or for judicial or law enforcement purposes. In addition, the revised PIPL includes new provisions on the protection of the personal information of deceased individuals. Finally, the revised PIPL shifts the burden of proof from the individual to the data processor in civil personal information infringement cases.
2. Second draft of the DSL released for public consultation
On 29 April 2021, the China National People’s Congress released the second draft of the DSL for public consultation. The DSL provides that the state will establish a data classification and grading protection system and determine catalogues of important data in order to strengthen its protection. Further, all regions and ministries will decide specific catalogues of important data within their own regions, departments, and related industries and fields in accordance with relevant regulations. Among others, the DSL states that, to carry out data processing activities, it is necessary to establish and improve a security management system based on the network multi-level protection scheme to strengthen data security protection. The export and security management measures for important data collected and generated by critical information infrastructure operators are subject to the Cybersecurity Law, while those for important data collected and generated by other data processors will be formulated by the National Cyberspace Administration of China in conjunction with relevant ministries of the State Council. For cross-border transfers of data, data processors may only transfer local data to a foreign judicial or law enforcement body upon approval of the competent authority; failure to obtain approval will result in penalties for the data processors.
On 28 April 2021, the Secretariat of the National Information Security Standardization Technical Committee released the national standard Safety Requirements for Collecting Data of Connected Vehicles for public consultation. In terms of data transmission, except for specific data, connected vehicles shall not, without the consent of the person whose data is collected, transmit data containing personal information outside the vehicle through a network or physical interface. Connected vehicles shall also not transmit the audio, video, image and other data collected in the car cabin, or data derived from it, outside the vehicle through a network or physical interface. In terms of data storage, the vehicle location and trajectory-related data collected by connected vehicles shall not be stored for more than 7 days in the in-vehicle storage device or on the telematics service platform (TSP). In terms of exporting data, the data on roads, buildings, terrain, traffic participants and other data collected from the connected vehicle’s external environment through cameras, radars and other sensors, as well as data related to vehicle location and trajectory, may not be transferred out of the country. If data such as a connected vehicle’s driving status parameters and abnormal warning information needs to be exported, the exporter shall comply with relevant national regulations. If a connected vehicle transmits data overseas in encrypted form, it should provide information such as the data format and encryption method of the transmission, and provide the relevant data content as required when the regulatory authority conducts spot checks and verification.
On 8 April 2021, the People’s Bank of China issued the Financial Data Security: Data Life Cycle Security Specification (the Specification), which stipulates the security principles of the financial data life cycle, along with protection requirements, organizational guarantees, and information system operation and maintenance guarantees. The Specification establishes a security framework covering data collection, transmission, storage, use, deletion and destruction, and stipulates that the internal transmission of data classified at “level three” or above shall use data encryption, secure transmission channels or secure transmission protocols. In principle, data classified at “level three” or above should not be transmitted outside the financial institution. If transmission is genuinely necessary, it should be approved and authorized in advance, and technical measures should be taken to ensure confidentiality. The Specification provides guidance to financial institutions in carrying out electronic data security protection work, and a reference for third-party evaluation institutions carrying out data security inspection and evaluation.
5. The Ministry of Industry and Information Technology issued the Interim Provisions on the Protection and Management of Personal Information in Mobile Internet Applications (Draft for Public Consultation)
On 26 April 2021, the Ministry of Industry and Information Technology (MIIT) issued the “Interim Provisions on the Protection and Management of Personal Information in Mobile Internet Applications (Draft for Public Consultation)” (the Provisions) for public consultation. The deadline for comments was 26 May 2021. The Provisions aim to strengthen the protection of personal information in mobile Internet applications (APPs), regulate personal information processing activities in APPs, and promote the reasonable use of personal information. Personal information processing activities carried out by APPs within the territory of the People’s Republic of China shall comply with these regulations. The key highlights of the Provisions are that they: (1) define the scope of application and the subjects of supervision, (2) establish the two important principles of “informed consent” and “minimal necessity”, (3) detail the main responsibilities and obligations that APP development operators, distribution platforms, third-party service providers, terminal enterprises and network access service providers shall perform in personal information processing activities in APPs, and (4) put forward four requirements for APPs covering complaints and reports, supervision and inspection, handling measures and risk warnings.
On 23 April 2021, seven national ministries jointly issued the Administrative Measures for Webcast Marketing (on Trial) (the Measures), which came into force on 25 May 2021. The Measures require live marketing platforms to establish and improve mechanisms and measures on the following: (1) registration and cancellation of accounts and live marketing functions, (2) information security management, (3) marketing behaviour regulation, (4) protection of minors, (5) consumer rights protection, (6) personal information protection, and (7) network and data security management. Live marketing platforms shall take the necessary measures to ensure the security of the personal information they process, and shall strengthen the information security management of links, QR codes and other redirection services embedded in live streams to prevent information security risks.
On 9 April 2021, the National Healthcare Security Administration issued the Notice on Issuing Guiding Opinions on Strengthening Network Security and Data Protection (the Notice), which clearly defines the guiding ideology for strengthening network security and data protection. The Notice puts forward six major measures to strengthen network security management: (1) setting out the main responsibilities of entities in charge of network security, (2) improving the network security supervision and management mechanism, (3) strengthening the security protection of critical information infrastructure, (4) consolidating the capability to protect network security through technology, (5) improving capabilities for network security situational awareness, early warning and coordination, and (6) improving emergency response capabilities for network security incidents. In terms of strengthening data protection, the measures are: (1) implementing security management throughout the data life cycle, (2) implementing hierarchical and classified management, (3) strengthening the protection of important data and sensitive fields, (4) strengthening data security approval management, (5) implementing jurisdiction over data security, (6) promoting the safe sharing and use of data, and (7) establishing a sound data security risk assessment mechanism.
On 6 April 2021, the Ministry of Transport issued the Administrative Measures for the Sharing of Transport Data in Government Affairs (the Administrative Measures) to standardize the sharing of transportation data in government affairs. Transportation government data is defined as the various non-confidential data, documents, materials, charts, etc. that are collected, generated, obtained, recorded and preserved in electronic form in accordance with the law, directly or through a third party, by government departments in the course of performing their duties. The Administrative Measures comprise six chapters and 26 articles, covering the scope of application, the sharing management system and division of responsibilities for government transport data, the requirements and procedures for compiling, releasing, updating and managing catalogues, and the methods for providing and obtaining government data.
In April 2021, the National Information Security Standardization Technical Committee issued the “Information Security Technology: Security Evaluation Specification for Mobile Internet Application (APP) Personal Information”, “Information Security Technology: SDK Security Guide for Mobile Internet Application (APP)”, “Information Security Technology: Technical Specifications for Government Network Security Monitoring Platform”, “Information Security Technology: Evaluation Requirements for Information System Password Application”, “Information Security Technology: Evaluation Specification for Personal Information De-identification Effect Classification”, “Information Security Technology: Technical Requirements for Edge Computing Security”, “Information Security Technology: Basic Requirements and Guidelines for IPSec VPN Security Access”, “Information Security Technology: Security Requirements for Voiceprint Recognition Data”, “Information Security Technology: Security Requirements for Gait Recognition Data” and “Information Security Technology: Security Requirements for Face Recognition Data”. These ten drafts provide guidance and reference for third-party evaluation agencies, competent regulatory authorities and related operators in the relevant fields.
On 23 April 2021, the MIIT issued a notice on APPs that infringed users’ rights and interests. The MIIT authorised a third-party testing agency to inspect mobile phone application software, and focused on urging games and tools companies with problems to make rectifications. To date, 93 APPs have not completed the rectification exercise. In the first quarter of 2021, issues were found in the Tencent App Store, Mi App Store, OPPO App Store, Huawei App Store, and Vivo App Store. These issues included lax review of apps before listing, inventory issues that had not been thoroughly cleaned up, and inaccurate registration and verification of the information of APP developers and operators, which misled users into mistakenly downloading APPs. The MIIT has urged the relevant platform companies to carry out comprehensive rectification and has strictly monitored compliance.
On 6 April 2021, the MIIT notified 60 APPs that had not yet completed rectification in accordance with the Cybersecurity Law, the Interim Provisions on the Management of Pre-setting and Distribution of Mobile Smart Terminal Application Software (MIIT Xinguan No. 407) and other laws and normative documents. Pursuant to the authority provided under these laws and regulations, the MIIT removed those APPs for failure to rectify their issues. Further, the relevant application stores shall immediately remove the infringing application software from their stores upon publication of the MIIT’s notice.
On 16 April 2021, the Zhejiang Communications Administration issued a notice on APPs that infringed users’ rights and interests. The 51 APPs that had not completed their rectification measures were required to do so before 25 April; if the rectification measures were not completed within the time limit, the Zhejiang Communications Administration would deal with the APPs in accordance with laws and regulations. Most of the issues related to failure to state the purpose, method and scope of the collection and use of personal information, collection of personal information beyond the stated scope, and mandatory, frequent and excessive requests for permissions.
From 14 to 19 April 2021, the Ningbo Market Supervision Administration imposed fines of RMB 250,000 on three real estate companies (the Parties). The Parties had installed face recognition systems of various brands at their sales offices. Distributors would report the information of the customers they introduced to the Parties in advance, and the reported information would be uploaded to the face capture system. The system would automatically store the facial biometric information of all customers who visited the sales office. The Parties used the face recognition system as follows: (1) when a customer introduced by a distributor formally signed a sale and purchase agreement, the Parties collected the customer’s facial biometric information and ID card information through the face authentication machine, and the system automatically matched the previously reported information relating to the customer with the facial biometric information collected at the sales office under the customer’s name; and (2) if the time of the customer’s first visit to the sales office corresponded to the time reported by the distributor, the Parties settled the commission with the distributor accordingly.
On 9 April 2021, the Hangzhou Intermediate People’s Court issued the second-instance judgment in the service contract dispute between Guo Bing and Hangzhou Wildlife World Co., Ltd (Wildlife World). In addition to upholding the original judgment, the second judgment ordered Wildlife World to delete the fingerprint identification information submitted by Guo Bing when he applied for the annual fingerprint card. The background to this case was that on 27 April 2019, Guo Bing purchased the Wildlife World Double Annual Card and provided relevant personal identification information, as well as entering fingerprints and having photos taken. Later, Wildlife World changed the annual-card entry method from fingerprint recognition to face recognition, and sent a text message to Guo Bing notifying him of the change and requesting activation of the face recognition method. Negotiations between the two parties failed, which led to this dispute. The Hangzhou Intermediate Court found that Wildlife World wanted to process the photos it had collected into face recognition information, which exceeded the purpose of the prior collection and violated the principle of legitimacy. Therefore, the facial feature information, including the photos submitted by Guo Bing when applying for the card, should be deleted. Further, in view of the fact that Wildlife World stopped using fingerprint recognition gates, making it impossible to enter the park using the originally agreed method, Guo Bing’s fingerprint recognition information should also be deleted.
On 22 April 2021, the Supreme People’s Procuratorate issued typical cases of public interest litigation on the protection of personal information by procuratorial organs. Among the 11 typical cases, the administrative public interest litigation cases concerned the supervision of personal information and the disclosure of government information by administrative organs in the fields of education, market supervision, public security, cyberspace, and agriculture and rural affairs, and involved personal information leakage in express delivery, medical institutions, off-campus training institutions, etc. The civil public interest litigation cases concerned Internet companies’ illegal collection of personal information and consumption fraud. The civil public interest cases collateral to criminal proceedings related to the illegal acquisition of and trade in personal information through various means, such as technical software and property services. In addition to fighting criminal acts that infringe on citizens’ personal information, the procuratorial organs also filed claims against the network operators as co-defendants and demanded that these co-defendants bear responsibility for the damage to public welfare.
On 26 April 2021, Xinhuanet reported on the country’s first case of a telecom operator being held criminally liable for refusing to perform cyber security obligations. The virtual operator Yuante (Beijing) Communication Technology Co., Ltd. (Yuante Company), knowing that Ya Feida Company illegally sold large numbers of phone cards and used phone cards to engage in illegal and criminal activities, still supplied it with large numbers of phone cards and did not adhere to the regulatory requirements on setting up high-level controls. It thereby facilitated various illegal and criminal activities, and was suspected of refusing to perform its information and cyber security management obligations. Its chairman and some senior executives were sentenced by the court of first instance to fixed-term imprisonment or detention ranging from one year and four months to one year and ten months. This is the first case in China in which a telecommunications operator has been sentenced for inadequate supervision of the real-name system for mobile phone cards, resulting in serious consequences of telecommunications network fraud.
On 8 April 2021, the Guangzhou Municipal Market Supervision Bureau of Guangdong Province, in conjunction with the Guangzhou Municipal Commerce Bureau, held a special survey on the practice of “taking advantage of users using acquired big data” and an administrative guidance meeting on regulating fair competition in the market. Ten Internet platform companies, including Vipshop, JD.com, Meituan, Ele.me, Missfresh, Fresh Hema, Ctrip, Qunar, ON TIME and Didi, reported on their usage and management of user data, and suggestions were put forward on the supervision of data use. The representatives of the platform companies signed the Platform Enterprises’ Commitment to Maintain a Fair Competitive Market Order, promising publicly not to illegally collect and use customers’ personal information and not to take advantage of users by using acquired big data.
On 21 April 2021, Tencent released the “Tencent Privacy Computing White Paper 2021”, which describes the basic concepts, technical system, roles and limitations of privacy computing in data security and compliance. Privacy computing refers to technologies and systems through which two or more participants compute jointly: the participants cooperate to perform joint machine learning and joint analysis on their data without disclosing their underlying data to each other. The application of privacy computing is conducive to the protection of personal information security, and helps enterprises fulfill their data protection obligations in the course of data cooperation. Privacy computing is expected to become a technical tool for data compliance and privacy protection in data collaboration, but the user authorization mechanism still needs to be clarified and attention must be paid to data security risks.
On 13 April 2021, the State Administration for Market Regulation, together with the Cyberspace Administration of China and the State Taxation Administration, held an administrative guidance meeting with Internet platform companies. In response to prominent problems in the platform economy, such as the forced “picking one from two” exclusivity practice and other outstanding issues, the meeting put forward “five strict preventions” and “five guarantees”, which clearly require all Internet platform companies to conduct comprehensive self-inspection and self-examination within one month and complete rectification. From 14 to 16 April, the State Administration for Market Regulation published the Promises to Operate in Compliance with Laws and Regulations made by the Internet platform companies that participated in the meeting. The promises include collecting and using personal information in accordance with the law, protecting personal information security, and strengthening the review of advertising information.
On 2 April 2021, the Dutch Data Protection Authority (DPA) imposed a fine of 475,000 euros on Booking.com for violating the GDPR’s requirement to report data breaches within 72 hours. In December 2018, criminals obtained personal information such as the names, phone numbers and addresses of more than 4,000 people who had booked hotel rooms through Booking.com, as well as the details of more than 300 credit cards. Booking.com was informed of the data breach on 13 January 2019, but did not report it to the DPA until 7 February, 22 days too late.
On 26 April 2021, Apple’s App Tracking Transparency framework came into effect. Applications must ask for the user’s permission in order to track them or access their device’s Identifier for Advertisers (IDFA). When an app wants to follow a user’s activity in order to share information with third parties such as advertisers, a prompt appears on the user’s Apple device asking for permission to do so. If the user declines, the app must stop monitoring and sharing the user’s data. The application tracking transparency mechanism has been questioned: opponents argue that it may harm the interests of other companies, especially advertising companies, and that developers and advertising technology companies may still track users through other techniques.
On 21 April 2021, the European Commission issued the Proposal for a Regulation laying down harmonised rules on artificial intelligence (the Proposal). This first-of-its-kind legal framework on AI is intended to guarantee the safety and fundamental rights of people and businesses. The Proposal divides AI systems into four categories of risk: (1) unacceptable risk, (2) high risk, (3) limited risk, and (4) minimal risk. Among them, AI systems intended to be used for the ‘real-time’ and ‘post’ remote biometric identification of natural persons are deemed high-risk systems. AI systems with high risk levels will be subject to strict obligations before they can be put on the market, including establishing adequate risk assessment systems and providing detailed documentation containing all the information on the system and its purpose necessary for authorities to assess its compliance.
On 18 May 2021, the Department for Digital, Culture, Media and Sport (“DCMS“) released the government’s response (the “Response“) to the consultation on the National Data Strategy (the “Strategy“). The Strategy was released in September 2020 as an attempt to pave the way for ‘unlocking the value’ of data across the economy.
The Strategy has five missions:
- Unlocking the value of data across the economy;
- Maintaining a pro-growth and trusted data regime;
- Transforming the government’s use of data;
- Ensuring the security and resilience of data infrastructure; and
- Championing the international flow of data.
The Strategy is centred around improving data use and availability across the economy in order to enhance innovation and growth. Opening up government datasets is also a key priority and encouraging the free flow of data internationally has been identified as an important objective.
The Response takes stock of the stakeholder responses to the consultation and addresses the steps that the government has already taken to achieve the above missions, and the way forward in implementing the Strategy.
Below, we have outlined the key themes and issues arising from the Strategy and the Response.
Data in scope: Implications for personal data
The Strategy limits its application to digital information about people, things, and systems. This includes personal data, biometrics, demographics, systems and infrastructure data, geospatial data and sensor data relating to the Internet of Things. As a result, the Strategy covers both personal and non-personal data, and the government will have to tread a fine line while developing regulation to enable data use and access given that personal data is subject to much higher standards of protection. The challenges of sharing personal data have been recognised by the government in the Response, which has endeavoured to maintain these high standards of data protection while ensuring that ‘unnecessary barriers’ are not created to responsible data use (although it is not entirely clear as yet how this balancing act will be achieved). It is also worth noting that the Information Commissioner’s Office (“ICO“) has recently released a new Data Sharing Code of Practice which provides helpful guidance to organisations sharing personal data. (See HSF blog here.)
Data is the new sunlight: Encouraging widespread data use
The Strategy emphasises the increasingly common understanding that data is a resource to be harnessed as opposed to a threat to be managed. The government is seeking to optimise the opportunities that arise from data use to power innovation and better service delivery, highlighting that smaller organisations do not have the same access to data as larger technology companies, which potentially limits their ability to innovate and participate in the market. This rationale for data sharing can also be found in the European Data Strategy, released in February 2020, which stated that data should be available to all, be it start-ups or giants. India has also been considering regulation on the sharing of non-personal data on the basis that it would improve innovation.
The government’s vision for data sharing is not limited to the private sector, but also emphasises the importance of the public sector having access to data to improve decision-making and services, for example in infrastructure and housing.
In improving data access, the government seeks to take an evidence-based approach while striking a careful balance in terms of the degree of government intervention, recognising that intellectual property rights sometimes vest in data, and these ought to be protected.
In Europe, the issue of data access has been addressed in the report of the European Commission titled ‘Competition policy in the digital era‘ which looks at the question of when data access is indispensable for a business. This often hinges on whether access to that data is essential for the business to compete. The Strategy and the Response do not look at data access through this kind of competition law lens, however, the Response does state that the government is looking into the importance of data access in enabling market competition and delivering public benefit. The Digital Markets Unit may become a key player in this sphere, as the Response indicates that powers to promote competition and address market power will be devolved to it. (See HSF Digital Regulation Timeline entry on the Digital Markets Unit here.)
In response to the Strategy, respondents were in agreement about the benefits of better data availability but had divergent views on the degree of government intervention required to achieve it. As part of its efforts, the government has been conducting research into the measures to counter the barriers to data availability, such as improving the understanding of data sharing, supporting data foundations (i.e. data that is up to date, recorded in standardised formats, easily accessible and retrievable and protected against unauthorised use), improving incentives for and tackling the risks associated with data sharing, and mandating data sharing in the public interest. The last of these measures may sound alarm bells for organisations concerned about protecting their competitive advantage – data is a valuable resource that organisations invest in maintaining as better data leads to better insights. The DCMS report on Increasing Access to Data Across the Economy (from which the measures to improve data availability are sourced) suggests that mandatory data sharing may however be required where the goal is to increase competition.
It will be interesting to see if and how the government mandates data sharing and how this will be balanced with protecting organisations’ intellectual property rights, which has been a stated objective of the Strategy.
International flows of data: A post-EU outlook
Emphasising the UK’s intention to be a world leader in data flows, the Strategy announced the UK’s ambitions to encourage greater flows of data internationally, and ensure that there are no unnecessary constraints caused due to fragmented national regimes. To this end, the Strategy had the following objectives:
- Securing positive adequacy decisions from the EU to maintain free flows of personal data from the European Economic Area. In February 2021, the European Commission published draft adequacy decisions for transfer of personal data to the UK. This is an important step forward in the UK’s mission to improve international data flows and the UK has now urged the EU to complete the process for adopting and formalising these decisions (although this process could possibly be delayed after the EU Parliament voted in May 2021 to ask the European Commission to modify its draft decisions on whether or not UK data protection is adequate).
- Developing the UK government’s capability to conduct its own data adequacy decisions. In its Response, the government has stated that it will announce its priority countries for data adequacy shortly. Respondents suggested that the US and EU should be prioritised for UK adequacy assessments, and highlighted opportunities for the UK in the Middle East, Africa, the Indian subcontinent and Brazil. The government will also explore alternative transfer mechanisms to provide some flexibility and we could see UK standard contractual clauses being developed and new binding corporate rules being approved. New guidance on international transfers may also be published by the ICO. This would all be in stark contrast to the EU approach, pursuant to which only 12 adequacy decisions have been issued since the Data Protection Directive (the predecessor to the General Data Protection Regulation) in 1995.
- Agreeing ambitious data provisions in trade agreements. The UK intends to use its new independent seat in the World Trade Organisation to influence trade rules on data, and also agree provisions in trade agreements which prevent unjustified data localisation measures and maintain high data protection standards. In the Response, the government announced that it had agreed data flow provisions in trade agreements with the EU and Japan to this end. The government has also secured reciprocal free flows of personal data with the non-EU countries that are recognised by the UK as adequate, such as Japan, Canada, Israel and the Crown Dependencies.
- Driving UK values globally. The Response sets out the government’s intention to “champion the secure, trusted and interoperable exchange of data across borders” and to use diplomacy to influence the global position on rules and standards relating to data.
The UK’s opposition to data localisation measures is consistent with the EU position captured in the Regulation on a framework for the free flow of non-personal data in the European Union, which prohibits data localisation measures and enables processing of data in multiple locations throughout the EU. However, some of the countries (such as India and the United Arab Emirates) suggested by respondents as possible priority countries for data adequacy do still incorporate data localisation requirements for certain categories of data. The government’s approach towards the priority countries will be one to watch and data localisation provisions (or the lack thereof) in trade agreements could possibly be hotly contested between parties.
Security and Resilience of Infrastructure: The key to data availability?
In the Strategy, the government sets out the importance of secure and resilient data infrastructure (i.e. systems and services that store, transfer and process data, for example, data centres and cloud computing) characterising it as a “vital national asset” and crystallising its intention to ensure data in transit and data stored in external data centres is sufficiently protected.
As part of its bid to improve security, the Response discussed the National Security and Investment Act (the “NSI Act“), which creates a mandatory notification regime for acquisitions in certain sectors, one of which is data infrastructure. (See HSF blog here.) The NSI Act recently received Royal Assent. The operation of the NSI Act is likely to give the government greater oversight of the players in the data infrastructure space and of whether acquisitions in the sector are likely to give rise to national security concerns. However, it remains to be seen whether it will create any barriers to investment and innovation in the sector.
In any event, the government’s focus on security of data infrastructure is likely to give its trade partners (as discussed above) some comfort while agreeing to the free flow of data.
Importantly, the government has also flagged the environmental impact of data use, stating in the Response that it will embed sustainability as a key decision point while designing and approving government-owned digital systems and services and use its COP26 presidency to begin an international conversation on the role of data and digital in countering climate change.
Improving the government’s use of data
The experience of managing the pandemic has been a helpful indicator of the usefulness of data sharing between different parts of government in developing a crisis response. Accordingly, another key priority of the Strategy is to improve the way the government uses data across the board. To this end, the government seeks to improve the quality, access and interoperability of data by prioritising the use of the Digital Economy Act (which contains provisions for data sharing between government departments) and the work of the Data Standards Authority on standards for data access. For the latter objective, the government will assess how the FAIR Principles (findable, accessible, interoperable and reusable) can support data management and stewardship, and the TRUST Principles (transparency, responsibility, user focus, sustainability and technology) can be applied to digital repositories.
In the Response, the government has also pledged to increase transparency in algorithmic decision making in government and embed the Data Ethics Framework across government processes.
The government has identified several key priorities in the Strategy and the subsequent Response. Some of these are likely to have a positive impact on the data economy, such as the emphasis on free flows of data internationally, countering the environmental impact of increased data use and improving government use of data. However, it remains to be seen how the government will maintain high data protection standards in the face of widespread data use. Organisations should also keep an eye out for new policy frameworks from the government on mandatory data sharing.
- On 17 December 2020, the Information Commissioner’s Office (“ICO”) published a new Data Sharing Code of Practice (the “Code”). As nearly ten years have passed since the implementation of the previous data sharing code published by the ICO, the new Code has been updated to reflect key changes in data protection laws and the ways in which organisations share and use personal data.
- The Code was then laid before Parliament on 18 May 2021 and will come into force after 40 sitting days.
- The Code serves to compile all of the practical considerations that organisations need to take into account when sharing personal data with other parties, bringing together existing items of ICO guidance (e.g. in relation to ensuring a legal basis has been satisfied) and supplementing this with new guidance (e.g. in relation to data sharing issues that arise when conducting due diligence in M&A transactions).
- Whilst the Code makes reference to data sharing in the context of new technologies and concepts that were not in existence at the time of the previous data sharing code (e.g. automated decision-making), the Code does not address certain perennial issues in detail, such as the distinction between anonymisation and pseudonymisation and the impact on data sharing.
- Even though the Code does not represent a huge departure from the previous data sharing code and has been somewhat lost amongst the glut of guidance released by the ICO and EDPB in recent months, it is still a largely helpful piece of guidance that organisations should carefully consider to ensure that they are adhering to its recommendations.
In this article we pull out aspects of the Code that we deem to be noteworthy, such as the practical steps that organisations need to consider when sharing personal data with other parties and our views in relation to whether the Code is fit for a digital age.
Given that the previous data sharing code was published by the ICO almost ten years ago, in May 2011, one of the ICO’s key objectives when preparing the new Code was to bring its guidance up to date to reflect the current regulatory landscape following the implementation of the General Data Protection Regulation (“GDPR”) and the Data Protection Act 2018 (the “DPA”), together with various technologies commonly used by organisations involving personal data, e.g. automated decision-making. The Code also makes reference to the UK’s exit from the European Union and the EU GDPR being written into UK law through the European Union (Withdrawal) Act 2018, clarifying that references to the GDPR in the Code should be read as references to the UK GDPR.
Whilst the Code was published on 17 December 2020 and is due to come into force at the end of June 2021, the Information Commissioner has described the publication of the Code as “not a conclusion but as a milestone in this ongoing work” and has already announced plans to update the ICO’s guidance on anonymisation under the Code.
Scope of the Code
Section 121 of the DPA defines data sharing as “the disclosure of personal data by transmission, dissemination or otherwise making it available”. This covers data sharing between either separate or joint data controllers, and it is this type of sharing between controllers that the Code focusses on, as opposed to data sharing between controllers and processors.
The Code covers two main types of data sharing, namely:
- routine (also called systematic) data sharing which is conducted on a regular basis; and
- exceptional data sharing, which occurs on a one-off basis, either ad hoc or in emergency situations.
Whilst the previous data sharing code addressed exceptional data sharing to a degree, the Code dedicates a new section to data sharing in urgent and emergency situations, emphasising the benefits of data sharing by referring to recent tragedies such as the fire at Grenfell Tower and terrorist attacks in London. The Code also singles out how proportionate, targeted data sharing (e.g. through the NHS Test and Trace system) can make a positive difference in unprecedented emergencies such as the coronavirus pandemic.
The Code addresses the sharing of personal data, including pseudonymised data (distinct from truly anonymised data), defined by Article 4 of the GDPR as “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person”. An example of pseudonymised data is where an organisation shares seemingly anonymous data, but the individual can be re-identified with the help of additional information such as internal identifiers (e.g. account numbers) or publicly available information (e.g. social media or voter registration records). In such circumstances, the pseudonymised data needs to be treated as personal data in accordance with the Code and the GDPR.
Practical steps for organisations
While much of the Code serves to bring together existing items of ICO guidance that organisations need to comply with when conducting data sharing (e.g. in relation to ensuring a legal basis has been satisfied in relation to the sharing), the Code offers a number of items of new guidance, which we have summarised below:
1. Data Protection Impact Assessments
Organisations should conduct a Data Protection Impact Assessment (“DPIA”) when considering sharing personal data. The aim of a DPIA is to assess the risk of the data sharing and identify where additional safeguards are needed. Carrying out a DPIA is mandatory for data sharing that is likely to result in a high risk to individuals.
However, due to the accountability principle under the GDPR, organisations will still need to demonstrate their compliance with data protection laws in relation to data sharing. As such, even where it is not mandatory to carry out a DPIA in relation to proposed data sharing, the Code recommends that organisations maintain a record of the reasoning as to why a DPIA has not been undertaken and details of the level of expected risk associated with the data sharing.
When faced with an emergency situation which requires data to be shared in a way that is likely to involve a high risk to individuals, the Code recognises that it can often be difficult for organisations to conduct DPIAs in advance of such sharing. Instead, the ICO recommends that organisations likely to be responding to emergency situations should consider conducting pre-emptive DPIAs where possible.
2. Data Sharing Agreements
The aim of a Data Sharing Agreement (“DSA”) is to establish, in clear and concise language, the particulars of a proposed instance of data sharing between two or more controllers, such as the roles of the parties, the purpose for which data is being shared, and the compliance standards that each party needs to meet in relation to the sharing and any subsequent processing of the personal data.
The Code provides practical examples of what organisations should include in a DSA, e.g. a model form for seeking individuals’ consent for data sharing (if appropriate), a decision flow diagram to assist with deciding whether or not it is appropriate to share data and a process to be followed by the parties when an individual exercises their rights against either or both of the parties.
Although it is only compulsory under the GDPR to put a DSA in place where a joint controller relationship is established between two or more parties, the Code recommends that organisations also put a DSA in place to address data sharing between separate or independent controllers. This is especially prudent given that the ICO will take into account the existence of a DSA when assessing any issues or complaints arising out of an instance of data sharing between controllers, irrespective of whether they share data jointly or independently.
3. Responsibility of disclosing party for recipient’s processing of personal data
For a long time, the extent to which an independent controller which discloses personal data to another controller bears responsibility for the recipient’s processing of that personal data has been somewhat unclear. The Code attempts to provide clarity in relation to this issue, stating that an organisation cannot provide personal data to another when it has no visibility over the measures that the recipient has implemented to ensure that the personal data is consistently protected at all stages of the data sharing.
This indicates that an independent controller that discloses personal data to another needs to ensure that the recipient is subject to sufficiently robust contractual obligations and standards in relation to its handling of personal data upon receipt and undertake a degree of due diligence in relation to the underlying arrangements, for example in relation to the security measures that the recipient has in place.
4. M&A transactions
When one or more parties are involved in an M&A transaction or restructuring which involves the sharing of personal data and/or a change in the identity of the controller, the parties involved need to ensure that due diligence extends to examining issues pertaining to the transfer/sharing of personal data in connection with the transaction. This should include conducting an analysis of:
- the purposes for which the personal data was originally obtained;
- the lawful bases for the processing of such data;
- the lawful basis for sharing such data with a third party (for example, whether privacy notices made available to individuals at the time their data was collected stated that their data would be shared/sold to a purchasing organisation in case of an acquisition);
- whether, following the acquisition, the purposes for processing are to change (for example, if the selling organisation collected personal data from customers purely to set up an account with them, the buying organisation cannot use this personal data for a different purpose (e.g. research) without carrying out appropriate compliance steps to legitimise this new purpose); and
- whether technical advice is required before sharing data, especially when different systems are involved as there is a potential security risk that the data could be lost, corrupted or degraded.
5. Sharing personal data in databases and lists
The transfer or sharing of any database or list of individuals is also addressed in the Code, which places the onus on any recipient of a database or list of personal data from another party to establish the provenance or integrity of this data and to ensure that all compliance obligations have been met prior to exploiting or otherwise using the data.
The Code makes various recommendations in relation to confirming the source of the data, identifying the lawful basis on which it was obtained, checking what the individuals were told when their data was collected and that the data is accurate and up-to-date.
The Code also refers to the ICO’s detailed guidance on direct marketing, which indicates that a recipient of a database or list of personal data cannot rely on marketing consents obtained by another party to justify its own use of this personal data for direct marketing purposes unless the original consent specifically named the recipient who wishes to rely on the consent.
A code fit for a digital age
As previously mentioned, the ICO’s intention is for the Code to be “up-to-date on current cyber-related privacy issues and to provide a roadmap in anticipating future technological developments”.
The Code seeks to address items such as automated decision-making, which has become more prevalent since the previous code was published and touches on the difference between anonymised data and pseudonymised data.
Article 22 of the GDPR sets out the rules which apply to organisations carrying out automated decision-making, i.e. making a decision with no human influence on the outcome.
The Code makes it clear that a number of steps need to be taken in relation to any data sharing arrangement involving automated decision-making (e.g. where an organisation uses an algorithm to determine whether or not an individual’s personal data is shared with a third party recipient): a DPIA must be carried out; all requirements set out in Article 22 need to be met in relation to the processing (e.g. individuals should receive an explanation of their rights to challenge a decision and request human intervention); and measures need to be put in place to prevent errors, bias and discrimination in the system.
The difference between anonymised data and pseudonymised data
The Code distinguishes between anonymised and pseudonymised data, specifying that the Code applies to pseudonymised data (where the individual can be re-identified from data with the use of additional information) but does not extend to truly anonymised data (where the information cannot identify an individual in any way). An example of pseudonymised data would be a spreadsheet containing travel data with the names and addresses of relevant individuals redacted but which could be combined with other data available to the organisation to re-identify the individuals e.g. publicly available information such as social media account details or even an un-redacted version of the spreadsheet stored separately to the redacted version. Conversely, an example of anonymised information would be the publication of data at an aggregated level, which means that the data is stripped of any element that would identify any individuals.
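To make the distinction concrete, below is a minimal, purely illustrative Python sketch of pseudonymisation under the assumptions described in the Code: the records, field names and `pseudonymise` helper are hypothetical, not taken from any ICO guidance. Direct identifiers are swapped for random tokens, and the token-to-identity lookup table is held separately by the disclosing party; because that lookup table permits re-identification, the shared rows remain personal data rather than anonymised data.

```python
import secrets

# Hypothetical travel records; names and structure are illustrative only.
records = [
    {"name": "Alice Smith", "postcode": "SW1A 1AA", "trips": 12},
    {"name": "Bob Jones", "postcode": "EC2N 2DB", "trips": 7},
]

def pseudonymise(rows):
    """Replace direct identifiers with random tokens, keeping the
    token-to-identity lookup table separate from the shared data."""
    lookup = {}   # retained by the disclosing party only
    shared = []   # copy passed to the recipient
    for row in rows:
        token = secrets.token_hex(8)
        lookup[token] = {"name": row["name"], "postcode": row["postcode"]}
        shared.append({"id": token, "trips": row["trips"]})
    return shared, lookup

shared, lookup = pseudonymise(records)
```

Whether such shared rows should be treated as effectively anonymised in the hands of a recipient that never has access to the lookup table is one of the open questions discussed below.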
The ICO has recently published a blog post stating that they are gathering insight and feedback over the coming months before publishing further guidance on anonymisation and pseudonymisation.
It will be interesting to see whether or not the updated guidance addresses a number of perennial questions, namely:
- Does the same level of protection apply to pseudonymised data as traditional personal data under the GDPR?
- What steps need to be taken to change the status of pseudonymised data to anonymised data e.g. if an organisation destroys what they consider to be all additional information that would allow them to re-identify individuals in the pseudonymised data before sharing the data with a third party, does this render the data truly anonymised?
- If pseudonymised data is shared with a third party which has no access to the information to re-identify the individuals (which is kept confidentially by the disclosing party only), does the third party still need to treat the data as pseudonymised data if the data is effectively anonymised in the hands of this third party? Further, is the third party responsible for ensuring that the data is kept accurate and up to date when the third party does not have the information to identify the individuals without assistance from the disclosing party?
- What provisions need to be included in a DSA governing the sharing of pseudonymised data?
Whilst the Code is far from revolutionary and does not set out any guidance that deviates substantially from what has gone before, organisations should carefully review their data sharing practices against the Code and keep track of upcoming guidance and resources published by the ICO which relates to data sharing, as well as deadlines for enforcement.
The financial regulators have continued to increase their efforts to develop and protect financial data. The People’s Bank of China released new standards on enhancing the data capability of financial institutions. Further, several banks were penalized for violating data protection rules in relation to the processing of personal information.
MIIT has maintained its push for data protection in mobile apps. In addition to drafting a dedicated regulation on data protection for mobile apps, the MIIT and its local branches have run continuous enforcement campaigns against data privacy violations by mobile app operators.
1. New guidelines issued for financial industry data capacity building
On 9 February, the People’s Bank of China (PBC) issued the Guidelines for Financial Industry Data Capability Building. The Guidelines cover data strategy, data governance, data architecture, data specification, data protection, data quality, data application and data life cycle management capabilities. They aim to provide a basis for financial institutions to carry out data work, and to guide financial institutions in strengthening data strategic planning, focusing on data governance, and strengthening data security protection.
2. General Requirements for the Safety of Critical Cyber Equipment
On 20 February, the State Administration for Market Regulation and the Standardization Administration approved seven mandatory national standards (including a telecommunications mandatory national standard) and made one amendment to the General Requirements for Safety of Critical Cyber Equipment, which will come into force on 1 August 2021. These requirements (including security function and security protection requirements) serve as important standards for the implementation of the Cybersecurity Law relating to security requirements of critical cyber equipment. The security function requirements comprise 10 parts, which focus on ensuring and improving the security technology capabilities of devices: device identification security; redundant backup, recovery and abnormal detection; vulnerability and malicious program prevention; pre-installed software start-up and update security; user identification and authentication; access control security; log audit security; communication security; data security; and password requirements. Separately, the security protection requirements focus on standardizing the security capability of critical cyber equipment providers throughout the equipment life cycle.
3. Five draft standards on national information security technology released for public comments
On 3 February, the Secretariat of the National Information Security Standardization Technical Committee (NISSTC) issued two draft standards, on instant messaging services and express logistics, for public comments. Further, on 24 February, NISSTC issued three draft standards, on online shopping services, internet payment services and online audio and video services, for public comments. This series of standards sets requirements for the type, scope, methods, conditions and data security protection of data collection, storage, use, transfer and deletion. They also provide examples of data classification, together with guidance for operators to regulate data activities and for supervisory authorities and third-party assessment agencies to carry out supervision, management and assessments.
4. New rules on app governance to strengthen personal information protection to be published
On 7 February, the Ministry of Industry and Information Technology (MIIT) announced that it has been drafting interim provisions on the personal information protection of apps. The provisions will define the basic principles of informed consent and minimum necessary personal information protection. The principle of informed consent requires that, for app-related personal information processing activity, the processing entities inform users of the rules of personal information processing in a clear and easy-to-understand manner, and that users voluntarily give their express consent. The minimum necessary principle requires that personal information processing have a clear and reasonable purpose, and that it not exceed the scope of users’ consent or be unrelated to the service scenario.
1. Second group of apps in 2021 declared to be infringing users’ rights released (11th group in total)
On 5 February, MIIT published a notification on apps which violated user rights through the misuse of microphones, address books and photo albums. It noted that 26 apps had failed to take the necessary rectification measures, with a deadline for doing so of 10 February. If rectification is not made within the time limit, MIIT will organize and carry out relevant disposal work in accordance with laws and regulations. The issues with the apps included illegal handling of personal information on mobile phones, frequent and excessive requests for permissions, making it mandatory for users to accept the targeted push notification function, and inadequate disclosure to users of app information on the application distribution platform.
2. 37 apps in violation of user rights were removed from the app store
On 3 February, MIIT announced that it had removed 37 apps from the app store that violated user rights and failed to take necessary rectification measures. The removed apps collected personal information beyond the necessary scope and were involved in other issues that violated user rights. To recap, MIIT has carried out special rectification actions for two consecutive years against apps that illegally handled users' personal information. In addition, MIIT announced that it will strengthen rectification efforts by promoting the development of relevant standards, actively applying new technologies such as artificial intelligence and big data, and promoting the construction of a national app technology testing platform.
3. Guangdong Communications Administration ordered the rectification of 215 apps infringing users' rights
4. Two financial institutions fined for illegal processing of personal information
On 2 February, according to the administrative penalty information form released by the business management department of PBC, Beijing Guoxu Small Loan Co., Ltd. was fined 160,000 yuan for disclosing personal information without notifying the data subject. Further, Xinhan Bank (China) Co., Ltd. was fined 570,000 yuan for inquiring about personal credit information without consent, and the relevant person in charge was also fined 114,000 yuan.
5. ICBC Liaocheng branch was fined 36,000 yuan for data breach
On 18 February, according to the announcement of the PBC Liaocheng branch, the Liaocheng branch of ICBC was fined 36,000 yuan for inquiring about personal information without the consent of the data subject. Wang Hongqing, general manager of the bank card center and the person in charge, was also fined 8,000 yuan.
6. Liaoning Branch of Bank of China was fined for failing to collect and use consumers’ personal financial information as required
On 3 February, the administrative penalty information published by the Shenyang Branch of PBC showed that the Liaoning Branch of the Bank of China, which had five counts of data protection violations, was fined 1.147 million yuan. The violations included, among other things, failure to collect and use consumer personal financial information as required.
7. Qianbao Pay was punished for failing to keep customer identity information as required
On 24 February, according to the administrative penalty information publicity form published by the Chongqing Business Management Department of PBC, Chongqing Qianbao Technology Service Co., Ltd., which had 10 counts of data protection violations, was fined 8.68 million yuan. These violations included failure to keep customer identity information as required. The company's deputy general manager and chief compliance officer, together with five other relevant persons, were also penalised, with sanctions ranging from a warning to fines of between 50,000 and 135,000 yuan. The company's violations of personal information protection and data security arose because, in the course of ensuring the consistency of transaction information across the whole payment process, it had failed to perform its customer identification obligations and to retain the required customer identity information.
8. Maimai was convicted of infringement for sending text messages to unregistered users
On 7 February, Beijing Haidian District Court announced its judgment on Maimai's infringement of data privacy. In brief, it was found that the Maimai website, operated by Beijing Taoyou Tianxia Technology Development Co., Ltd., had sent text messages to users in the name of a friend without the users' permission. The messages disclosed the user's real name and stated that certain former colleagues had identified the user and that many friends were waiting for them to join via a link. When the user clicked the link, the webpage directed them to the registration page of the Maimai website. The user subsequently sued Maimai, claiming specific performance, including that the website cease the infringement of his privacy, permanently delete his personal information, and publish an apology statement in China Consumer News. The Beijing Haidian District Court found that the defendant had illegally obtained and retained the plaintiff's personal information, such as his mobile phone contact information, the personal information of the plaintiff's friends, and his resume. Further, Maimai had sent unsolicited messages for commercial gain to the plaintiff without consent, which infringed the plaintiff's right to peace and privacy. The court upheld all of the plaintiff's claims.
1. The National Information Security Standardization Technical Committee released the key action points for 2021
On 25 February, the National Information Security Standardization Technical Committee released its key action points for 2021, covering seven categories, including addressing the urgent needs of national network security work and improving the effective supply of standards. The document states that the Committee will further develop national standards for network security in the fields of the industrial Internet, blockchain, artificial intelligence and algorithms, the Internet of things and digital currency; prepare white papers or research reports on network security standardization topics such as 5G security, face recognition security and network security talent; and produce practical guidelines for data classification and grading and for data sharing security.
2. The National Equity Exchange and Quotations Company participated in the 11th joint emergency drill on network security
On 27 February, according to the Circular of the China Securities Regulatory Commission on the 11th joint emergency drill on network security in the securities and futures industry, the National Equity Exchange and Quotations Company participated in the joint emergency drill on network security. Other participants included China Securities Depository and Clearing Corporation Limited, Shenzhen Securities Communication Co., Ltd., China Securities Index Co., Ltd. and other host securities companies.
1. EDPB held the 45th plenary session and adopted a wide range of documents
2. EDPS published Opinions on the Digital Services Act and the Digital Markets Act
On 10 February, the European Data Protection Supervisor (EDPS) published Opinions on the Digital Services Act and the Digital Markets Act. The Opinions aim to protect individuals' fundamental rights, including data protection. For the Digital Services Act, the EDPS recommended additional measures to better protect individuals in relation to content moderation, online targeted advertising and the recommender systems used by online platforms such as social media and marketplaces. For the Digital Markets Act, it recommended regulating large online platforms to promote fair and open digital markets and the fair processing of personal data, and to foster competitive digital markets that provide individuals with additional choices.
3. Germany adopted draft law on data protection in telecommunications and telemedia
On 10 February, the German Federal Cabinet adopted the draft law on data protection and privacy in telecommunications and telemedia. It is intended to replace the existing provisions of the Telecommunications Act 2004 and the Telemedia Act 2007, and to implement the Directive on Privacy and Electronic Communications (2002/58/EC). The draft includes provisions on the confidentiality of communications, location data, caller ID display and suppression, end-user directories, technical and organisational precautions, consent for the storage of information on terminal equipment, and penalties.
4. Vietnam released the Draft Decree on Personal Data Protection for public comments
On 9 February, the Ministry of Public Security (MPS) of Vietnam released the second version of the Draft Decree on Personal Data Protection. It is intended to set more robust rules, with provisions on data subjects' specific rights, the cross-border transfer of data, and the processing of sensitive personal data. Violations may result in the temporary suspension of operations, the revocation of permission for cross-border data transfers, and monetary fines.
5. Virginia passed the Consumer Data Protection Act
On 2 March, the Virginia Consumer Data Protection Act (CDPA) was signed by the governor and will come into effect on 1 January 2023. The CDPA establishes rights for Virginia consumers to control how companies use individuals’ personal data. It stipulates that companies shall protect personal data in their possession and respond to consumers exercising their rights.
6. Danish Data Protection Authority published Quickguide for setting cookies
7. UK ICO published Toolkit for data analytics
On 17 February, the UK Information Commissioner's Office (ICO) published a Toolkit for organisations considering using data analytics. It aims to help organisations recognise the risks to individuals' rights and freedoms created by the use of data analytics, from the beginning of the data analytics project lifecycle. The Toolkit begins by asking questions to determine the applicable legal regime, covering lawfulness, accountability and governance, the data protection principles, and data subject rights. It then produces a report containing tailored advice for the specific data analytics project.