UK SWITCHES TO DECENTRALISED APPROACH TO CONTACT TRACING APP

In a move that marks a major U-turn for the Government, the UK’s proposals for a centralised contact tracing app have been abandoned in favour of a decentralised model. The new model is based on technology developed by Apple and Google and replaces the original app designed by NHSX, which has recently faced criticism over privacy concerns as well as technical issues and delays.

The UK follows Germany and Italy, which have already made the switch from centralised contact tracing apps to decentralised models. The UK’s health secretary, Matt Hancock, confirmed the news at the UK Government press conference last night.

To centralise or decentralise?

The UK Government had previously asserted the superiority of a centralised contact tracing model, but what exactly is the difference?

A ‘decentralised’ data model requires individual users to provide an anonymous ID to a centralised server. The user’s phone then downloads information from the centralised database and carries out contact matching and risk analysis on the phone itself before sending alerts to other users if necessary. Information on whether a user has come into contact with an infected person is shared with that user, but not with the central server.

In contrast, a ‘centralised’ data model would require users to provide not only their own anonymous ID to a centralised database, but also to send any codes collected from other phones. The central server then carries out contact matching and risk analysis using that information, deciding whether someone is ‘at risk’ and sending alerts accordingly.
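
The contrast can be illustrated with a simplified sketch (hypothetical Python, with plain-string IDs standing in for the rotating cryptographic identifiers that real protocols such as the Apple/Google framework use):

```python
class Server:
    """Toy central server used by both sketches (illustration only)."""
    def __init__(self):
        self.infected_ids = set()   # IDs reported by infected users
        self.contact_graph = {}     # only ever populated in the centralised model

    def download_infected_ids(self):
        return self.infected_ids

    def upload(self, user_id, contact_log):
        self.contact_graph[user_id] = set(contact_log)

    def am_i_at_risk(self, user_id):
        return bool(self.contact_graph.get(user_id, set()) & self.infected_ids)


def decentralised_check(my_contact_log, server):
    # Matching happens on the phone: only infected users' IDs are
    # downloaded, and the user's own contact log never leaves the device.
    infected = server.download_infected_ids()
    return any(cid in infected for cid in my_contact_log)


def centralised_check(my_id, my_contact_log, server):
    # Matching happens on the server: the user uploads every ID they
    # have collected, and the server carries out the risk analysis.
    server.upload(my_id, my_contact_log)
    return server.am_i_at_risk(my_id)
```

The privacy difference is visible in the data flows: in the decentralised version the server never learns who met whom, whereas the centralised version accumulates a contact graph on the server that has no counterpart in the decentralised design.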

The UK’s previous preference for the centralised model was based on the belief that storing data in a centralised manner would promote a more considered approach to contact tracing based on risk factors, and would enable epidemiologists to use valuable data on the spread of the virus for further research. However, the centralised model was criticised for potentially encroaching on privacy by using more data than necessary, and using the data for purposes other than contact tracing.

What next?

NHSX, the health service’s innovation arm, has confirmed that its current leaders will step back from the project, and that Simon Thompson, current chief product manager at Ocado, will take over management of the new app.

While this move will be welcomed by privacy campaigners and critics of the centralised model, concerns over the limitations of Bluetooth-enabled technology, as well as unease over allowing Apple and Google to control the UK’s response to the pandemic, may further obstruct the eventual rollout of a UK-wide contact tracing app. The additional delays resulting from this change in approach may also result in a lower than ideal take-up rate, with much of the population of the view that the time for contact tracing has passed given the current downward curve of the pandemic.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378

Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677

Katie Collins
Trainee Solicitor, London
+44 20 7466 2117

Morrisons wins Supreme Court appeal against finding of vicarious liability in data breach class action

Today the Supreme Court handed down its decision in Wm Morrisons Supermarkets Plc v Various Claimants [2020] UKSC 12, bringing to its conclusion a case which had the potential to alter significantly the data protection and cyber security litigation and class action landscape.

The headline news is that Morrisons has been found not to be vicariously liable for the actions of a rogue employee in leaking employee data to a publicly available file-sharing website.

The judgment will likely result in a collective sigh of relief for organisations who have been watching closely to track their potential liability for data breach class actions. However, it is important to note that the Morrisons case and judgment are highly fact-specific; the decision does not close the door on data breach class action compensation as a whole. Boardrooms should still be examining the technical and organisational measures they have in place to prevent personal data breaches in order to reduce the risk of regulatory enforcement and class actions.

Background

In 2015 a former Morrisons employee was found guilty of stealing and unlawfully sharing the personal data (including names, addresses, bank account details, salary and national insurance details) of almost 100,000 of Morrisons’ employees with members of the press and with data sharing websites. At the time, the ICO investigated and found no enforcement action was required with respect to Morrisons’ compliance with the Data Protection Act 1998 (“DPA”).

Nevertheless, around 5,000 Morrisons employees brought a claim for damages – irrespective of the fact that they had not suffered any financial loss. Instead, the employees claimed that Morrisons was vicariously liable for the criminal acts of its employee and for the resulting distress caused to the relevant employees.

For full details of the background to the High Court and Court of Appeal arguments, please see our previous Morrisons data protection briefing. Following the Court of Appeal’s decision, the Supreme Court granted leave to appeal, which led to a two-day hearing in November 2019 and, ultimately, the judgment handed down today.

Decision

The Supreme Court overturned the Court of Appeal’s decision, finding that Morrisons was not vicariously liable for the employee’s unlawful actions. Lord Reed gave the judgment of the court, with which Lady Hale, Lord Kerr, Lord Hodge and Lord Lloyd-Jones agreed.

Lord Reed concluded that the courts below had misunderstood the principles of vicarious liability, and in particular had misinterpreted the Supreme Court’s judgment on vicarious liability in Mohamud v WM Morrison Supermarkets plc [2016] UKSC 11. That judgment was not intended to change the law on vicarious liability, but rather to follow existing authority including, importantly, Dubai Aluminium Co Ltd v Salaam [2002] UKHL 48.

In Dubai Aluminium, Lord Nicholls identified the general principle that applies when the court considers the question of vicarious liability arising out of an employment relationship: the wrongful conduct must be so closely connected with acts the employee was authorised to do that, for the purposes of the liability of the employer to third parties, it may fairly and properly be regarded as done by the employee while acting in the ordinary course of his employment.

Applying that test, vicarious liability was not established in this case. The Supreme Court found that the employee’s wrongful conduct was not so closely connected with the acts he was authorised to do that, for the purposes of assessing Morrisons’ liability, it could fairly and properly be regarded as being done in the ordinary course of his employment.

Lord Reed referred to the distinction drawn by Lord Nicholls in Dubai Aluminium between cases “where the employee was engaged, however misguidedly, in furthering his employer’s business, and cases where the employee is engaged solely in pursuing his own interests: on a ‘frolic of his own’, in the language of the time-honoured catch phrase.”

In the present case, he said, it was clear that the employee was not engaged in furthering Morrisons’ business when he committed the wrongdoing in question. On the contrary he was pursuing a personal vendetta against the company, seeking revenge for disciplinary proceedings some months earlier. In those circumstances, the close connection test was not met.

Although it did not affect the outcome in light of the court’s findings, Lord Reed also considered Morrisons’ contention that the DPA excludes the imposition of vicarious liability in relation to data breaches under that Act and for the misuse of private information or breach of confidence – in effect, that the DPA is a statutory scheme which “owns the field” in this respect. The Supreme Court rejected this argument, stating that: “the imposition of a statutory liability upon a data controller is not inconsistent with the imposition of a common law vicarious liability upon his employer, either for the breaches of duties imposed by the DPA, or for breaches of duties arising under the common law or in equity.”

Implications of the decision

Data privacy implications

Although this case was argued under the repealed Data Protection Act 1998, it will likely result in a collective sigh of relief for organisations now subject to the GDPR, which expressly allows individuals to claim compensation for non-material damage (including distress) caused by non-compliance with the regulation. This is exactly the decision that many organisations wanted. In a world where companies can already be fined up to EUR 20 million or 4% of annual worldwide turnover for non-compliance with the GDPR, there are real fears about the potential for additional significant liability under class action claims for data breaches. Many organisations will be comforted by the steps the Court has now taken to reduce the likelihood of such claims succeeding.

However, it is important to caution against too much optimism. The Morrisons case was unusual in that the compensation claim was brought by individuals despite no regulatory action having been taken by the ICO at the time of the data leak itself. The decision is no guarantee that similar claims would fail where the regulator agrees that there has been a breach of the security requirements under the GDPR, as with some of the recent large-scale data breaches which are starting to result in significant fines from the ICO.

Despite the claim not succeeding in this instance, another, perhaps unintended, consequence of the case is that employers will be seriously considering what steps can be taken to mitigate the risk of rogue employees leaking personal data. This may result in increased employee monitoring in the workplace, with all the data privacy implications that entails.

Employment implications

The “close connection” test for rendering an employer vicariously liable for an employee’s actions (as described above) is well established. What has been less clear is how broadly that test should be applied to the facts. For example, it was suggested that the Supreme Court’s ruling in Mohamud v WM Morrison Supermarkets meant that, where an employee’s role involves interacting with customers in some way, an employer might be vicariously liable for the employee’s conduct towards customers, regardless of motive, even where the interaction is of a wholly different nature from that envisaged (for example, using force, or acting away from the usual work station).

Given there is no ‘reasonable steps’ defence against vicarious liability for torts, employers will welcome today’s ruling which rows back from that liberal interpretation of the test. The Supreme Court has made clear that the mere fact that the job provides the employee with “the opportunity to commit the wrongful act” is not sufficient to impose vicarious liability, nor is the fact that the wrongful act was the culmination of an unbroken temporal or causal chain of events regardless of the employee’s motive. Doing acts of the same kind as those which it is within the employee’s authority to do is not sufficient either.

The test is whether the wrongful act was sufficiently closely connected with what the employee was authorised to do. In Mohamud it was key that the employee was purporting to act on his employer’s business, threatening the customer not to return to the employer’s premises, and not acting to achieve some personal objective. In contrast, in the current case “the employee was not engaged in furthering his employer’s business when he committed the wrongdoing in question. On the contrary, he was pursuing a personal vendetta, seeking vengeance for the disciplinary proceedings some months earlier.” The ruling helpfully re-establishes that employers should not be liable for the acts of employees engaged on “frolics” of their own, pursuing their own objectives.

Cyber and data security implications

While the spectre of no-fault liability presented by Morrisons has fallen away, there is still a significant risk from fault-based claims. The ICO is imposing substantial fines on organisations for inadequate technical and organisational security measures, and claimants are cutting and pasting adverse findings from the ICO into claim forms. Organisations will have to make sure that the measures they take are appropriate, which will involve considering many factors, including the state of the art and the harm that may be caused if the security measures fail.

Class actions implications

Data breach class actions are on the rise in the UK and today’s judgment should be seen as a setback, not a roadblock. Funders and claimant firms are looking to build class actions in relation to data breaches even where there is no specific evidence of individual damage, seeking damages for the whole class for “distress” or a standardised claim for loss of access to data. Even a nominal damages award per claimant could add up to a significant sum across a class of tens or hundreds of thousands. Today’s judgment will not reverse that trend, but it will at least mean that companies which are themselves victims of data breaches by employees will not also face such claims on this basis alone.

The key question for the viability of those claims will be how much a bare data breach is “worth” by way of damages, even if there’s no other loss suffered by the victim. We will have to wait a bit longer now to find that out. The principles applied in misuse of private information cases may be helpful to the courts in considering this issue, given how little case law there is on damages in data protection claims.

Insurance implications

The judgment is good news for corporates and their insurers. The courts below had treated insurance as the answer to the objection that imposing liability would effectively help achieve the rogue employee’s aim, namely to harm Morrisons. Insurers may therefore also be breathing a sigh of relief, though only up to a point. Vicarious liability for data breaches by rogue employees is insurable in principle, but such claims are not doomsday for the insurance market: the main risk for corporates, and therefore insurers, remains direct liability claims and related losses, which continue on an upwards trajectory.

The good news for concerned corporates is that they can buy cyber insurance to cover data breach claims, whether for direct or vicarious liabilities, as well as related losses such as the costs of managing the incident, regulatory investigations and loss of profits if systems are impacted. However, risk transfer strategies vary between corporates, and that cover cannot necessarily be banked upon in all cases. The main challenge therefore remains, and is not answered here: how much cover would I need to buy for a reasonable worst case, and is that available at reasonable cost on a good wording? Given that the measure of damages is still unclear, this issue will continue to be wrestled with.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378

Tim Leaver
Partner, Employment, Pensions & Incentives, London
+44 20 7466 2305

Julian Copeman
Partner, Disputes, London
+44 20 7466 2168

Greig Anderson
Partner, Disputes, London
+44 20 7466 2229

Andrew Moir
Partner, Global Head of Cyber Security, London
+44 20 7466 2773

Kate Macmillan
Consultant, Disputes, London
+44 20 7466 3737

Lauren Hudson
Associate, Digital TMT & Data, London
+44 20 7466 2483

Anna Henderson
Professional Support Consultant, Employment, Pensions & Incentives, London
+44 20 7466 2819

Maura McIntosh
Professional Support Consultant, Disputes, London
+44 20 7466 2608

COVID-19 People: Data comes to the fore as outbreak continues (UK)

The COVID-19 outbreak is proving an interesting time to be a data protection practitioner. There seems to be a new article each day about the next exciting app which promises to use data to help manage the crisis.

This post focuses on two particular propositions that pose interesting data protection considerations. It also flags the wider issues that developers should bear in mind when trying to respond to this unprecedented crisis.

Contact Tracing

It was reported on 31 March 2020 that the UK government is set to develop some form of contact tracing app in the near future, following successful app-based contact tracing in Singapore and South Korea. Led by NHSX, the innovation arm of the NHS, the app will leverage Bluetooth to identify individuals who have been in close proximity to each other, storing a record of that contact and providing a mechanism through which an individual can be notified if they have been in close proximity to someone who tested positive for COVID-19. Given the anticipated use of Bluetooth, it is possible that NHSX may leverage Singapore’s TraceTogether app, which uses the same technology and the code for which was open-sourced by the Singapore government last week. TraceTogether was widely praised for collecting the bare minimum of data despite the extraordinary circumstances at hand.

The success of any tracing app will depend on a critical mass of users downloading it. Concerns are already being raised about whether private entities might require either employees or customers to use the app to show they have not been in contact with infected individuals. Success will also depend on a comprehensive testing regime to ensure that those who are symptomatic are tested quickly, so that notifications can be sent promptly. Similarly, swift testing may help avoid people being unduly required to quarantine themselves after contact with someone whose minor symptoms turn out not to be COVID-19.

It is interesting to note that initial statements from NHSX suggest that contacts will be stored on users’ phones, with notifications sent via the app after a suitable delay to avoid identification of the infected individual. It is not currently intended that the data would be sent regularly to a central authority, which may give comfort to people concerned about their privacy. Additionally, NHSX has indicated that it intends to appoint an ethics board to oversee this project.
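
The stated design, on-device storage plus a delayed notification, can be sketched as follows. The 12-hour delay figure and the field names are hypothetical, chosen purely to illustrate the mechanism:

```python
import datetime

NOTIFICATION_DELAY = datetime.timedelta(hours=12)  # hypothetical figure

class OnDeviceContactLog:
    """Toy on-device store (illustration only): contacts stay on the
    phone, and matches against newly reported infections are only
    surfaced after a delay, so the user cannot easily work out which
    recent contact was the infected individual."""

    def __init__(self):
        self._contacts = []  # (anonymous_id, time_of_contact) pairs

    def record(self, anonymous_id, time_of_contact):
        self._contacts.append((anonymous_id, time_of_contact))

    def due_notifications(self, infected_ids, now):
        # Only alert for matching contacts older than the delay window.
        return [
            (cid, ts) for cid, ts in self._contacts
            if cid in infected_ids and now - ts >= NOTIFICATION_DELAY
        ]
```

Because `due_notifications` runs on the phone against a locally held `_contacts` list, nothing in this design requires the contact history to be sent to a central authority.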

COVID Symptom Tracker

ZOE, a health and data science company, in conjunction with Tim Spector, a genetic epidemiology professor at King’s College London, has created an app called ‘COVID Symptom Tracker’ that allows users to self-report potential symptoms of COVID-19, even if they are feeling well. The aim is to use this data to track the progression of the virus in the UK and potentially identify high-risk areas.

At the time of writing the app has been downloaded over 1.5 million times and is listed in Apple’s top 10 free apps in the App Store. The app requires individuals to provide data including age, sex at birth, height, weight, postcode of residence, pre-existing health conditions, and habits such as smoking. Each day, users then report how they are feeling against a list of known symptoms. It appears from the app’s privacy policy that unanonymised personal data may be shared with the NHS or King’s College London, whilst data shared with other entities is given an anonymous identifier.

The app is based on consent, both to the data processing and to potential transfers of personal data to the US. Data is collected for purposes related to COVID-19, including: (i) better understanding the symptoms; (ii) tracking the spread of the virus; (iii) advancing scientific research into links between patient health and their response to infection with the virus; and (iv) potentially helping the NHS support sick individuals. Whilst at first glance this seems a reasonably narrow set of processing purposes, one could envisage a surprisingly broad range of activities falling within these categories, including specifically tracking individuals.

Data protection considerations

When it comes to processing personal data, the post-GDPR mantra is increasingly ‘Just because you can, doesn’t mean you should’. The principles of fairness, transparency, purpose limitation and data minimisation in particular will require serious consideration to ensure that the proposed data usage is justifiable.

Whilst the Secretary of State for Health & Social Care Matt Hancock recently tweeted that “the GDPR does not inhibit use of data for coronavirus response”, this may not necessarily be aligned with the ICO position that the GDPR is still in full force, despite the fact that the ICO may take a pragmatic approach where necessary. There are certainly lawful routes to using personal data to fight COVID-19, but this should be done based on clear reasoning and analysis.

With that in mind, the following key considerations may assist when evaluating whether or not to use personal data in the context of COVID-19:

  • be confident that you have an appropriate lawful basis for processing the personal data. Remember that both vital interests and substantial public interest are very high bars to satisfy. Likewise, legitimate interests always needs to be balanced against any potential impact on individuals’ rights and freedoms;
  • do not use personal data for extraneous purposes. You should aim to keep your processing purposes as narrow as possible for the stated aims, and be conscious that any attempt to use the dataset for non COVID-19 related reasons might be seen as acting in bad faith. Similarly, the collected data should be limited to what is strictly necessary for the processing purposes. Avoid the temptation to collect additional categories of personal data because they ‘may’ be useful in future;
  • the potential volume of data processing, and the categories of personal data anticipated, suggest that a data protection impact assessment (DPIA) should be undertaken in relation to many of the COVID-19 related apps. These should be completed carefully and not rushed for the sake of getting an app into the live environment;
  • consider who personal data is shared with, and whether sharing a full dataset is strictly necessary. It may be possible to anonymise personal data such that the recipient only receives fully anonymised data, which may help manage data subject concerns about where their personal data might go. Remember, however, that true anonymisation is difficult and that pseudonymisation alone does not take data outside of the scope of the GDPR;
  • given the potentially higher risk processing that is taking place, it is important that data subjects understand how their personal data will be used, and who it may be shared with, particularly where they are giving up unusual freedoms such as in the context of tracking. Data controllers should aim to go above and beyond to ensure their fair processing information is clear and easy to understand, so that individuals have good expectations of how their data will be used;
  • if and when relying on data subject consent for any processing, it is likewise important to ensure that the individuals understand exactly what they are consenting to. Now more than ever it is vital that consent is specific, freely given, informed and explicit when dealing with sensitive health data;
  • personal data collected in the context of COVID-19 is generally required for the specific aim of managing the outbreak of the virus or its effects. This may mean that it is not necessary or appropriate to retain this personal data once the virus has been controlled and life returns to normal, depending on what has been communicated to data subjects; and
  • holding larger volumes of personal data, or special category data, potentially represents a higher security risk and there may be increased cyber attacks on the dataset. Ensure that you have appropriate additional security measures in place where necessary.
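
The point above about pseudonymisation can be made concrete: replacing a name with a salted hash, as in the sketch below (hypothetical fields and salt), still leaves anyone holding the salt able to re-identify a known individual by recomputing the token, which is why such data remains personal data under the GDPR.

```python
import hashlib

def pseudonymise(record, salt):
    """Swap the direct identifier for a salted SHA-256 token.
    This is pseudonymisation, not anonymisation: anyone holding
    `salt` can recompute the token for a known name and re-link
    the record to the individual."""
    out = dict(record)
    out["name"] = hashlib.sha256(salt + record["name"].encode()).hexdigest()
    return out

record = {"name": "Jane Doe", "postcode": "SW1A 1AA", "symptoms": ["cough"]}
tokenised = pseudonymise(record, salt=b"secret-salt")

# The salt-holder can still single Jane out by recomputing the token:
assert tokenised["name"] == hashlib.sha256(b"secret-saltJane Doe").hexdigest()
```

Note also that the remaining fields (postcode, health data) may themselves allow re-identification when combined, which is one reason true anonymisation is harder than simply removing names.
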

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378

Hannah Brown
Associate, Digital TMT, Sourcing and Data, London
+44 20 7466 2677

COVID-19: WHEN PUBLIC HEALTH AND PRIVACY COLLIDE?

Summary

  • Governments and public authorities globally are requiring increased access to personal data of citizens in order to attempt to control and monitor the current spread of COVID-19.
  • The pandemic is generally recognised by data protection authorities as giving rise to extraordinary circumstances, although in Europe at least there are still requirements for processing to be necessary and proportionate, and for personal data to be adequately protected.
  • Governmental responses around the world appear, in some instances, to be creating a tension between public health on the one hand and privacy on the other, highlighting a new and possibly unexpected consequence of the current unprecedented times. When the crisis is over, nation-state approaches to privacy may need to be reconsidered and re-evaluated.

The Pan-European Perspective: the EDPB statement

On 16 March 2020 the European Data Protection Board (“EDPB”) released a statement on measures taken to contain and mitigate COVID-19. The EDPB stated that data protection rules “do not hinder measures taken in the fight against the coronavirus pandemic”. This includes the General Data Protection Regulation (“GDPR”) which enables personal data processing without obtaining consent where it is in the public interest or the vital interests of any natural person.

The EDPB recognised that “even in these exceptional times” every controller must ensure that personal data is protected, and that the lawful processing of personal data must be guaranteed. All processing must be in the public interest and must be proportionate to the legitimate aim pursued. Further, the general principles surrounding data processing including transparency still apply, except where necessary and proportionate for reasons of national security.

Tension between public health and privacy?

Around the world, governments have attempted to control the pandemic by harnessing new technology and its power to collect and analyse the increasingly large amounts of data, including personal data, which are generated on a daily basis in our societies. Whilst it is understandable that governments are seeking to use all means at their disposal in order to control the pandemic, there is a natural tension between the use of this data and the protection of personal privacy rights. Globally, a wide range of approaches have been taken and a variety of statements made, which either re-affirm a commitment to data privacy or, conversely, appear to back-track on previous approaches.

Some examples of this tension currently being reported on the global stage in response to the COVID-19 crisis include:

  • The approval of emergency measures in Israel which allow the use of technology developed for counterterrorism purposes to track infected persons by monitoring their mobile phones. This monitoring technique could be used to notify people who have come into contact with infected persons or enforce quarantine orders.
  • The use in China of facial recognition and thermal scanning technology, in combination with rules requiring people using public transport to register under their real names. Data is shared with the police and with media outlets, who report on patients’ travel history, which could include where they sat on a train or which compartment they boarded on the subway.
  • The GPS tracking app and SMS alert system used in South Korea where public authorities send text messages detailing the age, gender and recent movements of anyone recently diagnosed. This approach caused issues when speculation on the whereabouts of various infected persons broke out online.
  • The extensive powers put in place by the data protection authority in Italy until at least 30 July which allow civil protection personnel to process data including special category data and communications between employees.
  • The rules put in place by the US Department of Health and Human Services requiring airlines to collect and provide extensive data for passengers on certain flights.

Contact tracing

Contact tracing refers to the way that governments identify and monitor infected persons, which often means collecting location data. In the EU, national laws implementing the ePrivacy Directive only allow use of location data when made anonymous. Emergency legislation can be passed but only where ‘necessary, appropriate and proportionate within a democratic society’, and where adequate safeguards are put in place. In this respect, it will be interesting to monitor the responses of the various Member States’ governments to see if any such legislation is passed.

A number of countries worldwide are using location data to track and monitor anyone infected by the virus and those they have been in contact with. The USA, Iran, Singapore, Taiwan and Israel have all collected data from third parties or government-mandated apps which collect location data directly.

In addition, authorities in Germany, Ireland and Canada have indicated that they would be open to collecting and processing location data. When asked about the use of cell phone data in contact tracing investigations in Canada, the Ontario Premier commented that “everything’s on the table right now”. Ontario’s Information and Privacy Commissioner stated in response that they would not challenge such a decision as long as any measures were correlative to the outbreak, as public health officials had the power to ‘take extraordinary steps to keep the public safe’.

Limitations on data processing

Some public authorities are clear that powers to process personal data even as a response to the outbreak should not be unlimited. Data protection authorities in Ireland, France and Argentina have all released statements to the effect that public health authorities are entitled to collect and process health data without consent, but have stated that any measures taken must be necessary and proportionate and must not go beyond the management of suspected exposure to the virus. In particular, the Data Protection Commission in Ireland has made it clear that organisations must have regard to principles of transparency, security and accountability, and that only the minimum amount of data necessary should be processed.

Further, the Information Commissioner’s Office (the “ICO”) in the UK released a statement that, while data protection laws would not prevent data being shared as a result of the pandemic, and while the ICO recognises the “unprecedented challenge” of coronavirus, any excessive or unlawful data processing will still be prohibited. Importantly, however, the ICO has taken a proportionate approach: it accepts that companies are dealing with an unprecedented event and, whilst it cannot extend timescales which are enshrined in law, it will use its own channels to manage data subjects’ expectations and will take a measured and proportionate approach to any investigations. Finally, the ICO reiterates that, even though working styles are changing and more and more employees will be working from home, companies must still have regard to their technical and organisational security measures in order to protect personal data.

A gateway to increased sharing of personal data?

Another consequence of the outbreak is that many government authorities have required private companies to share personal data originally collected for commercial purposes. Officials in Singapore have requested location data from airlines, taxi companies and ride-sharing apps such as Grab, while the US has pressed airlines and hotels to provide extensive customer data even where Airlines for America has maintained that the requirements are “beyond the capabilities of airlines”.

What next?

It certainly seems that COVID-19 is having, and will continue to have, an interesting relationship with global data protection legislation and the right to privacy enshrined (often recently) in many laws around the world. Whilst data subjects do not yet appear to be challenging the governmental responses, it is important to note that many regulators, particularly in Europe, are reiterating the need to keep data protection in mind when considering responses to this pandemic. When the dust of the current crisis has settled, the impact on data protection and privacy may prove an interesting and unintended consequence of today’s unprecedented events.

Miriam Everett
Partner, Head of Data Protection and Privacy, London
+44 20 7466 2378

Lauren Hudson
Associate, London
+44 20 7466 2483

Katie Collins
Trainee Solicitor, London
+44 20 7466 2117

The Encryption debate is far from ‘going dark’

Shortly after the release of the communiqué from the most recent ministerial meetings of the ‘Five Countries’ security alliance — Australia, Canada, New Zealand, the UK and the US — at the end of July, we warned that the issue of the use of, and access to, encrypted services and technologies ‘remains front of mind for the alliance and further legislative or regulatory action in the Five Countries may follow’.

This week, it became clear that three of the Five Countries planned to follow through. On 4 October 2019, representatives of the Australian, UK and US governments planned to release:


Driving Data Compliance

Connected autonomous vehicles (CAVs) are increasingly capable of creating, collecting and processing a wealth of data. However, in order for vehicle manufacturers and CAV stakeholders to access and extract the value in such data, they must do so lawfully. This is especially true in relation to personal data which is governed in the EU (and beyond) by the General Data Protection Regulation (GDPR). This post explores at a high level how CAV stakeholders can ensure compliance with the GDPR, particularly in relation to CAVs which process personal data of vehicle drivers, owners and pedestrians.

ICO framework for AI proposed to help support innovative use of this emerging technology

At the end of March the Information Commissioner’s Office (ICO) published an outline of the proposed structure for its auditing framework for the use of personal data in an Artificial Intelligence (AI) context. Once finalised the framework has potential to help catalyse the use of this new emerging technology within the restrictions of data protection regulation. In particular, it is intended to support the ICO in assessing data controller compliance, as well as providing data protection and risk management guidance, in relation to AI.