The UK Government recently published its long-awaited response to its data reform consultation, ‘Data: A new direction’ (see our post on the consultation, here).

As many readers are aware, following Brexit, the UK Government has to walk a fine line between trying to reduce the compliance burden on organizations and retaining the ‘adequacy’ status that the European Commission granted in 2021 (see our post on the decision, here).

While we’ll have to wait to review the detail of the final legislation, we outline below some of the more eye-catching proposals for reform.

1. Scrapping certain cookie consent requirements but increasing fines

The most obvious headline-grabber is the decision to scrap the requirement under the UK Privacy and Electronic Communications Regulations (“PECR”) for organizations to obtain the consent of UK residents when placing non-essential cookies on their devices.

The consultation response indicates that, in the short term, the consent requirement will be removed for cookies used for a small number of “non-intrusive purposes” (including web analytics cookies). In the (indefinite) long term, the Government intends to remove the opt-in consent requirement entirely and move to an opt-out model, but the impact of this may be limited. The Government has clarified that the change will only happen once alternative solutions—such as browser-based solutions—are available, and the opt-out model will not apply to sites subject to the Age-Appropriate Design Code. The ongoing deprecation of third-party cookies for targeted advertising may also mean that the move to an opt-out model has a more limited practical impact.

Although the Government plans to relax some of the current cookie consent requirements, it will also expand the ICO's remit, granting it the same enforcement powers for infringements of PECR (which regulates cookies and direct marketing) as it has under the GDPR. This includes allowing the ICO to serve assessment notices, and increasing the maximum fines available under PECR to the higher of £17.5m or 4% of an organization's global turnover.
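For readers gauging exposure, the proposed PECR maximum is simply the greater of the two figures, mirroring the familiar GDPR formula. A minimal illustrative sketch (the function name and sample turnover figure are our own, not from the consultation response):

```python
def max_pecr_fine(global_turnover_gbp: float) -> float:
    """Proposed maximum PECR fine: the higher of a £17.5m fixed cap
    or 4% of an organization's global turnover."""
    return max(17_500_000.0, 0.04 * global_turnover_gbp)

# For a company with £1bn global turnover, 4% (£40m) exceeds the £17.5m floor.
print(max_pecr_fine(1_000_000_000))  # 40000000.0
```

For smaller organizations whose 4% figure falls below £17.5m, the fixed cap is the operative maximum.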

2. Granting controllers more scope to reject data subjects’ rights requests

The Government rowed back on its proposal to re-introduce fees for responding to data subjects’ rights requests in all cases. That said, it will proceed with proposals to amend Article 12(5) GDPR, which currently permits controllers to refuse to act on a data subject’s request, or to charge a reasonable fee, only where the request is “manifestly unfounded or excessive”. Going forward, controllers will be able to reject requests or charge fees where they are “vexatious or excessive”—bringing the regime in line with that set out in the Freedom of Information Act (FOIA). This is a lower standard than the one in the GDPR, but it does not mean that controllers will be able to reject data subjects’ requests out of hand—it requires a case-by-case assessment of whether the request would create an unjustified or disproportionate level of disruption, irritation or distress. Importantly, though, it may allow controllers to take account of all the circumstances, including the motivation for the request.

3. Establishing use cases where companies do not need to conduct a balancing test when relying on legitimate interests

Organizations have expressed concerns that conducting a balancing test when relying on legitimate interests to process personal data (i.e., determining whether their legitimate interests outweigh the rights of data subjects) takes significant time and effort. In response, the Government has announced that it will create a limited list of processing activities for which organizations can rely on the legitimate interests legal basis under Article 6(1)(f) GDPR without conducting a detailed balancing test (provided that they can comply with the other obligations set out in the GDPR). The full list remains to be seen, but the consultation response indicates it is likely to include processing activities carried out to prevent crime, or that are necessary for other important reasons of public interest.

4. Expanding the scope to use personal data for scientific research

The Government aims to ensure that UK data protection legislation permits and even encourages companies and research institutions to use personal data for scientific research. It will make a number of changes to try to clarify the rules, including introducing a statutory definition of “scientific research”, and making clearer the circumstances in which data can be used (and re-used) for scientific research purposes. The Government will not, however, introduce a new legal basis explicitly covering scientific research, as it proposed in the original consultation.

5. Removing specific accountability requirements

Whilst the Government recognizes the importance of accountability as a fundamental principle under the GDPR, it is of the view that the ‘one size fits all’ approach places a disproportionate burden on SMEs.

Therefore, the Government intends to remove specific requirements to:

  1. appoint a data protection officer (Articles 37-39 GDPR);
  2. carry out data protection impact assessments (“DPIAs”) (Article 35 GDPR); and
  3. maintain records of processing activities (“ROPs”) (Article 30 GDPR).

In their place, the Government will introduce a requirement for companies to have a ‘privacy management programme’. The aim is to allow companies to tailor their accountability programs more specifically to the processing they carry out. Among other things, the consultation response indicates such a programme might include the appointment of a suitable “senior individual” with responsibility for the programme, and implementing or leveraging existing risk management tools to help assess, identify, and mitigate privacy risks. In practice, the impact for many companies may be limited, as the consultation response clarifies that existing ROPs and DPIAs can be leveraged as part of the programme.

6. Permitting use of sensitive data for monitoring and correcting bias in AI systems

The Government published its National AI Strategy (available here) in September 2021, where it outlined its intention to make the UK an AI superpower. As part of this, the Government will permit organizations to process sensitive personal data to monitor and correct bias in AI technologies, subject to specific safeguards (e.g., limitations on re-use of this data).

7. Lowering the threshold for data to be “anonymized”

The Government proposes changes to what constitutes “anonymized” data. The GDPR sets a particularly high bar, requiring companies to process data in a way that means the data subject is no longer identifiable using any means reasonably available, and arguably requiring them to revisit anonymized datasets over time to confirm that the anonymization measures still meet this standard. The Government intends to lower this bar: organizations will need to take account of whether an individual is identifiable through “reasonable means” available to them at the time of the anonymization, or whether the controller or processor knows, or ought reasonably to know, that passing personal data on to another controller or processor is likely to result in re-identification. The Government response does not mention the ICO’s open consultation on anonymization in this section, so it appears that these changes will supersede the forthcoming ICO guidance.

8. Changes to the ICO

In addition to these substantive changes, the Government plans to restructure the ICO to bring it in line with other regulators. The ICO is currently a “corporation sole”—an incorporated entity represented by a single individual: the Information Commissioner. Under the Government’s proposals, the ICO would be renamed and have a Chair, a CEO, and a Board of Directors, and be subject to a Statutory Statement of Purposes set by Parliament. The Government will also require the ICO to take account of competition, economic growth, and innovation when carrying out its duties.

These proposals were controversial during the consultation process, and many campaigners argued that this would undermine the independence of the ICO. Such independence is a key requirement for receiving an adequacy decision from the EU.

Next Steps

The Government is expected to introduce legislation to enact these reforms during the current parliamentary session which ends in April 2023.

*     *     *

Covington’s Data Privacy and Cybersecurity team has extensive experience in advising on UK data privacy laws and is well positioned to provide advice or answer any questions you may have on the impact of the UK Government’s data reforms on your organization.

Mark Young

Mark Young is an experienced tech regulatory lawyer and a vice-chair of Covington’s Data Privacy and Cybersecurity Practice Group. He advises major global companies on their most challenging data privacy compliance matters and investigations. Mark also leads on EMEA cybersecurity matters at the firm. In these contexts, he has worked closely with some of the world’s leading technology and life sciences companies and other multinationals.

Mark has been recognized for several years in Chambers UK as “a trusted adviser – practical, results-oriented and an expert in the field;” “fast, thorough and responsive;” “extremely pragmatic in advice on risk;” “provides thoughtful, strategic guidance and is a pleasure to work with;” and has “great insight into the regulators.” According to the most recent edition (2024), “He’s extremely technologically sophisticated and advises on true issues of first impression, particularly in the field of AI.”

Drawing on over 15 years of experience, Mark specializes in:

  • Advising on potential exposure under GDPR and international data privacy laws in relation to innovative products and services that involve cutting-edge technology, e.g., AI, biometric data, and connected devices.
  • Providing practical guidance on novel uses of personal data, responding to individuals exercising rights, and data transfers, including advising on Binding Corporate Rules (BCRs) and compliance challenges following Brexit and Schrems II.
  • Helping clients respond to investigations by data protection regulators in the UK, EU and globally, and advising on potential follow-on litigation risks.
  • Counseling ad networks (demand and supply side), retailers, and other adtech companies on data privacy compliance relating to programmatic advertising, and providing strategic advice on complaints and claims in a range of jurisdictions.
  • Advising life sciences companies on industry-specific data privacy issues, including:
    • clinical trials and pharmacovigilance;
    • digital health products and services; and
    • engagement with healthcare professionals and marketing programs.
  • International conflict of law issues relating to white collar investigations and data privacy compliance (collecting data from employees and others, international transfers, etc.).
  • Advising various clients on the EU NIS2 Directive and UK NIS regulations and other cybersecurity-related regulations, particularly (i) cloud computing service providers, online marketplaces, social media networks, and other digital infrastructure and service providers, and (ii) medical device and pharma companies, and other manufacturers.
  • Helping a broad range of organizations prepare for and respond to cybersecurity incidents, including personal data breaches, IP and trade secret theft, ransomware, insider threats, supply chain incidents, and state-sponsored attacks. Mark’s incident response expertise includes:
    • supervising technical investigations and providing updates to company boards and leaders;
    • advising on PR and related legal risks following an incident;
    • engaging with law enforcement and government agencies; and
    • advising on notification obligations and other legal risks, and representing clients before regulators around the world.
  • Advising clients on risks and potential liabilities in relation to corporate transactions, especially involving companies that process significant volumes of personal data (e.g., in the adtech, digital identity/anti-fraud, and social network sectors).
  • Providing strategic advice and advocacy on a range of UK and EU technology law reform issues including data privacy, cybersecurity, ecommerce, eID and trust services, and software-related proposals.
  • Representing clients in connection with references to the Court of Justice of the EU.
Jasmine Agyekum

Jasmine Agyekum advises clients on a broad range of technology, AI, data protection, privacy and cybersecurity issues. She focuses her practice on providing practical and strategic advice on compliance with the EU and UK General Data Protection Regulations (GDPR), EU e-Privacy laws and the UK Data Protection Act. Jasmine also advises on a variety of policy proposals and developments in Europe, including on the EU’s proposed Data Governance Act and AI Regulation.

Jasmine’s experience includes:

  • Advising a leading technology company on GDPR compliance in connection with the launch of an ad-supported video-on-demand and live streaming service.
  • Advising global technology companies on the territorial application of the GDPR and EU Member State data localization laws.
  • Representing clients in numerous industries, including life sciences, consumer products, digital health and technology, and gaming, in connection with privacy due diligence in cross-border corporate mergers & acquisitions.
  • Advising clients on responding to data breaches and security incidents, including rapid incident response planning and notifications to data protection authorities and data subjects.

Jasmine’s pro bono work includes providing data protection advice to a mental health charity in connection with its launch of a directory of mental health and wellbeing support for children, and working with a social mobility non-profit organization focused on widening access to opportunities in the law for individuals from various socio-economic backgrounds.

Paul Maynard

Paul Maynard is special counsel in the technology regulatory group in the London office. He focuses on advising clients on all aspects of UK and European privacy and cybersecurity law relating to complex and innovative technologies such as adtech, cloud computing and online platforms. He also advises clients on how to respond to law enforcement demands, particularly where such demands are made across borders.

Paul advises emerging and established companies in various sectors, including online retail, software and education technology. His practice covers advice on new legislative proposals, for example on e-privacy and cross-border law enforcement access to data; advice on existing but rapidly-changing rules, such as the GDPR and cross-border data transfer rules; and advice on regulatory investigations in cases of alleged non-compliance, including in relation to online advertising and cybersecurity.