On 31 March 2026, the UK’s Information Commissioner’s Office (“ICO”) launched a public consultation on draft updated guidance on automated decision-making (“ADM”), including profiling (“Draft Guidance”) and simultaneously published a report on the use of ADM in recruitment (“Recruitment Report”).
The Draft Guidance is the ICO’s first detailed interpretation of the Data (Use and Access) Act’s (“DUAA”) changes to the UK GDPR’s ADM provisions, and the accompanying Recruitment Report is a sector-specific signal of how the ICO expects those rules to operate in practice.
1. Draft Updated Guidance on ADM, Including Profiling
According to the ICO, the DUAA’s amendments “reframe the ADM provisions from a prohibition with exceptions to a right of challenge with safeguards.” The Draft Guidance sets out the ICO’s key interpretations of how the revised UK GDPR rules should apply in practice, including with respect to:
- Scope of “decision” and what is “significant”: The Draft Guidance appears to take a broad view of what constitutes a “decision,” covering outcomes reached after analysis or consideration that may affect actions taken or engage a person’s rights. Notably, systems that merely apply rules already set by a human (for example, accepting or declining payment cards by type) fall outside the scope. The Draft Guidance also clarifies that whether a decision has a “significant” effect may depend on the context, with certain decisions having a significant impact on some people but not on others (for example, an automated decision to freeze someone’s bank account). Where that is the case, the Draft Guidance encourages organisations to apply the UK GDPR’s Article 22C safeguards to all decisions unless they are confident that they can accurately separate those who will experience legal or similarly significant effects from those who will not.
- Meaningful human involvement: The Draft Guidance reinforces the ICO’s long-standing view that, for processing not to be considered “solely” automated, human involvement must be active rather than tokenistic. The Guidance includes a non-exhaustive list of criteria that should be met for human review to be considered “meaningful.” For example, the list notes that the individual conducting the oversight should be trained and qualified to understand the system’s logic, outputs, limitations, and risks. Importantly, the Draft Guidance also makes clear that a human designing or building an automated system does not, in and of itself, constitute meaningful human involvement: because the design and development of the system occur before any real-world decisions are made, in the ICO’s view they cannot directly influence a specific outcome.
- Restrictions on ADM: The Guidance clarifies that the UK GDPR permits the use of ADM, subject to two restrictions. First, ADM involving special category data is generally prohibited and may only take place if one of the following conditions is met: (i) the decision is based entirely on the person’s explicit consent; or (ii) there is a substantial public interest (the types available are listed here) and the decision is either necessary for a contract between the person and the organisation or is required or authorised by law. Second, organisations cannot use the “recognised legitimate interest” concept introduced by the DUAA as the lawful basis for carrying out ADM. A “recognised legitimate interest” is a pre-approved purpose for using personal information in the public interest. The Guidance clarifies that “legitimate interest” and “recognised legitimate interest” are two separate legal bases, and that the latter cannot be used for ADM.
- ADM-specific information rights: The Draft Guidance identifies three points at which organisations must provide information about their ADM activities: (i) when they first collect people’s information (to comply with transparency provisions and the right to be informed); (ii) when people ask for their information (to comply with the right of access); and (iii) when they engage in ADM (to comply with ADM safeguards). Notably, the Guidance provides that organisations should not use “overly-technical or complex explanations of algorithms or how code works,” to avoid confusing people. This is particularly important when providing age-appropriate disclosures to children subject to ADM. Additional guidance from the ICO on explaining decisions made by AI can be found here.
The Guidance closes with some additional practical considerations, including the likely need to conduct a data protection impact assessment (“DPIA”) when engaging in ADM and the importance of having adequate mechanisms in place to diagnose quality issues and assess whether systems are working properly.
The consultation is open until 29 May 2026.
2. Recruitment Report
The Recruitment Report is worth reading alongside the Draft Guidance because it gives more concrete insight into how the ICO expects the revised ADM rules to apply in practice. The Report draws on voluntary engagement with more than 30 employers between March 2025 and January 2026, together with public perceptions research and the ICO’s prior audits of providers and developers of AI recruitment tools (which we discussed here).
In the Report’s executive summary, the ICO acknowledges that recruitment tools can play a legitimate role (in particular, automation can help employers process large volumes of applications quickly and consistently) and confirms that it supports innovation in the use of these tools. At the same time, the ICO highlights that public perceptions are mixed: the public can see value in tools such as CV filtering, but is more wary of profiling-based forms of automation, including online behavioural assessments, because of concerns about opacity, unfairness, and bias.
The Report’s central compliance point is that many employers may be underestimating when their recruitment tools are actually making decisions, rather than merely supporting human decision-makers. According to the ICO, many employers characterised their use of automated recruitment tools as decision-support, but the regulator’s evidence suggested that, in practice, there was often no meaningful human involvement and the tools were producing decisions with legal or similarly significant effects on candidates. The ICO therefore concludes that a wider range of safeguards may be required than currently appears to be in place, including those discussed in the rest of the Report.
In addition, the Report highlights that the ICO expects greater transparency about the use of ADM, not just general information about the tools and references to third-party privacy policies. According to the ICO, candidates should understand when automation is being used, how accurate the tool is, and what safeguards are available. The ICO also emphasises the need to include meaningful information about the logic involved, while acknowledging that this must be balanced against the risk that too much information could allow candidates to “game the system.” The relevant safeguards for ADM must also be in place, including the ability of candidates to make representations about and contest decisions made about them, and to request human intervention.
The Report also calls for more robust fairness and bias monitoring. The ICO’s findings suggest that employers may not be fully considering the specific circumstances and context of processing, or putting in place appropriate technical and organisational measures to ensure that processing is fair. However, the Report also highlights some good practices the ICO encountered, for example, hiring managers seeing only relevant information about candidates or required outputs (which may reduce the risk of unintended bias), and employers providing reasonable adjustments for people with disabilities who were unable to access automated recruitment methods (e.g., via a phone interview).
Finally, the Report points to broader governance expectations, including the expectation that employers have a valid lawful basis for processing personal information for ADM and conduct complete and thorough data protection impact assessments (DPIAs) where required (which, the Report notes, is likely when using ADM in recruitment).
As a follow-up, the ICO plans to update its guidance on recruitment and selection in 2026 to take account of the changes introduced by the DUAA.
* * *
The Draft Guidance is an important indicator of the ICO’s emerging interpretation of the DUAA’s ADM reforms. The Recruitment Report, meanwhile, shows that recruitment remains a priority use case for ICO scrutiny and offers a practical preview of the controls the regulator expects organisations to have in place.