Blog
“Recruitment Rewired”: what employers need to know about automated recruitment
Emily Carter
On 31 March 2026, the Information Commissioner’s Office (ICO) published its Report, “Recruitment Rewired: an update on the ICO’s work on the fair and responsible use of automation in recruitment”, setting out its findings and regulatory expectations for employers using AI‑enabled or automated tools in recruitment.
The Report is based on evidence gathered from over 30 employers between March 2025 and January 2026 and forms part of the ICO’s wider AI and biometrics strategy, under which automated decision‑making (ADM) in recruitment is a key regulatory priority.
The Report looks at how employers use automated tools across the recruitment lifecycle, including CV sifting and ranking, candidate scoring, online assessments and behavioural analysis, and shortlisting and filtering decisions.
It is worth noting that the Report sits alongside recent changes to the law under the Data (Use and Access) Act 2025, which provides more flexibility in the use of ADM. Whereas Article 22 of the UK GDPR largely prohibited the use of ADM subject to narrow exceptions, the position since 5 February 2026 is that ADM may be used, but subject to certain safeguards and a right to challenge its use.
This article considers the findings of the Report and important points arising for employers.
1. Employers underestimate when ADM is taking place
Many organisations believed a human was ‘in the loop’ when using automated tools in recruitment, but the ICO found that in many cases, human involvement was not meaningful. The ICO stressed that token reviews or rubber‑stamping of automated outputs are insufficient; if a human cannot genuinely influence the outcome, the decision will be treated as automated.
This matters because ADM is subject to specific legal protections. ADM is defined as a decision based solely on automated processing, with no meaningful human involvement, which has a legal or similarly significant effect on a person (likely to include recruitment decisions). It is distinguished from AI tools used solely to support a human decision-maker.
The ICO found that many employers believed they were using tools as ‘decision-support’ (and therefore were not applying the additional safeguards required for ADM use) but their use, in fact, fell within the definition of ADM.
2. Lack of transparency for candidates
The ICO identified widespread shortcomings in how employers explain the use of automated tools to candidates. The ICO found that employers often provided only general information about the AI tools used, or referred to a third-party privacy policy, or both.
3. Bias and fairness risks are not being managed effectively
The ICO found that many of the employers it spoke to had not fully assessed the fairness of their processing or considered whether outcomes resulted in bias or discrimination. However, some employers were found to undertake regular bias reviews of outcomes and to ensure that developers had undertaken fairness testing. The ICO encourages wider adoption of these measures.
4. Data protection impact assessments (DPIAs)
The Report states that not all employers had completed a full DPIA before processing personal information. Of the DPIAs reviewed by the ICO, many were found not to meet the requirements for DPIAs laid out in the UK GDPR. The ICO explained that employers need to assess the level of risk to people as a result of their processing more thoroughly and to take appropriate steps to mitigate those risks.
Although the use of ADM within recruitment has significant potential benefits to businesses with respect to the quality of the process, as well as the resources required, there are also significant compliance risks.
Failings in this area may expose employers to possible ICO enquiries or enforcement action, complaints and subject access requests from candidates, as well as discrimination claims. Online or media scrutiny may damage the reputation and recruitment strategy of the business.
Practical steps employers can take to manage these risks include the following:
1. Audit recruitment tools and ensure meaningful human involvement
Employers should map where automation or AI is used in recruitment and identify whether the decisions are, in fact, entirely automated with no ‘human in the loop’. They should not rely on labels such as ‘decision support’ without checking how the tool operates in practice.
For a tool to fall outside the ADM provisions of the UK GDPR, the ‘humans in the loop’ must have real authority to change outcomes. Recruitment staff should also be trained on how to question and override automated outputs where appropriate.
The reasons for concluding that a particular process does or does not amount to ADM should also be documented. If the use of the tools does amount to ADM, the additional legal safeguards in Article 22 UK GDPR must be complied with.
2. Improve transparency for candidates
Employers should review and update their recruitment privacy notices to clearly explain the use of automated tools (including a meaningful explanation of how they are used to make decisions), the significance and consequences for the individual, how accurate the tool is and what safeguards are in place. Candidates should also be informed of their rights to challenge decisions, and be able to do so in a timely manner.
3. Monitor fairness
AI tools should be tested for bias and discrimination before they are deployed and on an ongoing basis. Outcomes should be monitored by reference to specific protected characteristics (where lawful and proportionate) and employers should be prepared to pause or withdraw the use of those tools if they are found to generate unfair outcomes.
4. Carry out a compliant DPIA
Employers using automated recruitment tools should carefully assess whether to carry out a DPIA. Where the processing falls within the definition of ADM, they must carry out a DPIA.
Any DPIA should comply with all relevant requirements under UK GDPR and be kept under review.
5. Review lawful bases
Employers should review the lawful bases they rely on to process personal information. The ICO commented in its Report that many employers rely on the lawful bases of consent and/or performance of a contract, which are unlikely to be appropriate bases for processing personal information in most recruitment contexts.
In the Report, the ICO signals its intention to publish clearer regulatory expectations, write to organisations likely to be using ADM and to take enforcement action where employers fail to respect candidates’ rights.
Employers should therefore expect continued regulatory focus on AI and automation in the course of employment relationships (and particularly in recruitment) and should proactively address issues now rather than waiting for complaints or an investigation.
It is clear that, if used carefully, automated tools can deliver efficiency and consistency. The message from the ICO is not to stop using these tools but to do so responsibly, transparently and lawfully.
If you have any questions regarding this blog, please contact Kirsty Churm and Özlem Mehmet in our Employment team or Emily Carter in our Public Law team.
Emily is a partner within the Public Law team specialising in information law, inquests, inquiries and internal investigations. Her background in criminal and regulatory proceedings, both defending and prosecuting, equips her to fully support clients involved in complex investigative processes.
Kirsty is a Partner in the Employment Department. She advises both employers and senior employees on all aspects of employment law and employee relations issues, including contentious and non-contentious matters.
Özlem is a Senior Professional Support Lawyer in our Employment Team.