ICO publishes summary of police facial recognition audit
The Information Commissioner's Office (ICO) has completed its first-ever data protection audit of UK police forces deploying facial recognition technologies (FRT), noting it is "encouraged" by its findings.
The ICO's audit, which investigated how South Wales Police and Gwent Police are using and protecting people's personal information when deploying facial recognition, marks the first time the data regulator has formally audited a UK police force for its use of the technology.
According to an executive summary published on 20 August, the scope of the facial recognition audit – which was agreed with the two police forces beforehand – centred on questions of necessity and proportionality (a key legal test for the deployment of new technologies), whether its design meets expectations around fairness and accuracy, and whether "the end-to-end process" is compliant with the UK's data protection rules.
"We are encouraged by the findings, which provide a high level of assurance that the processes and procedures currently in place at South Wales Police and Gwent Police are compliant with data protection law," said the deputy commissioner for regulatory policy, Emily Keaney, in a blog post.
"The forces made sure there was human oversight from trained staff to mitigate the risk of discrimination and ensure no decisions are solely automated, and a formal application process to assess the necessity and proportionality before each LFR deployment," she wrote.
The executive summary added that South Wales Police and Gwent Police have "comprehensively mapped" their data flows, can "demonstrate the lawful provenance" of the images used to generate biometric templates, and have appropriate data protection impact assessments (DPIAs) in place.
It further added that the data collected "is adequate, relevant and limited to what is necessary for its purpose", and that individuals are informed about its use "in a clear and accessible manner".
However, Keaney was clear that the audit only "serves as a snapshot in time" of how the technology is being used by the two police forces in question. "It does not give the green light to all police forces, but those wishing to deploy FRT can learn from the areas of assurance and areas for improvement revealed by the audit summary," she said.
Commenting on the audit, chief superintendent Tim Morgan of the joint South Wales and Gwent digital services division, said: "The level of oversight and independent scrutiny of facial recognition technology means that we are now in a stronger position than ever before to be able to demonstrate to the communities of South Wales and Gwent that our use of the technology is fair, legitimate, ethical and proportionate.
"We welcome the work of the Information Commissioner's Office audit, which provides us with independent assurance of the extent to which both forces are complying with data protection legislation."
He added: "It is important to remember that use of this technology has never resulted in a wrongful arrest in South Wales, and there have been no false alerts for several years as the technology and our understanding has developed."
Lack of detail
While the ICO provided a number of recommendations to the police forces, it did not give any specifics in the executive summary beyond the priority level of each recommendation and whether it applied to the forces' use of live or retrospective facial recognition (LFR or RFR).
For LFR, it said it made four "medium" and one "low" priority recommendations, while for RFR, it said it made six "medium" and four "low" priority recommendations. For each, it listed one "high" priority recommendation.
Computer Weekly contacted the ICO for more information about the recommendations, but received no response on this point.
Although the summary lists some "key areas for improvement" around data retention policies and the need to periodically review various internal procedures, key questions about the deployments are left unanswered by the ICO's published material on the audit.
For example, before they can deploy any facial recognition technology, UK police forces must ensure their deployments are "authorised by law", that the resulting interference with rights – such as the right to privacy – is undertaken for a legally "recognised" or "legitimate" aim, and that this interference is both necessary and proportionate. This must be assessed for each individual deployment of the tech.
However, beyond noting that processes are in place, no detail was provided by the ICO on how the police forces are assessing the necessity and proportionality of their deployments, or how these are assessed in the context of watchlist creation.
Although further detail on proportionality and necessity considerations is provided in South Wales Police's LFR DPIA, it is unclear if any of the ICO's recommendations concern this process.
While police forces using facial recognition have long maintained that their deployments are intelligence-led and focused only on locating individuals wanted for serious crimes, senior officers from the Metropolitan Police and South Wales Police previously admitted to a Lords committee in December 2023 that both forces select images for their watchlists based on crime categories attached to people's photos, rather than a context-specific assessment of the threat presented by a given individual.
Computer Weekly asked the ICO whether it is able to confirm if this is still the approach for selecting watchlist images at South Wales Police, as well as for details on how well police are assessing the proportionality and necessity of their deployments generally, but received no response on these points.
While the ICO summary claims the forces are able to demonstrate the "lawful provenance" of watchlist images, the regulator similarly did not respond to Computer Weekly's questions about what processes are in place to ensure that the millions of unlawfully held custody images in the Police National Database (PND) are not included in facial recognition watchlists.
Computer Weekly also asked why the ICO is only beginning to audit police facial recognition use now, given that it was first deployed by the Met in August 2016 and has been controversial since its inception.
"The ICO has played an active role in the regulation of FRT since its first use by the Met and South Wales Police around 10 years ago. We investigated the use of FRT by the Met and South Wales and Gwent police and produced an accompanying opinion in 2021. We intervened in the Bridges case on the side of the claimant. We have produced follow-up guidance on our expectations of police forces," said an ICO spokesperson.
"We are stepping up our supervision of AI [artificial intelligence] and biometric technologies – our new strategy includes a specific focus on the use of FRT by police forces. We are conducting an FRT in Policing project under our AI and biometrics strategy. Audits form a core part of this project, which aims to create clear regulatory expectations and scalable good practice that can influence the wider AI and biometrics landscape.
"Our recommendations in a given audit are context-specific, but any findings which have applicability to other police forces will be included in our Outcomes Report due in spring 2026, once we have completed the rest of the audits in this series."
EHRC joins judicial review
In mid-August 2025, the Equality and Human Rights Commission (EHRC) was granted permission to intervene in an upcoming judicial review of the Met Police's use of LFR technology, which it claims is being deployed unlawfully.
"The law is clear: everyone has the right to privacy, to freedom of expression and to freedom of assembly. These rights are essential for any democratic society," said EHRC chief executive John Kirkpatrick.
"As such, there must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards. We believe that the Metropolitan Police's current policy falls short of this standard."
He added: "The Met, and other forces using this technology, need to ensure they deploy it in ways which are consistent with the law and with human rights."
Writing in a blog about the EHRC joining the judicial review, Chris Pounder, director of data protection training firm Amberhawk, said that, in his view, the statement from Kirkpatrick is "precisely the kind of statement that should have been made by" information commissioner John Edwards.
"In addition, the ICO has stressed the need for FRT deployment 'with appropriate safeguards in place'. If he [Edwards] joined the judicial review process as a party, he could get judicial approval for these much-vaunted safeguards (which nobody has seen)," he wrote.
"Instead, the ICO sits on the fence whilst others determine whether or not current FRT processing by the Met Police is 'strictly necessary' for its law enforcement functions. The home secretary, for her part, has promised a code of practice which will contain an inevitable bias in favour of the deployment of FRT."
In an appearance before the Lords Justice and Home Affairs Committee on 8 July, home secretary Yvette Cooper confirmed the government is actively working with police forces and unspecified "stakeholders" to draw up a new governance framework for police facial recognition.
However, she did not comment on whether any new framework would be placed on a statutory footing.