Technology

Met Police to ‘trial’ handheld facial-recognition tech


The Metropolitan Police are set to trial handheld facial-recognition technology that will enable officers to conduct biometric checks on the spot, the Mayor of London has confirmed.

Known as Operator-Initiated Facial Recognition (OIFR), the technology uses a mobile phone app to capture images of people’s faces and compare them to police databases in real time.

London Mayor Sadiq Khan said that OIFR would allow officers to check and confirm the details of any individuals stopped, instead of having to arrest them and take them to a police station. He added that the six-month pilot involves around 100 devices, with roughly £763,000 allocated to the programme.

However, the Met’s website still states at the time of publication that it “does not currently use” operator-initiated facial recognition, with the information only becoming public after Khan was pressed about the technology’s use by Green London Assembly member Zoe Garbett. She added during the meeting that this development is “alarming” and changes the relationship between the police and the public.

The OIFR trial revelation comes as the Home Office is still formulating a response to a consultation it held on a new legal framework for facial-recognition technology, and the High Court is still deliberating on a judicial review over whether the Met has used the live version of the technology lawfully. So far, only a limited number of forces have used OIFR technology, via a joint trial between South Wales, Gwent and Cheshire Police.

Commenting further on the Met’s planned trial, Garbett said it is “shocking” that the information about the trial was only divulged through a Mayor’s Question Time.

“Londoners deserve transparency when it comes to such a fundamental expansion of police powers. What’s even more concerning is the Met’s website explicitly says they don’t use this technology,” she said.

“We already have no clear legal framework for live facial recognition and now it’s being further expanded with handheld devices that allow officers to walk up and scan people’s faces. In Britain, no one has to identify themselves to police without good reason and this unregulated technology threatens that fundamental right.”

She added that with the government’s consultation only closing on 12 February 2026, pressing ahead with the expansion of facial recognition “makes a mockery of the process – what is the point of asking for public views if the expansion of surveillance technology continues regardless? The rapid and unchecked deployment of this technology must stop and robust protections must be put in place to safeguard our rights.”

Garbett previously called for the force to immediately halt its deployments of LFR in early February 2026, citing its disproportionate effects on Black and brown communities, a lack of specific legal powers dictating how police can use the tech, and the Met’s opacity around the true costs of deployment.

Khan said that both the Mayor’s Office for Policing and Crime and the London Policing Ethics Panel would oversee the use of the technology, ensuring its use was “right and proportionate”, and that because it was only a pilot, “it may not be rolled out”.

While Khan noted that OIFR-captured images are compared to custody records held by the Met, millions of custody images continue to be held unlawfully in the UK-wide Police National Database (PND), despite the High Court ruling in 2012 that images of unconvicted people must be deleted.

Khan previously told the London Assembly that if the Met were to deploy operator-initiated facial recognition, “I would expect the MPS to consult stakeholders, including the London policing ethics panel, as well as undertake careful consideration of legal, policy, community, data protection and ethical impacts.”

Lindsey Chiswick, the Met’s lead for facial recognition, who was present at the LFR judicial review proceedings on behalf of the force, said: “We are set to trial operator-initiated facial recognition, an innovative tool which will help our officers take photos to help confirm the identities of people quickly and accurately, avoiding the need to detain people for longer than needed.

“This will initially be rolled out to a small number of officers while we test the technology. If an individual has their photo taken and there is no match, then their biometric information will be deleted immediately.”

Jasleen Chaggar, a legal and policy officer at Big Brother Watch, told the Local Democracy Reporting Service (LDRS) there was no policy in place for the OIFR and that the police were using the public like “guinea pigs” to test their surveillance technology.

“Placing a tool in the hands of officers which can lift the veil of anonymity in public in a matter of seconds by simply pointing a phone at a face is a disaster for civil liberties,” she said, adding that the technology could be used to “unlock a huge array of personal information”.

She added: “The Met has a history of rolling out facial recognition so-called ‘pilots’ that quietly become permanent fixtures – they should immediately halt OIFR trials until the Home Office brings forward clear laws that strictly limit and safeguard against its everyday use.”

Earlier problematic trials

In September 2025, academics Karen Yeung and Wenlong Li published a comparative study of LFR trials by law enforcement agencies in London, Wales, Berlin and Nice.

They found that although “in-the-wild” testing is an important opportunity to gather information about how artificial intelligence (AI)-based systems such as LFR perform in real-world deployment environments, the trials conducted so far have failed to take into account the socio-technical impacts of the systems in use, or to generate clear evidence of their operational benefits.

They concluded that real-world testing of live facial recognition (LFR) systems by UK and European police is a largely ungoverned “Wild West”, where the technology is tested on local populations without adequate safeguards or oversight.

Highlighting the example of the Met’s LFR trials – conducted across 10 deployments between 2016 and 2020 – Yeung and Li said the characterisation of these tests as “trials” is “seriously questionable” given their resemblance to active police operations.

“Although described as ‘trials’ to publicly indicate that their use on these occasions did not necessarily reflect a decision to adopt and deploy FRT on a permanent basis, they were decidedly ‘real’ in the legal and social consequences for those whose faces triggered a match alert,” they wrote, adding that this means the trials were limited to the system’s operational performance in relation to a specific organisational outcome (making arrests), rather than attempting to evaluate its wider socio-technical processes and impacts.

Another July 2019 paper from the Human Rights, Big Data & Technology Project, based at the University of Essex Human Rights Centre – which marked the first independent review into trials of LFR technology by the Metropolitan Police – previously observed a discernible “presumption to intervene” among police officers using the technology.

According to authors Pete Fussey and Daragh Murray, this means the officers involved tended to act on the results of the system and engage individuals that it said matched the watchlist in use, even when they did not.

As a form of automation bias, the “presumption to intervene” is significant in a socio-technical sense, because in practice it risks exposing random members of the public to unwarranted or unnecessary police interactions.