The British Transport Police (BTP) will deploy live facial recognition (LFR) technology at key transport hubs across London as part of a six-month "trial".
The first deployment took place at London Bridge railway station on 11 February 2026, but the force says the future dates and locations of all BTP LFR operations will be published online before they take place.
Chief superintendent Chris Casey, BTP's senior officer overseeing the project, said: "The initiative follows a significant amount of research and planning, and forms part of BTP's commitment to using innovative technology to make the railways a hostile place for individuals wanted for serious criminal offences."
A BTP press release added that people who are not included on a watchlist cannot be identified, and made assurances about how data will be handled: "People who prefer not to enter the recognition zone will have alternative routes available, and images of anyone not on the authorised database will be deleted immediately and permanently."
The decision to deploy LFR at major transport hubs came while the Home Office's 10-week consultation on LFR – which ended on 12 February 2026 – was still ongoing. The consultation allowed interested parties and members of the public to share their views on how the controversial technology should be regulated.
It also follows home secretary Shabana Mahmood's late January 2026 announcement of sweeping police reforms, which will see the largest ever roll-out of facial recognition technology in the UK.
Human rights group Liberty, which won the first legal challenge against police use of the technology in August 2020, previously urged the government to halt the expansion of police LFR while the consultation is taking place.
In December 2025, the Home Office said: "Although a patchwork legal framework for police facial recognition exists, it does not give police themselves the confidence to use it at a significantly greater scale … nor does it consistently give the public the confidence that it will be used responsibly."
Responding to the BTP's announcement, Green London Assembly member Zoë Garbett told Computer Weekly: "With the government's consultation only just closed, pressing ahead with expansion makes a mockery of the process. What is the point of asking for public views if deployment continues regardless?"
On 10 February 2026, Garbett dismissed the police claim that LFR is a "precise" tool, highlighting how nearly every watchlist used is larger than the one before it.
Garbett also called for new safeguards to be implemented that she feels would protect Londoners from "escalating" biometric surveillance.
She issued four key policy recommendations in a report published the same day, which called for the Met Police to immediately stop using LFR technology and to publish the full financial and operational costs of its deployments.
"Live facial recognition technology subjects everyone to constant surveillance, which goes against the democratic principle that people should not be monitored unless there is suspicion of wrongdoing," she said.
Computer Weekly contacted the British Transport Police about its decision to deploy the technology before the Home Office's consultation had finished, as well as about the concerns raised in Garbett's report.
"A response to the consultation is due to be published within 12 weeks of the closing date, and we will of course engage with whatever the findings may be," said a BTP spokesperson.
They also pointed to comments made by BTP chief superintendent Casey, who stressed that the force is committed to using LFR ethically and in line with privacy safeguards: "Deployments will comply with all relevant legal and regulatory standards, and oversight will include internal governance and external engagement with ethics and independent advisory groups.
"When the pilot is complete, we will conduct a full evaluation to review outcomes, identify lessons learned and inform future planning. I encourage anyone who encounters our use of LFR when the trial begins to engage with us so we can make sure we are using it in the best way and helping to make our railways as safe as possible."
Questionable legal basis
In August 2020, the Court of Appeal found that South Wales Police had been deploying LFR unlawfully, on the grounds that there were insufficient constraints on the force's discretion over where LFR could be used and who could be placed on a watchlist.
Under the Artificial Intelligence Act in Europe, authorities' use of live facial recognition is generally prohibited and restricted only to exceptional circumstances, such as preventing an imminent terror attack.
Even then, safeguards apply, such as a clear legal basis in national law and judicial authorisation.
Racial bias
On 27 January, the High Court heard a landmark legal challenge to the Metropolitan Police's use of LFR cameras, brought by anti-knife campaigner Shaun Thompson, who was misidentified by the technology and threatened with arrest.
Ahead of the High Court hearing, Big Brother Watch director and claimant Silkie Carlo said: "We are completely out of step with the rest of Europe on live facial recognition. This is an opportunity for the court to uphold our democratic rights and instigate much-needed safeguards against intrusive AI-driven surveillance."
The High Court hearing was the first legal challenge in Europe brought by someone misidentified by LFR technology, which Thompson, a 39-year-old Black man, described as "stop and search on steroids".
In the hearing, Thompson and Carlo's lawyers argued that the Met's policy on where it can deploy LFR "confers far too broad a discretion on individual officers", who can designate areas as "crime hotspots" based on "operational experience as to future criminality, which is opaque and wholly subjective".
The Met, on the other hand, argued that because officers' discretion around LFR deployments is not unconstrained, the case does not raise a legality issue, asserting that "so long as the court is satisfied there is not unfettered discretion on the constable deciding where to locate LFR, [there] is not a maintainable legality challenge".
The Met added: "Because there are no elements of the policy that allow unfettered discretion for an officer to add whomever he or she wants to a watchlist, or place the LFR camera wherever he or she wishes … there is no maintainable attack on the policy as lacking the quality of law."
Research led by Garbett has shown that over half of all LFR deployments in 2024 took place in areas with higher-than-average Black populations, including Thornton Heath, Croydon (40%), Northumberland Park, Haringey (36%), and Deptford High Street, Lewisham (34%).
Overall, LFR is disproportionately used in areas with higher populations of Black, Asian and Mixed ethnic groups.
A 2019 study comparing 189 different algorithms found they were between 10 and 100 times more likely to misidentify Black and Asian faces compared with white faces. Black women had the highest rate of false positive matches.
In January 2023, Newham Council unanimously passed a motion to suspend the use of LFR throughout the borough until biometric and anti-discrimination safeguards are in place.
The motion highlighted LFR's potential to exacerbate racist outcomes in policing, given that Newham is the most ethnically diverse of all local authorities in England and Wales.
In April 2023, testing by the National Physical Laboratory found that the facial recognition algorithms used by the Met and South Wales Police – which will now be adopted by BTP – showed "no statistical significance between demographic performance" when certain settings are used.
However, critics said at the time that despite improvements in the algorithm's accuracy, LFR can still entrench patterns of discrimination, as surveillance has historically tended to be used against communities of colour.
Jake Hurfurt, investigations lead at Big Brother Watch, has previously said: "It's not just bias in the technology, it's how it's used. The watchlist is disproportionately targeted towards ethnic minorities, so you get different outcomes."
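The "settings" at issue are essentially the match threshold: an LFR system assigns each face pair a similarity score and flags a match only when that score clears a configured cut-off. The following minimal sketch, using entirely hypothetical scores and a made-up `false_positive_rate` helper (not any real LFR system's API), illustrates how raising the threshold suppresses false positives:

```python
# Hypothetical similarity scores for image pairs: (score, is_same_person).
# These numbers are illustrative only, not drawn from any real system.
pairs = [
    (0.92, True), (0.81, True), (0.58, True),    # genuine pairs
    (0.66, False), (0.41, False), (0.23, False), # impostor pairs
]

def false_positive_rate(threshold: float) -> float:
    """Fraction of impostor pairs wrongly flagged as a match at this threshold."""
    impostor_scores = [s for s, same in pairs if not same]
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

# A permissive threshold flags one impostor pair; a stricter one flags none,
# though in practice a stricter setting also misses some genuine matches.
print(false_positive_rate(0.5))  # 0.3333...
print(false_positive_rate(0.7))  # 0.0
```

Because false-positive rates can differ by demographic group, the threshold chosen directly determines whether those differences are observable in practice, which is why the NPL finding was conditional on the settings used.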
While police repeatedly claim that LFR is being used only against serious and violent offenders, watchlists frequently contain images of people wanted for drug, shoplifting or traffic offences, none of which legally meet this definition.
A 2025 paper by academics Karen Yeung and Wenlong Li highlighted "unresolved questions" about the legality of watchlist composition, specifically the "significance and seriousness" of the underlying offence used to justify a person's inclusion, and the "legitimacy of the reason why that person is 'wanted' by the police" in the first place.