
    Save Face: Building Britain's Surveillance State

    Britain's facial recognition consultation closes 12 February 2026. Do not take the government's case at face value. Here is what is at stake and how to fight back.

Privacy Analysis · Published · 50 min read · By TheVPNMatrix.com


    A sitting Home Secretary has told the British public that her vision for the criminal justice system is to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his Panopticon: "the eyes of the state can be on you at all times" (Mahmood, S., 2026). Not a warning. Not a dystopian novel. A stated policy aspiration from the minister responsible for policing in the United Kingdom, published in January 2026.

    While that sentence sits in the public record, the Home Office is running a consultation on whether to give police forces a statutory green light to scan every face in every crowd, search 200 million photographs held in passport, immigration, and driving licence databases, and build permanent facial recognition infrastructure across British cities (Home Office, 2025). The consultation closes on 12 February 2026. Twelve days from now. This is what you need to know, why it matters, and what you can do about it.

    Executive summary

    The UK Home Office launched a consultation on 4 December 2025 seeking to establish a legal framework for police use of facial recognition technology (FRT), biometrics, and related AI systems (Home Office, 2025). The consultation covers live facial recognition (LFR), retrospective facial recognition (RFR), operator-initiated facial recognition (OIFR), inferential technologies such as emotion and behaviour detection, and object recognition.

    The stated purpose is to give police "sufficient confidence" to use FRT "at significantly greater scale." The proposals include access to passport (58 million+ photographs), immigration (92 million images), and DVLA (52 million records) databases; a new consolidated oversight body; and a tiered authorisation framework. No prohibition on any form of facial recognition is proposed.

    The central finding: the UK is the only major Western democracy expanding mass biometric surveillance while the European Union has prohibited it (European Parliament and Council, 2024), American cities have banned it, and New Zealand, Australia, and Canada have imposed moratoriums or found existing uses unlawful. Parliament has never voted on facial recognition. No statute mentions it by name. The legal framework the consultation proposes to create would validate and entrench what has been built outside democratic oversight.

    Part I: The case for facial recognition

    Honesty requires confronting the strongest arguments for facial recognition technology in policing before examining the case against it. The numbers are not trivial, and the victims behind them are real.

    The arrest record

    Between September 2024 and September 2025, the Metropolitan Police recorded 962 arrests directly resulting from live facial recognition deployments (Metropolitan Police, 2025). These were not minor offences. Over a quarter related to violence against women and girls, including rape, domestic abuse, stalking, and coercive control. Others targeted knife crime, robbery, and individuals wanted on recall to prison.

    During the summer disorder of August 2024, retrospective facial recognition contributed to 127 arrests where suspects had been captured on CCTV but could not be identified through conventional means (Home Office, 2025). Officers ran footage through the Police National Database and matched faces within hours rather than the weeks traditional investigation would require.

    Efficiency gains

    The Home Office consultation document presents economic analysis showing personnel costs per arrest are approximately 25% lower when using LFR compared with traditional policing methods. Average identification time falls from 14 days to minutes. For resource-constrained forces, these are meaningful operational improvements (Home Office, 2025).

    Public opinion

    A Home Office survey of 3,920 respondents found 91% support for facial recognition use in terrorism cases, and roughly two-thirds support for general policing applications (Home Office, 2025). Minister for Policing Sarah Jones has described FRT as "the biggest breakthrough since DNA and fingerprints in catching criminals" (Jones, S., 2025).

    The genuine argument

    Strip away the political rhetoric and the strongest case for facial recognition is narrow and specific: it helps identify individuals already wanted for serious offences who would otherwise evade justice. The rape survivor whose attacker is identified at a football match. The domestic abuse victim whose violent ex-partner is flagged at a train station. The family of a murder victim whose suspect is matched from retrospective CCTV. These are real people with real claims on the state's duty to protect them.

    This article does not dismiss those claims. It argues that the framework proposed by the consultation does not confine facial recognition to this narrow, targeted use; that the political rhetoric has already moved far beyond it; and that the infrastructure being built serves a purpose that its architects have stated openly: comprehensive surveillance of public space. The question is not whether facial recognition can help catch criminals. It can. The question is whether the system being built will stop there.

    Part II: What is actually being proposed

    The Home Office consultation document, published 4 December 2025, contains 17 questions spanning biometric identification, inferential technologies, and object recognition (Home Office, 2025). Understanding what is proposed requires reading what the document says, what it does not say, and what is already happening without legislation.

    Scope: beyond facial recognition

    The consultation explicitly covers technologies that go far beyond matching faces to watchlists. It asks whether the new legal framework should encompass inferential technologies: AI systems that analyse body movements to detect emotions, identify "collapsed or injured persons," or detect "suicidal behaviour" (Home Office, 2025). Object recognition for identifying clothing, bags, vehicles, and other items is also within scope.

    Proposed authorisation tiers

Tier | Use case | Authorisation level
Routine | Standard policing operations, watchlist matching | Senior officer (Superintendent level)
Serious | Serious and organised crime, counter-terrorism | Chief Officer / Assistant Commissioner
Exceptional | Mass event screening, national emergencies | Home Secretary / senior minister

    Critically, no tier requires judicial pre-authorisation. The EU AI Act, by contrast, mandates prior judicial or independent administrative authorisation for any real-time biometric identification in public spaces (European Parliament and Council, 2024).

    The oversight body

    The consultation proposes consolidating the Biometrics and Surveillance Camera Commissioner with the Forensic Science Regulator into a single oversight body. The estimated annual running cost is between £2.2 million and £7.0 million (Home Office, 2025). This body would issue codes of practice, investigate complaints, and request information from law enforcement. It would not have power to pre-authorise or block deployments.

    Database access: the quiet expansion

    Perhaps the most consequential proposal is extending police access beyond custody images to government databases. The numbers are staggering:

    • Passport database: 58 million+ photographs of every UK passport holder
    • Immigration database: approximately 92 million images
    • DVLA records: 52 million driving licence photographs
    • Police National Database: approximately 20 million custody images

    The consultation frames this as a question for public input. But it also reveals that searches of the passport database have already been happening since 2019, increasing from 2 searches in 2020 to 417 in 2023 (Home Office, 2025). The Crime and Policing Bill 2025, Clause 95, would provide statutory authority for DVLA database access (UK Government, 2025). The infrastructure is being built in advance of the law that would authorise it. (For our analysis of how digital identity infrastructure connects to biometric expansion, see our Apple's Digital ID and the Global Identity Infrastructure report.)

    Part IV: The political context

    The expansion under Starmer

    The current trajectory was set in August 2024, when the Starmer government responded to post-Southport disorder by announcing a national violent disorder programme with facial recognition at its centre. Home Secretary Yvette Cooper positioned FRT as a "targeted tool" for catching "serious criminals": those wanted by courts, those who should be returned to prison, and those breaching sexual harm prevention orders (Cooper, Y., 2025).

    In August 2025, the Home Office confirmed the deployment of 10 LFR vans to seven police forces. The current plan: expansion to 50 vans, with permanent camera installations under active consideration (Home Office, 2025).

    Mahmood's Panopticon

    Home Secretary Shabana Mahmood described the government's approach as "the biggest reform to policing in two centuries." In an interview published January 2026, she stated:

    "When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his Panopticon. That is that the eyes of the state can be on you at all times."

    — Mahmood, S. (2026), Interview, January 2026

    This is analysed in full in Part IX. The significance here is political: a Home Secretary explicitly endorsing the principle of comprehensive state surveillance, not as a necessary evil constrained by rights, but as a governing aspiration.

    The consultation paradox

    The government is consulting the public on whether to create a legal framework for facial recognition while simultaneously expanding its use. Ten vans became 50 vans before the consultation opened. Passport database searches were already happening before the consultation asked whether they should be permitted. Permanent camera installations are planned before the consultation concludes.

    Minister for Policing Sarah Jones described FRT as "the biggest breakthrough since DNA and fingerprints" (Jones, S., 2025). This is the language of a settled policy decision, not an open consultation.

    The parliamentary opposition

    A cross-party coalition of 65 parliamentarians and 31 organisations has called for an "immediate stop" to facial recognition surveillance (Davis, D. et al., 2025). The coalition includes Conservative MP David Davis, Liberal Democrat leader Sir Ed Davey, and Green MP Caroline Lucas. Baroness Chakrabarti has warned of a "total surveillance society" and identified "challenges to privacy, challenges to freedom of assembly and association, and problems with race and sex discrimination" (Chakrabarti, S., 2025).

    Part V: The technical infrastructure

    Understanding what is being deployed requires examining the specific systems, their capabilities, and their limitations. The technical detail matters because it determines who gets flagged, who gets stopped, and who gets wrongly identified.

    Live Facial Recognition: NEC NeoFace

    UK police forces use NEC's NeoFace Watch system, built on the HD5 Face algorithm (NEC Corporation, 2024). The camera hardware is the Bosch MIC Starlight 7000 HD, capturing at 1080x1920 resolution (Bosch Security Systems, 2024). The system processes live video feeds from cameras mounted on police vans (and increasingly on permanent street infrastructure), extracting facial biometric data and comparing it against a pre-loaded watchlist in real time.

    The operational threshold is set at 0.6 (on a 0 to 1 similarity scale). NPL testing found true positive identification rates between 83% and 93% depending on conditions (NPL, 2023). False positive rates at the operational threshold are approximately 1 in 6,000 for a 10,000-person watchlist and 1 in 60,000 for a 1,000-person watchlist.
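The scale implied by these rates only becomes clear when they are multiplied by deployment volumes. As a hedged back-of-envelope sketch, assuming, as the two NPL figures suggest, that the per-face false positive rate scales roughly linearly with watchlist size (the function and the deployment numbers below are illustrative, not the NPL's own model):

```python
# Back-of-envelope model of expected LFR false alerts, anchored to the
# NPL-reported rate of ~1 in 6,000 per face for a 10,000-person
# watchlist. Illustrative only: assumes the false positive rate scales
# linearly with watchlist size.

def expected_false_alerts(faces_scanned: int, watchlist_size: int,
                          fp_rate_per_10k_watchlist: float = 1 / 6000) -> float:
    """Expected number of innocent people wrongly flagged."""
    per_face_rate = fp_rate_per_10k_watchlist * (watchlist_size / 10_000)
    return faces_scanned * per_face_rate

# One year of deployments (~7 million faces scanned) against a
# watchlist that has grown past 16,000 names:
alerts = expected_false_alerts(7_000_000, 16_000)
print(f"~{alerts:.0f} expected false alerts per year")  # → ~1867
```

On these assumptions, roughly 1,900 innocent people a year would trigger an alert and a potential stop, which is why the threshold setting and watchlist size are not technical minutiae but the core civil liberties parameters of the system.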

    Three deployment modes

Mode | Description | Database | Scale
LFR (Live) | Real-time van/camera scanning against watchlist | Pre-loaded watchlist | 7M+ faces scanned in one year
RFR (Retrospective) | CCTV/phone footage matched against PND | Police National Database (~20M images) | 50,000+ daily searches
OIFR (Operator-Initiated) | Street-level mobile app, photograph and search | PND + potentially passport/DVLA | Individual officer discretion

    Retrospective system: Cognitec FaceVACS

    The retrospective system used for Police National Database searches runs on Cognitec's FaceVACS-DBScan, version 5.5 (Cognitec Systems, 2024). This is notable: the current release is version 5.9, meaning UK police are using a system multiple iterations behind the vendor's current offering. The NPL report explicitly characterised its findings as "a snapshot of a single version" and cautioned against extrapolating to updated algorithms (NPL, 2025).

    Watchlist growth

    Police watchlists have grown from under 7,000 names in 2022 to over 16,000 in 2025. Senior officers confirmed to the House of Lords Justice and Home Affairs Committee that watchlist selection is based on "crime categories rather than context-specific threat assessment" (House of Lords Justice and Home Affairs Committee, 2025). Inclusion criteria remain broad and poorly defined. Children are not excluded: approximately 1,600 individuals aged 12 to 18 have appeared on watchlists.

    Part VI: National security and AI attack vectors

    The consultation document treats facial recognition as a policing tool. It does not address the national security implications of centralising biometric data from over 200 million photographs into systems accessible by thousands of officers across dozens of police forces. This omission is significant.

    Adversarial attacks

    Facial recognition systems are vulnerable to adversarial manipulation. Published research has demonstrated successful evasion using adversarial patches (printed patterns on clothing or accessories that cause misclassification), specialised makeup patterns that disrupt facial geometry extraction, and infrared LED arrays invisible to the human eye but visible to cameras, which can project false facial features. These are not theoretical: working demonstrations have been published in peer-reviewed venues and at security conferences.

    Database security

    The proposed database integration would create a composite identity graph linking passport photographs, DVLA records, immigration images, and police custody images. A successful breach of any connected system could expose biometric data for the majority of the UK adult population. Unlike passwords, biometric data cannot be reset after compromise.

    The cybersecurity paradox is clear: centralising biometric data increases both utility and catastrophic risk. The more databases are interconnected, the more valuable the system becomes to law enforcement; but the same interconnection makes the system a higher-value target for state-sponsored attackers and increases the blast radius of any breach.

    Supply chain and vendor risk

    NEC Corporation (Japan) is the sole vendor for live facial recognition across UK police forces (NEC Corporation, 2024). Single-vendor dependency creates supply chain risk: training data composition, algorithm updates, and vulnerability patches are controlled by a single foreign entity. NEC has refused to disclose its training data composition, a decision the ICO Deputy Commissioner described as "disappointing" (ICO Deputy Commissioner Keane, S., 2025).

    False flag and identity spoofing

    If adversarial techniques can cause misidentification, they can also cause false identification: spoofing one individual's biometric signature to match another. The implications for both criminal justice and national security are severe. An individual could be framed for presence at a crime scene, or an intelligence target could evade detection by triggering a false match to a benign identity.

    The predictive policing trajectory

    In April 2026, the Home Office will deploy a predictive policing AI prototype, funded with £4 million in public money (Home Office, 2025). The integration of facial recognition infrastructure with predictive analytics creates a system that does not merely identify known suspects but attempts to predict future criminal behaviour. The Crime and Policing Bill 2025 includes provisions for face covering bans at protests and "respect orders" that could be enforced through automated recognition (UK Government, 2025). The trajectory from identification to prediction to pre-emptive intervention is visible in the policy pipeline.

    Part VII: The bias problem

    The technical evidence on bias is extensive and consistent. It does not support the claim that facial recognition technology is ready for equitable deployment at scale.

    The 138-fold disparity

    The National Physical Laboratory's December 2025 retrospective report on the Cognitec algorithm used for PND searches found dramatic demographic disparities in false positive rates (NPL, 2025):

Demographic group | False positive rate | Relative to white baseline
White | 0.04% | Baseline
Asian | 4.0% | 100x
Black | 5.5% | 138x
Black female | 9.9% | 248x

    This algorithm processes over 25,000 searches monthly across all police forces. A 5.5% false positive rate for Black subjects means that for every 1,000 searches involving a Black person's face, 55 will generate an incorrect match. For Black women, the figure is 99 per 1,000.
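These per-1,000 figures and the multiples in the table follow directly from the NPL rates. A short sketch reproducing the arithmetic (note that 5.5% ÷ 0.04% is 137.5, reported as the 138-fold disparity):

```python
# Reproducing the arithmetic behind the NPL disparity figures:
# wrong matches per 1,000 searches, and the multiple relative to
# the white baseline rate.

rates = {                   # false positive rate by group (NPL, 2025)
    "White": 0.0004,        # 0.04%
    "Asian": 0.040,         # 4.0%
    "Black": 0.055,         # 5.5%
    "Black female": 0.099,  # 9.9%
}

baseline = rates["White"]
for group, rate in rates.items():
    per_1000 = rate * 1000       # incorrect matches per 1,000 searches
    multiple = rate / baseline   # disparity versus white baseline
    print(f"{group:13s} {per_1000:5.1f} per 1,000 {multiple:7.1f}x")
```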

    Live system performance

    The NEC algorithm used for live facial recognition showed no statistically significant demographic bias at the operational threshold of 0.6 (NPL, 2023). However, bias emerged at lower threshold settings. This is important because operational thresholds are set by individual forces, not by statute, and can be adjusted without external oversight.

    Metropolitan Police data from 2025 revealed that 80% of innocent people wrongly flagged by live facial recognition were Black (Metropolitan Police, 2025). Big Brother Watch's investigation found that 73.5% of all police LFR "matches" have been false positives: innocent people wrongly identified as suspects (Big Brother Watch, 2025).
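A false positive share that high is exactly what base rates predict: when millions of faces are scanned and only a handful belong to watchlisted individuals, even a small per-face error rate produces more false alerts than true ones. A hedged sketch with hypothetical deployment numbers (only the NPL accuracy figures come from the sources above):

```python
# Why most "matches" can be false even when the algorithm is accurate:
# a base-rate sketch. The deployment numbers are hypothetical; the
# accuracy figures are the NPL's.

faces_scanned = 1_000_000       # faces scanned in a period (assumed)
on_watchlist = 50               # wanted people actually present (assumed)
true_positive_rate = 0.89       # mid-range of the NPL 83-93% figures
false_positive_rate = 1 / 6000  # NPL rate, 10,000-person watchlist

true_alerts = on_watchlist * true_positive_rate
false_alerts = (faces_scanned - on_watchlist) * false_positive_rate

precision = true_alerts / (true_alerts + false_alerts)
print(f"True alerts:  {true_alerts:.0f}")
print(f"False alerts: {false_alerts:.0f}")
print(f"Share of alerts that are false: {1 - precision:.1%}")
```

Under these assumptions roughly four in five alerts are false, in the same range as the 73.5% Big Brother Watch observed. The lesson is that "the algorithm is accurate" and "most matches are wrong" can both be true at once.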

    The Thompson case

Shaun Thompson, a Black anti-knife crime community worker, was wrongly flagged by facial recognition outside London Bridge station in February 2024. He was detained for 30 minutes, and officers demanded his fingerprints despite his producing multiple forms of identification (Big Brother Watch, 2026). Thompson has said the experience was humiliating and frightening. He is now a co-claimant in Thompson & Carlo v Metropolitan Police, heard in January 2026 with judgment pending.

    The Equality and Human Rights Commission has intervened in the case, arguing that the Metropolitan Police's LFR policy is incompatible with Articles 8, 10, and 11 of the European Convention on Human Rights (Equality and Human Rights Commission, 2025).

    Training data opacity

NEC has refused to disclose the composition of its training data: the images used to develop the algorithm that decides who gets flagged (ICO Deputy Commissioner Keane, S., 2025). Without transparency on training data, independent assessment of bias sources is impossible. The algorithm is a black box processing millions of faces with documented racial disparities, built on data the public is not permitted to examine. The disproportionate impact on communities already subject to over-policing is a critical concern for anyone exercising their right to protest; see our Protest Privacy Guide for practical protective measures.

    Part VIII: Civil society and regulator responses

    The breadth and intensity of opposition from civil society, regulators, and academic institutions is unprecedented for a UK policing technology.

    Part IX: Mahmood's Panopticon

    When a Home Secretary invokes Jeremy Bentham's Panopticon as a model for government policy, the intellectual history demands examination. This is not a metaphor deployed by critics. It is a framework chosen by the architect of the policy herself.

    Bentham's design (1787)

    Jeremy Bentham's Panopticon, proposed in 1787, was a circular prison design built around a central observation tower (Bentham, J., 1787). The cells were arranged around the circumference, each visible from the tower but unable to see into it. The guards could observe any inmate at any time, but the inmates could never know whether they were being watched at any given moment. The architectural logic was precise: if prisoners must assume they are always observed, they regulate their own behaviour without the need for constant physical coercion. Surveillance becomes self-enforcing. The power of the system lies not in watching everyone all the time (which is impossible) but in making everyone believe they could be watched at any time (which is merely architectural).

    Why the Panopticon is a prison

    The Panopticon was designed for prisoners: people convicted of crimes, stripped of liberty by due process. It operates by eliminating the distinction between the watched and the unwatched; everyone is presumed subject to observation. Applying this model to public streets means treating every citizen as a prisoner: presumed subject to state scrutiny at all times, with no prior suspicion required. The presumption of innocence is reversed. The burden shifts from the state proving cause to observe you, to you proving you have nothing to hide. This is not a surveillance debate; it is a constitutional one. A Home Secretary modelling public space on a prison is proposing that the relationship between citizen and state should mirror the relationship between inmate and guard.

    Mahmood's invocation

    What makes Mahmood's statement extraordinary is the deliberate adoption of the panopticon not as critique but as aspiration (Mahmood, S., 2026). Government ministers typically describe facial recognition as a "targeted tool" used against "serious criminals." Mahmood's formulation abandons that framing entirely. "The eyes of the state can be on you at all times" is not a description of targeted policing. It is a description of comprehensive surveillance: universal, persistent, and explicitly modelled on a prison.

    The shift from metaphor to stated policy is significant. When civil liberties organisations compare surveillance to the Panopticon, they are making an analogy. When a Home Secretary says her goal is to achieve what Bentham tried to do, she is making a policy statement. A design conceived as a mechanism for controlling prisoners has been received by the policy class as a blueprint for governing citizens.

    From metaphor to mechanism

    Facial recognition technology is the technological realisation of what Bentham imagined. Van-mounted cameras scanning every face in a crowd. Permanent installations on street infrastructure. OIFR apps allowing any officer to photograph any person and search them against 200 million images. The conditions of the Panopticon are no longer architectural; they are computational. The tower is a server rack. The cells are public streets. The principle is identical: you do not know when you are being watched, but you must assume you always are.

    The chilling effect

    Empirical research confirms that awareness of surveillance changes behaviour. The Penney Wikipedia Study found a "statistically significant immediate decline in traffic" to terrorism-related articles following the Snowden revelations: evidence of self-censorship in response to surveillance awareness (Penney, J., 2016). PEN America's survey of 520 American writers found widespread self-censorship on subjects including "military affairs, the Middle East North Africa region, mass incarceration, drug policies" (PEN America, 2015).

    The European Court of Human Rights, in Glukhin v Russia, explicitly recognised that "the use of highly intrusive facial recognition technology to identify and arrest participants of peaceful protest actions could have a chilling effect in regard of the rights to freedom of expression and assembly" (European Court of Human Rights, 2023).

    Russia as cautionary tale

    Russia deployed facial recognition for protest surveillance; mass protests "practically disappeared" after deployment (European Court of Human Rights, 2023). This is the panopticon functioning as designed: not through the arrest of every protester, but through the knowledge that attendance at a protest means biometric identification and potential consequences. The surveillance does not need to be comprehensive to be effective. It needs to be believed to be comprehensive.

    Part X: Britain stands alone among democracies

    The UK's position as an outlier is not a rhetorical claim. It is empirically observable across the democratic world.

Jurisdiction | Policy on public FRT | Status
European Union | AI Act prohibits real-time public biometric ID; judicial pre-authorisation required for exceptions | Prohibited
San Francisco + 17 US cities | Municipal bans on government FRT use | Banned
New Zealand | Voluntary moratorium pending independent expert review | Moratorium
Australia | Privacy Commissioner ruled commercial FRT use violated privacy laws | Restricted
Canada | Privacy Commissioner found RCMP Clearview AI use unlawful | Found unlawful
United Kingdom | Active expansion: 50 vans, permanent cameras planned, 200M+ image database access | Expanding
China | 200-500M CCTV cameras integrated with social credit system | Mass deployment
Russia | Expanded from 5 to 62 regions since Ukraine invasion; protest identification | Mass deployment

    The Ada Lovelace Institute warned that the UK risks becoming a "regulatory sandbox" for surveillance technologies: a jurisdiction where companies and state agencies can deploy systems that would be prohibited in the EU or restricted in comparable democracies (Ada Lovelace Institute, 2025).

    The trajectory is clear. Among democracies, the UK is not merely failing to restrict facial recognition; it is actively expanding it. The only states pursuing comparable or greater expansion are authoritarian regimes.

    Part XI: How to respond

    The consultation closes at 11:59pm on 12 February 2026. The Home Office has indicated it gives more weight to unique, personalised responses than template submissions. What follows are the tools and arguments to make your response count.

    Responding to the consultation

    Submit your response

    Email: fr-consultation@homeoffice.gov.uk

    Post: Data & Identity Directorate, 2 Marsham Street, 1st Floor Peel Building, London SW1P 4DF

    Consultation response tool: Big Brother Watch provides template responses you can personalise at bigbrotherwatch.org.uk/campaigns/stop-facial-recognition/

    Key arguments to include

    • Over 7 million innocent people were scanned by police facial recognition in England and Wales in one year. This is mass surveillance, not targeted policing.
    • 80% of misidentifications by the Metropolitan Police affected Black individuals, demonstrating systemic racial discrimination (Metropolitan Police, 2025).
    • No laws specifically mention facial recognition; Parliament has never debated or approved its use.
    • The UK is the only major democracy expanding this technology while the EU has prohibited it (European Parliament and Council, 2024).
    • The Equality and Human Rights Commission believes current Metropolitan Police use is unlawful (Equality and Human Rights Commission, 2025).
    • The consultation proposes enabling searches of passport and immigration databases containing over 150 million photographs of people never suspected of any crime.

    Positions to advocate

    • A complete ban on Live Facial Recognition in public spaces
    • If LFR is permitted: safeguards matching the EU AI Act, including warrant requirements, limitation to specified serious crimes only, and prior judicial authorisation
    • Strict limitations on retrospective facial recognition using only lawfully held custody images
    • Prohibition of operator-initiated facial recognition (OIFR)
    • Independent pre-authorisation for any deployment, not merely post-hoc oversight
    • Exclusion of passport, immigration, and DVLA databases from police searches

    Write to your MP

    Find your MP: writetothem.com or members.parliament.uk

    Privacy International tool: privacyinternational.org provides a dedicated MP letter-writing tool for facial recognition concerns.

    Template opening (customise with your own experience):

    "I am writing as your constituent to raise serious concerns about the rapid expansion of live facial recognition technology by police forces without specific legislation. The Home Office consultation closes 12 February 2026. I urge you to scrutinise the proposed framework, question whether mass biometric surveillance should be permitted at all, and request information on whether this technology is being used in our constituency."

    Key points to raise with your MP:

    • No legislation governs live facial recognition; police are writing their own rules without parliamentary oversight
    • The EHRC has intervened in legal proceedings believing Met Police use is unlawful
    • Request your MP table parliamentary questions to the Home Secretary about what safeguards will be mandated
    • Ask whether facial recognition is being deployed in your constituency
    • Reference the cross-party coalition of 65 MPs already calling for a halt (Davis, D. et al., 2025)

    Active campaigns and legal challenges

• Thompson & Carlo v Metropolitan Police: heard January 2026, judgment pending. Big Brother Watch faces £70,000 in potential costs if unsuccessful. Crowdfunding at crowdfunder.co.uk/p/stop-facial-recognition-surveillance (Big Brother Watch, 2026).
    • 38 Degrees/Big Brother Watch petition: 54,000+ signatures calling on the Home Secretary and Met Commissioner to stop LFR (38 Degrees and Big Brother Watch, 2025).
    • Liberty petition: 80,000+ signatures opposing facial recognition at libertyhumanrights.org.uk (Liberty, 2025).
    • CrowdJustice "Face Off" fundraiser: supporting the legal challenge against the Metropolitan Police.


    Who to follow

    Organisations

    Big Brother Watch

    Campaign lead, legal challenges, consultation response tool

    bigbrotherwatch.org.uk

    Liberty UK

    Bridges case victory, ongoing advocacy, petition

    libertyhumanrights.org.uk

    Privacy International

    MP letter tool, international surveillance expertise

    privacyinternational.org

    Electronic Frontier Foundation

    Global facial recognition ban advocacy, technical analysis

    eff.org

    Ada Lovelace Institute

    Research and policy analysis on biometric governance

    adalovelaceinstitute.org

    StopWatch

    Racial disproportionality in policing and FRT

    stop-watch.org

    Open Rights Group

    Digital rights advocacy, Police Scotland monitoring

    openrightsgroup.org

    Key people

    • Silkie Carlo: Director, Big Brother Watch; co-claimant in Thompson & Carlo v Met Police
    • Matthew Feeney: Head of Tech and Innovation, Centre for Policy Studies
    • Ruth Ehrlich: Head of Policy, Liberty UK
    • Baroness Shami Chakrabarti: Former Liberty director; vocal House of Lords critic
    • Prof. Peter Fussey: University of Essex; led only independent UK police LFR evaluation
    • Francesca Whitelaw KC: Interim Biometrics and Surveillance Camera Commissioner

    Parliamentary

    The 65 signatories to the cross-party call for a halt represent a significant bloc. Key figures include David Davis (Conservative), Sir Ed Davey (Liberal Democrat leader), and Caroline Lucas (Green) (Davis, D. et al., 2025).

    Key takeaways

    1. The consultation is not neutral. The government is consulting on a framework for expansion, not on whether facial recognition should be used at all. Expansion is already underway: 50 vans deployed, permanent cameras planned, and the passport database already searched.

    2. No democratic mandate exists. Parliament has never voted on facial recognition. No statute mentions it. The legal framework consists of case law, data protection rules not designed for biometric surveillance, and non-binding guidance.

    3. Racial bias is documented and severe. Retrospective searches show a 138-fold disparity in false positive rates between white and Black subjects; 80% of wrongful live identifications have affected Black individuals; and 73.5% of all LFR matches have been false positives.

    4. The UK is an international outlier. It is the only major Western democracy expanding mass biometric surveillance while the EU prohibits it, American cities ban it, and comparable democracies restrict it or impose moratoriums.

    5. The Home Secretary has said the quiet part out loud. Mahmood's explicit invocation of the Panopticon as a policy aspiration shifts the debate from whether mass surveillance is an unintended consequence to whether it is the intended design.

    6. The consultation closes on 12 February 2026. Email fr-consultation@homeoffice.gov.uk. Write to your MP at writetothem.com. Support the legal challenges. Sign the petitions. The window for democratic input is narrow and closing.

    References

    1. 38 Degrees and Big Brother Watch (2025) 'Petition: Stop the Met Police Using Facial Recognition Surveillance', 38 Degrees. Available at: https://you.38degrees.org.uk/petitions/stop-the-met-police-using-facial-recognition-surveillance (Accessed: 1 February 2026).
    2. Ada Lovelace Institute (2025) 'Regulating Biometrics: Facial Recognition Technology in Law Enforcement', Ada Lovelace Institute. Available at: https://www.adalovelaceinstitute.org/report/regulating-biometrics/ (Accessed: 1 February 2026).
    3. Amnesty International et al. (2025) 'Joint Statement: Coalition of 130 Civil Society Groups Calling for Halt to Facial Recognition', Amnesty International. Available at: https://www.amnesty.org.uk/facial-recognition-coalition-statement (Accessed: 1 February 2026).
    4. Bentham, J. (1787) 'Panopticon; or, The Inspection-House', Dublin: Thomas Byrne. Available at: https://www.ucl.ac.uk/bentham-project/publications/panopticon (Accessed: 1 February 2026).
    5. Big Brother Watch (2025) 'Stop Facial Recognition Surveillance Campaign', Big Brother Watch. Available at: https://bigbrotherwatch.org.uk/campaigns/stop-facial-recognition/ (Accessed: 1 February 2026).
    6. Big Brother Watch (2026) 'Thompson & Carlo v Metropolitan Police: Legal Challenge Update', Big Brother Watch. Available at: https://bigbrotherwatch.org.uk/campaigns/stop-facial-recognition/legal-challenge/ (Accessed: 1 February 2026).
    7. Big Brother Watch (2025) 'Face Off: The Lawless Growth of Facial Recognition in UK Policing', Big Brother Watch. Available at: https://bigbrotherwatch.org.uk/wp-content/uploads/2022/05/Face-Off-final-1.pdf (Accessed: 1 February 2026).
    8. Bosch Security Systems (2024) 'MIC Starlight 7000 HD Camera Specifications', Bosch Security and Safety Systems. Available at: https://www.boschsecurity.com/xc/en/products/cameras/mic-cameras/ (Accessed: 1 February 2026).
    9. Chakrabarti, S. (2025) 'Statement on Facial Recognition Expansion', House of Lords. Available at: https://hansard.parliament.uk/lords/ (Accessed: 1 February 2026).
    10. Cognitec Systems (2024) 'FaceVACS-DBScan: Large-Scale Face Identification', Cognitec Systems GmbH. Available at: https://www.cognitec.com/facevacs-dbscan.html (Accessed: 1 February 2026).
    11. Cooper, Y. (2025) 'Statement on Facial Recognition in Policing', Hansard. Available at: https://hansard.parliament.uk/ (Accessed: 1 February 2026).
    12. Court of Appeal (2020) 'R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058', England and Wales Court of Appeal. Available at: https://www.judiciary.uk/judgments/r-bridges-v-chief-constable-of-south-wales-police/ (Accessed: 1 February 2026).
    13. Davis, D. et al. (2025) 'Cross-Party Parliamentary Letter Calling for Halt to Facial Recognition', UK Parliament. Available at: https://www.parliament.uk/ (Accessed: 1 February 2026).
    14. Electronic Frontier Foundation (2022) 'Ban Government Use of Face Recognition in the UK', Electronic Frontier Foundation. Available at: https://www.eff.org/deeplinks/2022/02/ban-government-use-face-recognition-uk (Accessed: 1 February 2026).
    15. Electronic Frontier Foundation (2025) 'Police Use of Face Recognition Continues to Wrack Up Real-World Harms', Electronic Frontier Foundation. Available at: https://www.eff.org/deeplinks/2025/01/police-use-face-recognition-harms (Accessed: 1 February 2026).
    16. Equality and Human Rights Commission (2025) 'Intervention in Thompson & Carlo v Metropolitan Police', EHRC. Available at: https://www.equalityhumanrights.com/our-work/news/facial-recognition-legal-intervention (Accessed: 1 February 2026).
    17. European Court of Human Rights (2023) 'Glukhin v Russia (Application No. 11519/20)', ECHR. Available at: https://hudoc.echr.coe.int/eng?i=001-226065 (Accessed: 1 February 2026).
    18. European Parliament and Council (2024) 'Regulation (EU) 2024/1689: The Artificial Intelligence Act', Official Journal of the European Union. Available at: https://eur-lex.europa.eu/eli/reg/2024/1689/oj (Accessed: 1 February 2026).
    19. Freedom News (2025) 'How the UK Is Shaping a Future of Precrime', Freedom News. Available at: https://freedomnews.org.uk/2025/uk-precrime-facial-recognition/ (Accessed: 1 February 2026).
    20. Fussey, P. and Murray, D. (2019) 'Independent Report on the London Metropolitan Police Service's Trial of Live Facial Recognition Technology', University of Essex Human Rights Centre. Available at: https://repository.essex.ac.uk/24946/ (Accessed: 1 February 2026).
    21. Home Office (2025) 'Consultation on a Legal Framework for the Use of Facial Recognition Technology and Biometrics in Law Enforcement', GOV.UK. Available at: https://www.gov.uk/government/consultations/legal-framework-for-using-facial-recognition-in-law-enforcement (Accessed: 1 February 2026).
    22. Home Office (2025) 'Public Attitudes to Facial Recognition Technology Survey (N=3,920)', GOV.UK. Available at: https://www.gov.uk/government/publications/public-attitudes-facial-recognition-survey (Accessed: 1 February 2026).
    23. Home Office (2025) 'Predictive Policing AI Prototype Funding Announcement', GOV.UK. Available at: https://www.gov.uk/government/news/predictive-policing-prototype (Accessed: 1 February 2026).
    24. House of Lords Justice and Home Affairs Committee (2025) 'Evidence Session: Facial Recognition Technology in Policing', UK Parliament. Available at: https://committees.parliament.uk/committee/640/justice-and-home-affairs-committee/ (Accessed: 1 February 2026).
    25. ICO (2025) 'Guidance on Data Protection and Live Facial Recognition Technology', Information Commissioner's Office. Available at: https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/facial-recognition-technology/ (Accessed: 1 February 2026).
    26. ICO (2025) 'Statement on NPL Facial Recognition Technology Report', Information Commissioner's Office. Available at: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2025/facial-recognition-npl-report/ (Accessed: 1 February 2026).
    27. ICO Deputy Commissioner Keane, S. (2025) 'Response to NEC Training Data Non-Disclosure', Information Commissioner's Office. Available at: https://ico.org.uk/ (Accessed: 1 February 2026).
    28. Jones, S. (2025) 'Statement on the Facial Recognition Consultation Launch', Home Office. Available at: https://www.gov.uk/government/news/new-legal-framework-for-facial-recognition-technology (Accessed: 1 February 2026).
    29. Liberty (2025) 'Response to the Home Secretary's Announcement on Facial Recognition Expansion', Liberty. Available at: https://www.libertyhumanrights.org.uk/issue/facial-recognition/ (Accessed: 1 February 2026).
    30. Mahmood, S. (2026) 'Interview: Home Secretary on Criminal Justice and Technology', Published interview, January 2026. Available at: https://www.gov.uk/government/people/shabana-mahmood (Accessed: 1 February 2026).
    31. Metropolitan Police (2025) 'Live Facial Recognition Deployment Data 2024-2025', Metropolitan Police. Available at: https://www.met.police.uk/advice/advice-and-information/facial-recognition/live-facial-recognition/ (Accessed: 1 February 2026).
    32. NEC Corporation (2024) 'NeoFace Watch: Live Facial Recognition for Public Safety', NEC Corporation. Available at: https://www.nec.com/en/global/solutions/biometrics/face/neoface-watch.html (Accessed: 1 February 2026).
    33. NPL (2023) 'Facial Recognition Technology in Law Enforcement: Equitability Study', National Physical Laboratory. Available at: https://www.npl.co.uk/facial-recognition-equitability (Accessed: 1 February 2026).
    34. NPL (2025) 'Retrospective Facial Recognition Equitability Report', National Physical Laboratory. Available at: https://www.npl.co.uk/facial-recognition-retrospective-report-2025 (Accessed: 1 February 2026).
    35. PEN America (2015) 'Global Chilling: The Impact of Mass Surveillance on International Writers', PEN America. Available at: https://pen.org/research-resources/global-chilling/ (Accessed: 1 February 2026).
    36. Penney, J. (2016) 'Chilling Effects: Online Surveillance and Wikipedia Use', Berkeley Technology Law Journal, 31(1), pp. 117-182. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2769645 (Accessed: 1 February 2026).
    37. Privacy International (2025) 'Toward Regulation: Facial Recognition in UK Policing', Privacy International. Available at: https://privacyinternational.org/long-read/facial-recognition-uk-policing (Accessed: 1 February 2026).
    38. StopWatch (2025) 'Facial Recognition and Racial Disproportionality in Policing', StopWatch. Available at: https://www.stop-watch.org/research-policy/facial-recognition/ (Accessed: 1 February 2026).
    39. UK Government (2025) 'Crime and Policing Bill 2025', UK Parliament. Available at: https://bills.parliament.uk/bills/3731 (Accessed: 1 February 2026).

