Essex Police Pause Facial Recognition Use After Bias Study

Archived · Conflicting Facts
  • March 20, 2026 at 12:14 PM ET
  • Est. Read: 2 Mins

Key Takeaways

Essex Police have paused the use of live facial recognition (LFR) technology after a study found it was statistically more likely to correctly identify black people than people from other ethnic groups.

  • Essex Police paused LFR use following a study identifying potential bias.
  • The system correctly identified about half of watchlist targets but showed racial disparities.
  • Campaign group Big Brother Watch criticized the technology as authoritarian and ineffective.
  • Essex Police plan to resume LFR deployments after updating software and policies.

Essex Police have temporarily halted the use of live facial recognition (LFR) cameras following a study that revealed potential racial bias in the system's identification rates. According to reports from the BBC, The Guardian, and Sky News, the study found that the technology was statistically more likely to correctly identify black people than participants from other ethnic groups.

The pause comes after Essex Police commissioned two independent studies through the University of Cambridge. One study involved 188 volunteers acting as members of the public during a real police deployment, while the other analyzed more than 40 deployments between August 2024 and February 2025. The latter found that the system scanned approximately 1.3 million faces in public spaces, leading to 48 arrests and one confirmed mistaken intervention.

Essex Police stated that they decided to pause deployments while working with the software provider to review the results and update the algorithm. They have since revised their policies and procedures and expressed confidence in resuming LFR use as part of policing operations. The Home Office has announced plans to increase the number of LFR vans from 10 to 50, despite criticism from campaign groups such as Big Brother Watch, which described the technology as "authoritarian, inaccurate and ineffective."

The Information Commissioner's Office (ICO) has warned other police forces using similar technology to have mitigations in place, and has emphasized the importance of routine testing for bias and discriminatory outcomes to prevent unfairness. As the government looks to expand the use of LFR, questions over privacy and the volume of images captured remain key concerns.

How this summary was created

This summary synthesizes reporting from 4 independent publishers using AI. All sources are cited and linked below. NewsBalance is a news aggregator and media literacy tool, not a news publisher. AI-generated content may contain errors or inaccuracies — always verify important information with the original sources.

Read our full methodology →

Read the original reporting ↓