UK data regulator demands urgent clarity on racial bias in police facial-recognition systems
New tests reveal false-positive rates far higher for Black and Asian individuals, prompting a review and possible overhaul of law-enforcement biometric use in the UK
The United Kingdom’s independent data-protection authority has called for “urgent clarity” from the government after recent testing found that facial-recognition tools used by police disproportionately misidentify Black and Asian individuals compared with white subjects.
The call comes in the wake of a report by a national scientific laboratory showing error rates up to ten times higher among some minority groups than among white subjects.
The newly published evaluation indicated that the false-positive identification rate for Black women reached 9.9 per cent under certain settings, compared with as low as 0.04 per cent for white subjects.
False positives for Asian individuals and Black men were also significantly elevated.
The results apply to the retrospective facial-recognition technology run against the national police database, which authorities use to identify suspects from photographs.
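To make the headline figures concrete, the sketch below shows how a false-positive identification rate of the kind reported is typically computed: the share of searches for people who are not in the database that nonetheless return a match. The search counts and group labels are illustrative assumptions, not figures from the evaluation; only the two percentage rates come from the report described above.

```python
# Illustrative sketch of how a false-positive identification rate is computed.
# The counts and group labels below are hypothetical; only the resulting
# percentages (9.9% and 0.04%) correspond to rates cited in the report.

def false_positive_rate(false_matches: int, searches: int) -> float:
    """Fraction of probe searches (for people NOT in the database)
    that nonetheless returned a match."""
    return false_matches / searches

# Assume 10,000 probe searches per demographic group.
SEARCHES = 10_000
false_match_counts = {
    "Group A": 990,  # 990 / 10,000 -> 9.9%
    "Group B": 4,    # 4 / 10,000   -> 0.04%
}

for group, false_matches in false_match_counts.items():
    rate = false_positive_rate(false_matches, SEARCHES)
    print(f"{group}: {rate:.2%} false-positive identification rate")
```

Under these assumed numbers, the gap between the two groups is a factor of roughly 250, which is what makes disparities of this kind so consequential at national scale.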
In response to the findings, the data regulator said the police must urgently explain how they will safeguard civil rights and avoid unlawful discrimination.
It emphasised the need for transparency around both the scientific accuracy of the technology and its deployment in public-facing policing.
It indicated that enforcement action, including legally binding measures, remains a possible response, depending on what assurances are forthcoming.
The government, for its part, acknowledged the bias problem and said a new algorithm with “no statistically significant bias” is already being tested.
It added that it has asked the national police inspectorate and the forensic science regulator to conduct a review of law-enforcement use of biometric tools.
Policymakers also point to a recently launched public consultation aimed at drafting a comprehensive legal framework for facial recognition and related biometrics.
Supporters of the technology — including senior figures in law enforcement — continue to frame facial recognition as a powerful means to catch criminals quickly.
National-level deployment is already underway, with camera-equipped police vans and retrospective suspect-matching used in investigations.
Ministers have described the technology as a major breakthrough for tackling crime.
Yet civil-liberties campaigners and legal experts warn that the disparities revealed by the testing point to structural risks, particularly for ethnic minorities.
They argue that broad deployment before securing robust, enforceable safeguards risks entrenching unfair surveillance and undermining public confidence in policing.
As the consultation proceeds and regulators weigh their next steps, the future use of facial recognition by UK police stands at a critical juncture, with civil rights, scientific validity and public safety all bearing on the outcome.