Beautiful Virgin Islands

Friday, May 15, 2026

Uber faces legal action over 'racist' facial recognition software


A Black ex-Uber driver claims he was fired after the company's automated software failed to recognise him.

Uber is facing legal action for alleged indirect racial discrimination against a driver who claims he was sacked after facial recognition software used by the company failed to recognise him.

In an employment tribunal claim filed this week, the Black driver, who has asked not to be named, alleges that Uber's British subsidiary deactivated his account after the software failed to recognise him in two separate photographs, leaving him unable to work.

The Independent Workers' Union of Great Britain (IWGB), which filed the claim on the driver's behalf, told Euronews Next that it had been able to verify at least 35 similar dismissals among its members since the start of the COVID-19 pandemic, but warned that "hundreds if not thousands more" could be affected.

The IWGB is calling for Uber to stop using its "racist algorithm" and reinstate drivers unfairly dismissed as a result of the software's alleged mistakes.

Human backup


In a statement, Uber said that its facial recognition software was "designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel".

The company said that the system includes "robust human review to make sure that this algorithm is not making decisions about someone's livelihood in a vacuum, without oversight".

In the employment tribunal claim seen by Euronews Next, the driver, who worked for Uber from 2016 until being dismissed last April, alleges that he was never offered the option of a manual photo check.

Uber has used Real-Time ID Check in the UK since April 2020, after London transport regulator TfL raised concerns about the safety of the company's passengers.

The Microsoft-made software works by comparing a selfie taken by the driver as they start work to a photo the company has on file. Uber says all drivers can opt for either automated checks via an algorithm or manual checks by humans.

A help page on Uber's website claims that in the event Real-Time ID Check cannot verify a driver's photo, both images will be sent to a "specialist team" who will manually verify the driver's identity.
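The verification flow described above, an automated comparison with a human fallback on failure, can be sketched in simplified form. This is a minimal illustration, not Uber's or Microsoft's actual implementation: the embedding model, the threshold value, and the function names are all assumptions, and a real system would derive the vectors from a trained face-recognition model rather than take them as inputs.

```python
import math

# Hypothetical stand-in: a production system would use a face-embedding
# model to map each photo to a feature vector. Here the photos are
# assumed to be precomputed vectors, to illustrate only the decision logic.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.85  # illustrative value, not a vendor's real setting

def verify_driver(selfie_embedding, reference_embedding):
    """Return 'verified' on a confident match, else escalate to humans."""
    score = cosine_similarity(selfie_embedding, reference_embedding)
    if score >= MATCH_THRESHOLD:
        return "verified"
    # Per the process Uber describes, a failed automated check should be
    # routed to a specialist team for manual review, not auto-deactivation.
    return "manual_review"
```

The key design point, and the one at issue in the tribunal claim, is the fallback branch: if the automated match fails, the outcome should be escalation to human review rather than account deactivation.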

Euronews Next asked Uber if it handles the specialist identity checks itself, but the company did not respond by the time of publication.

Are algorithms racist?


Racial bias has long been an issue highlighted by studies of facial recognition technologies.

A 2018 paper by computer scientists Joy Buolamwini and Timnit Gebru found that the facial recognition technologies they studied - including Microsoft's - performed more accurately on lighter-skinned faces.

Every system they reviewed performed worst on darker-skinned female faces, a finding echoed by an independent 2019 study of facial recognition technologies by the US National Institute of Standards and Technology.

Microsoft president Brad Smith wrote in a 2018 blog post that "especially in its current state of development, certain uses of facial recognition technology increase the risk of decisions and, more generally, outcomes that are biased and, in some cases, in violation of laws prohibiting discrimination".

The risk of bias is particularly relevant in the case of Uber drivers in the UK.

A December 2020 TfL survey of private hire drivers in London found that over three-quarters of respondents who gave an answer were Black or Black British, Asian or Asian British, or of mixed race.

"Hundreds of drivers and couriers who served through the pandemic have lost their jobs without any due process or evidence of wrongdoing and this reflects the larger culture at Uber which treats its majority-BAME workers as disposable,” said Henry Chango Lopez, general secretary of the IWGB.

"Uber must urgently scrap this racist algorithm and reinstate all the drivers it has unfairly terminated".

Black Lives Matter UK, which is also supporting the case, said: "The gig economy which already creates immense precarity for Black key workers is now further exacerbated by this software that prevents them from working at all, purely based on the colour of their skin. Racist practices such as these must come to an end".
