Beautiful Virgin Islands

Wednesday, May 13, 2026

Apple walks back plans for new child safety tools after privacy backlash


Apple made headlines - and not the good kind - last month when it announced a test of a new tool aimed at combating child exploitation. Critics quickly decried the feature's potential privacy implications, and now Apple is hitting pause before moving forward with its plans.
On Friday, the company said it will pause testing the tool in order to gather more feedback and make improvements.

The plan centers on a new system that, if eventually launched, would check iOS devices and iCloud photos for child abuse imagery. It also includes an opt-in feature that would warn minors and their parents about sexually explicit image attachments sent or received in iMessage and blur them.

Apple's announcement last month that it would begin testing the tool fit with a recent increased focus on protecting children among tech companies — but it was light on specific details and was swiftly met with outraged tweets, critical headlines and calls for more information.

So on Friday, Apple (AAPL) said it would put the brakes on implementing the features.

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," the company said. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

In a series of press calls aiming to explain the planned tool last month, Apple stressed that consumers' privacy would be protected because the tool would turn photos on iPhones and iPads into unreadable hashes, or complex numbers, stored on user devices. Those numbers would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the pictures were uploaded to Apple's iCloud storage service. (Apple later said other organizations would be involved in addition to NCMEC.)

Only after a certain number of hashes matched the NCMEC's photos would Apple's review team be alerted so that it could decrypt the information, disable the user's account and alert NCMEC, which could inform law enforcement about the existence of potentially abusive images.
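The flow Apple described can be sketched, very loosely, as follows. This is a toy illustration under stated assumptions: the real system used perceptual "NeuralHash" values (designed so visually similar images produce the same hash), a database supplied by NCMEC and other organizations, and an undisclosed match threshold; here those are all stood in by SHA-256 hashes of placeholder bytes and an arbitrary threshold of 3.

```python
import hashlib

def hash_photo(photo_bytes: bytes) -> str:
    # Simplified stand-in: a cryptographic SHA-256 digest. Apple's actual
    # system used a perceptual hash ("NeuralHash"), not a cryptographic one.
    return hashlib.sha256(photo_bytes).hexdigest()

# Hypothetical database of hashes of known abuse imagery. In the real system
# these values would be provided by NCMEC; the bytes here are placeholders.
KNOWN_HASHES = {
    hash_photo(b"known-image-1"),
    hash_photo(b"known-image-2"),
    hash_photo(b"known-image-3"),
}

# Number of matches required before any human review is triggered.
# Apple did not publicly specify the real threshold; 3 is illustrative.
MATCH_THRESHOLD = 3

def should_alert(uploaded_photos: list) -> bool:
    """Return True only once the count of matching hashes crosses the
    threshold -- the point at which Apple said its review team would be
    alerted for the photos uploaded to iCloud."""
    matches = sum(1 for p in uploaded_photos if hash_photo(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD

# Two matches stay below the threshold; three matches trigger an alert.
print(should_alert([b"known-image-1", b"known-image-2", b"vacation.jpg"]))   # False
print(should_alert([b"known-image-1", b"known-image-2", b"known-image-3"]))  # True
```

The threshold is the privacy-relevant design choice in this description: no single match reveals anything to Apple, and human review only becomes possible after multiple independent matches accumulate.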

Many child safety and security experts praised the intent of the plan, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But they also said the efforts presented potential privacy concerns.

"When people hear that Apple is 'searching' for child sexual abuse materials (CSAM) on end user phones they immediately jump to thoughts of Big Brother and '1984,'" Ryan O'Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. "This is a very nuanced issue and one that on its face can seem quite scary or intrusive."

Critics of the plan applauded Apple's decision to pause the test.

Digital rights group Fight for the Future called the tool a threat to "privacy, security, democracy, and freedom," and called on Apple to shelve it permanently.

"Apple's plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history," Fight for the Future Director Evan Greer said in a statement. "Technologically, this is the equivalent of installing malware on millions of people's devices — malware that can be easily abused to do enormous harm."