UK Government Seeks Smartphone Age Verification to Block Explicit Content for Children
London urges Apple and Google to build nudity-detection and age-check tools into devices to protect minors from harmful imagery
The United Kingdom government is advancing plans to encourage major technology companies to embed age-verification systems and nudity-detection software directly into smartphone operating systems, so that users would have to confirm their age before capturing, sharing or viewing explicit imagery.
Officials say the measure is part of a broader initiative to safeguard children from pornography, exploitation and online harm, and will complement existing age-verification requirements under the Online Safety Act.
Under the proposal, Apple’s iOS and Google’s Android platforms could incorporate algorithms capable of detecting nude or sexually explicit images at the system level and block access unless the user verifies they are an adult — potentially through biometric checks or official identity documents.
Ministers considered making such software mandatory for all devices sold in the UK but have opted against doing so at this stage, choosing instead to “encourage” voluntary adoption by platform holders. The measure forms part of the Home Office strategy to tackle violence against women and girls and to reduce children’s exposure to harmful content.
This approach mirrors similar efforts in Australia, where tech firms have been urged to build nudity-filtering and reporting tools into devices.
The plan reflects the government’s ongoing efforts to enforce age checks online following the implementation earlier this year of the Online Safety Act, which obliges websites and services with UK users to verify users’ ages in order to prevent minors from accessing harmful material such as pornography and self-harm content.
Recent enforcement by the UK regulator Ofcom has already led platforms including Reddit, X, Bluesky and others to introduce age-verification systems that may involve identity documents or facial scanning technology.
Despite these moves, officials and child safety advocates argue that current content controls are not comprehensive enough without integration at the device level.
However, the proposal has sparked debate about privacy, civil liberties and technological feasibility.
Critics caution that system-level scanning could raise concerns over data protection and user autonomy, particularly if age verification requires sensitive personal information.
There are also questions about the accuracy of automated nudity detection and the potential for false positives.
Technology companies have not formally responded to the latest UK proposals. In past discussions in other jurisdictions, such as the United States, firms including Apple have resisted compulsory age-verification mandates on the grounds of user privacy and device security.
Government sources indicate that while the age-check initiative is not yet compulsory, it forms part of a wider online safety strategy that includes monitoring emerging technologies and collaborating with industry to enhance children’s protection.
The debate is unfolding against a backdrop of broader age verification discussions in Europe, where policymakers are also examining social media age limits and digital safeguards for minors.