Concerns Raised Over Online Safety Act's Protection of Children Against Sexual Abuse
Charities warn that existing loopholes may leave children vulnerable on encrypted messaging platforms.
A coalition of charities has formally expressed grave concerns about the Online Safety Act, warning that its provisions may fail to adequately protect children from online sexual abuse on private messaging platforms.
In a joint letter addressed to Home Secretary Yvette Cooper and Technology Secretary Peter Kyle, the organizations have highlighted what they describe as an "unacceptable loophole" in legislation governing encrypted messaging services.
The letter draws attention to the current wording of the regulator Ofcom's codes of practice, which stipulates that platforms must take down illegal content where it is "technically feasible." The charities argue that encrypted messaging services such as WhatsApp could use this phrasing to evade their responsibilities and leave illegal content in place, putting children at risk.
End-to-end encryption prevents anyone other than the sender and recipient of a message, including the service provider itself, from accessing its contents.
The charities stress that this limitation could mean child sexual exploitation and abuse (CSEA) material goes unreported and unremoved, allowing harmful content to remain in circulation and continue to pose significant risks to children.
In their correspondence, the charities articulated their concerns: "It is important to be clear about the implications of this decision.
Whether it is a child being sent a nude image of themselves as a form of sexual extortion, or an adult being exposed to child sexual abuse material in a group chat, on some sites users will not be able to confidently report and have this content removed."
The signatories of the letter include notable organizations such as the NSPCC (National Society for the Prevention of Cruelty to Children), Barnardo’s, the Marie Collins Foundation, the Lucy Faithfull Foundation, and the Centre of expertise on child sexual abuse.
The letter forms part of a broader debate over the balance between safety measures and privacy rights in the digital realm, particularly as the UK government has reportedly sought access to encrypted files hosted by technology companies such as Apple.
Law enforcement and security agencies around the world have argued that encryption hinders their ability to combat crime effectively, particularly child exploitation and terrorism.
Conversely, technology firms maintain that user privacy is essential, cautioning that backdoors for government access could be exploited by malicious entities.
The charities urge the government to clarify its commitments to child safety and ensure that private messaging platforms do not become venues for severe online abuse.
This comes in light of recent Home Office statistics revealing over 38,000 recorded crimes related to child sexual abuse images in England and Wales within the last year, translating to more than 100 incidents per day on average.
Furthermore, the Internet Watch Foundation (IWF), an organization dedicated to locating and removing child sexual abuse material online, has voiced similar concerns about the codes of practice.
The IWF criticized the current wording as an insufficient safeguard against companies evading their obligations under online safety law.
In response to the criticisms, Ofcom said the law requires measures in its codes of practice to be technically feasible, and emphasized that it expects most platforms to be able to remove such content, committing to hold to account those that fail to comply.
The regulator added that platforms will be required to take specific steps to protect children, including reviewing reported child sexual abuse material and notifying law enforcement.