New features restrict under-16 accounts by default, enhancing safety on Meta platforms.
Meta Platforms Inc. has announced an expansion of its Teen Accounts system on Instagram and is extending similar features to Facebook and Facebook Messenger.
The initiative aims to strengthen parental controls and safety measures for users under the age of 16. As of April 2025, under-16 accounts will have heavily restricted settings by default, and changing certain settings will require parental permission.
Among the new restrictions, Instagram users under the age of 16 will not be permitted to use the live streaming feature without explicit parental consent.
Additionally, these users will need parental approval to disable an automatic feature that blurs images flagged as potentially containing nudity in direct messages.
Since the introduction of Teen Accounts in September 2024, approximately 54 million teenagers globally have been transitioned to this account type.
By default, Teen Accounts are set to private, which limits who can message the account holder and places users in the most restrictive tier of Meta's sensitive content settings.
Meta states that these updates reflect its commitment to creating a safer environment for younger users by addressing parental concerns about online safety.
Meta’s announcement coincides with increasing global regulatory scrutiny of social media platforms, particularly concerning the safety of children online.
In the United Kingdom, the Online Safety Act is being implemented, requiring major tech companies to protect users, especially minors, from illegal and harmful content.
Furthermore, the company has faced criticism for its recent policy changes regarding content moderation.
In January 2025, CEO Mark Zuckerberg announced a shift away from third-party fact-checking in favor of user-generated community notes, which he argued would enhance free expression.
However, experts and child safety advocates have expressed concerns that this approach may expose children to more harmful content, emphasizing the delicate balance between fostering open communication and ensuring user safety on social media platforms.
The rollout of these enhanced parental controls aims to give parents greater oversight of their children's online interactions, aligning with Meta’s efforts to respond to a growing demand for increased safety measures in digital environments.