New features will allow parents to manage teens' app usage during specific times, addressing mounting pressure for social media regulation.
TikTok is expanding its parental controls with new tools designed to give parents closer oversight of their teenagers' activity on the app.
These updates will enable parents to restrict their teenagers' access to the app at specific times, such as family meals, school hours, nighttime, or holidays. Previously, TikTok and similar social media platforms offered basic daily screen-time allowances but did not allow restrictions to be scheduled for particular periods within the day.
The initiative comes amid growing concern among parents about the impact of social media on young people's mental health and well-being. Child advocacy groups and regulators have been urging stricter laws governing social media use, stressing the need for platforms to implement more robust safety features.
In addition to usage management, TikTok is introducing features that will give parents insight into their children's social interactions on the platform.
Parents will be able to see the accounts their child follows, who is following them, and which accounts have been blocked.
Furthermore, the platform is introducing a meditation tool, dubbed Wind Down, for users under the age of 16. It will appear in their feeds after 10 PM as a prompt to step away from the screen and encourage healthy sleep habits.
The Wind Down tool is designed to activate automatically when users are still on TikTok past that hour, interrupting their usual feed with calming music and visuals.
Should the user continue to engage with the content, a more prominent notification will appear, asking them to reconsider their usage at that hour.
Valiant Richey, TikTok's global head of outreach and partnerships, trust and safety, noted that many teens in testing regions have chosen to keep these reminders active.
Parental control features have become increasingly common among social media platforms in recent years, yet research indicates that the uptake of these controls remains low.
TikTok has not disclosed what percentage of its users take advantage of the Family Pairing features.
Critics have raised concerns that responsibility for children's safety is placed predominantly on parents rather than on the platforms themselves. Dame Melanie Dawes, Chief Executive of Ofcom, the UK's communications regulator, cautioned against relying solely on parental controls, emphasizing that social media companies must also fulfill their own responsibilities.
Ofcom's research also shows that significant numbers of children under the minimum age of 13 are accessing such platforms.
At a TikTok event, child psychologist Kirren Schnack suggested that 30 minutes of daily use is a suitable limit for 13-year-olds, depending on their other daily commitments.
Andy Burrows, Chief Executive of the Molly Rose Foundation, expressed cautious optimism about stronger parental controls but voiced concern about the type of content minors are exposed to on the platform, arguing that regulatory measures need to be more effective and enforceable.
Recent reports indicate that some young users have encountered harmful content related to self-harm, eating disorders, and suicidal ideation on TikTok, underscoring the urgency for enhanced safety protocols.
A November 2023 analysis by the Molly Rose Foundation found that nearly half of the content reviewed across popular suicide and self-harm hashtags on TikTok and Instagram was potentially harmful, raising questions about the adequacy of current measures to protect young users.