TikTok acts on teen safety with ‘bedtime’ block on app alerts

Social networking firm introduces range of child safety measures including increased privacy controls
TikTok will prevent teenagers from receiving notifications past their bedtime, the company said, announcing a range of child safety improvements that will arrive just before the UK introduces its age appropriate design code next month.

The company will no longer send push notifications after 9pm to users aged between 13 and 15. For 16- and 17-year-olds, notifications will not be sent after 10pm.

“We want to help our younger teens in particular develop positive digital habits early on,” said Alexandra Evans, the company’s head of child safety public policy.

The changes may draw comparisons with the restrictions introduced by the tech company Tencent, which last month began blocking Chinese children from playing a number of its hit games after 10pm.

But TikTok has not followed Tencent’s controversial decision to use facial analysis technology designed to ensure children do not pretend to be adults while using the app.

TikTok’s bedtime enforcement is accompanied by other changes applying to younger users. People aged 16 and 17 will now have direct messages disabled by default, while those under 16 will continue to have no access to them at all. And all users under 16 will now be prompted to choose who can see their videos the first time they post them, ensuring they do not accidentally broadcast to a wider audience than intended.

The NSPCC welcomed the changes. Andy Burrows, the charity’s head of child safety online policy, said: “These increased privacy measures will give children more control over who can contact them and view their content, reducing opportunities for offenders to groom them.

“TikTok continue to show industry leadership when it comes to protecting children and we urge those tech firms who have been slow to catch up to be similarly proactive.

“However, the raft of safety announcements we have seen in recent weeks have been driven by the age appropriate design code coming into force next month and shows the positive impact regulation has on children’s safety.”

The code, introduced by the UK’s information commissioner, lays out principles that UK sites and services are expected to follow if they cater for children.

Last month, Instagram introduced updates to align with the code, blocking adults from interacting with children if they show “potentially suspicious behaviour”, making under-16s’ accounts private by default, and restricting how advertisers can target teenagers.

On Tuesday, Google followed suit, making videos by young people uploaded to YouTube private by default, adding break and bedtime reminders, and turning off autoplay features for children.