
Under-16 social media bans to spread as nations set new age limits

Updated 2025.12.11 21:55 GMT+8
CGTN

A teenager poses holding a mobile phone displaying a message from TikTok as a law banning social media for users under 16 in Australia takes effect, in Sydney, Australia. /Hollie Adams/Reuters

Australia has rolled out the world's first nationwide ban on social media for children under 16, and other countries are monitoring the results closely with a view to replicating the move.

Effective from December 10, the law forces platforms including TikTok, Instagram, YouTube, Facebook, Snapchat, Reddit, X, Twitch, Kick and Threads to delete underage accounts and block new ones. Non-compliance carries fines of up to A$49.5 million ($32.8 million). 

Age verification will involve facial scans, ID uploads or activity inference, though privacy concerns and teen workarounds, such as VPNs, are already surfacing.

Reactions in Australia have been mixed. Parents, child advocates and Communications Minister Michelle Rowland hail it as a stand for families over tech giants. Meta has criticised the ban as an ineffective overreach that ignores better platform design, while Amnesty Tech warns it could drive children to unregulated spaces.

Prime Minister Anthony Albanese acknowledged rollout challenges but insisted it would "ultimately save lives."

Australia's step is influencing others. UNICEF cautions that age bans alone won't suffice, urging redesigns for safer platforms and digital literacy for parents. Many countries already limit access through consent requirements, time caps or content blocks. Platforms such as TikTok and Meta's apps self-impose a minimum age of 13, but European data show widespread under-13 accounts.

What happens in other countries?

China maintains the strictest system. Foreign apps are blocked outright. Domestic apps like Douyin cap under-14s at 40 minutes of daily use with a 22:00–06:00 blackout; 14- to 17-year-olds get two hours. All devices must feature a "minor mode" for age-based restrictions, including gaming limits for under-18s.

In the European Union, the Digital Services Act sets a baseline age of 13, but member states vary. A November 2025 resolution urges a harmonised minimum age of 16, with a limit of 13 for video sharing and AI tools.

Denmark plans an under-15 ban with parental opt-ins for 13- and 14-year-olds. 

France's 2023 law mandates parental consent for under-15s but faces enforcement delays; a September 2025 inquiry pushed for an under-15 ban and a "digital curfew" for 15- to 18-year-olds, plus device bans for under-11s.

Germany requires consent for 13- to 16-year-olds, though checks are lax; 77 percent of adults favour an under-16 ban.

Spain's draft law raises the age to 16 with parental opt-ins; Italy requires consent for under-14s; and Norway is aiming for a ban on under-15s, with an absolute age limit in the works.

The United Kingdom does not impose a blanket minimum age for social media, but its Online Safety Act, passed in 2023 and fully enforced from 2025, requires platforms to use age verification to protect children from harmful content. The law covers everything from self-harm material to cyberbullying and pornography. In December 2025, the regulator Ofcom issued its first major penalty under the act, fining an adult site £13 million for failing to block underage users.

Despite these measures, young children remain active online. A 2025 study by the children's charity NSPCC found that one in ten three-year-olds in the UK already has a social media profile, often set up by parents or older siblings.

In the United States, a federal law, the Children's Online Privacy Protection Act, prohibits platforms from collecting personal data from children under 13 without parental consent. This rule is the reason most global services, including TikTok and Instagram, set 13 as their official minimum age.

States such as California require broader consent for minors, while the federal Kids Off Social Media Act, reintroduced in January 2025, seeks an under-16 ban but faces free-speech challenges. The Kids Online Safety Act pushes platforms to curb addiction and harm through a "duty of care."

Elsewhere, Malaysia will ban under-16s from 2026, requiring eKYC verification. New Zealand's May 2025 bill mirrors Australia's, with fines of up to NZ$2 million ($1.2 million). Proposals are advancing in Ireland, Singapore and Kenya, where guidelines demand verification to block harms. In India, a 2025 petition for an under-16 ban awaits government action.

The trend, accelerated by 2025's wave of legislation, stems from evidence linking social media to anxiety, cyberbullying and sleep issues in youth. As Australia's rollout unfolds with an estimated one million users affected, governments worldwide are treating it as a test case for balancing safety, privacy and access.
