Mallory

UK Considers Social Media Ban and Stronger Age Assurance for Children

Tags: under-16 social media restrictions, social media ban, digital age of consent, UK government, Children’s Wellbeing and Schools Bill, under-16s, phone curfews, age-based advice, age assurance, screen-time guidance, account deactivations, school enforcement, Australia, chief medical officers
Updated January 22, 2026 at 07:00 PM · 3 sources
The UK government said it is considering restricting or banning social media access for children, with Prime Minister Keir Starmer stating that “no option is off the table.” Reported options include improving age assurance technology, raising the digital age of consent, imposing phone curfews, and limiting platform design practices associated with compulsive use (including “infinite scrolling”), alongside publishing evidence-based screen-time guidance for parents and tightening school enforcement around phone use.

In Parliament, momentum is being driven in part by a proposed amendment to the Children’s Wellbeing and Schools Bill that would require regulated user-to-user services to deploy “highly-effective” age assurance measures to prevent under-16s from becoming users, and would also task the UK’s chief medical officers with publishing age-based advice on children’s social media use. UK officials also indicated they plan to engage with Australia to learn from its under-16 social media restrictions, which Australian authorities said led to millions of account deactivations shortly after implementation.

Related Stories

Regulatory Push to Strengthen Child Online Safety and Age Assurance on Social Platforms

UK regulators **Ofcom** and the **Information Commissioner’s Office (ICO)** issued warnings to major social media and video platforms (including **Facebook, Instagram, Snapchat, TikTok, and YouTube**) demanding “urgent steps” to implement more robust **age assurance** controls to prevent access by children under 13. Regulators signaled potential enforcement if platforms continue relying primarily on easily bypassed self-declared ages, arguing this enables unlawful collection and use of children’s data and exposes under-13s to services not designed for them; Ofcom requested companies report back on their plans by the end of April. In the US, the House Energy and Commerce Committee advanced the **Kids Internet and Digital Safety (KIDS) Act** on a party-line vote amid Democratic objections that the bill could reduce platform accountability for harms to minors. Criticisms focused on provisions that could **preempt certain state laws**, a **knowledge requirement** that opponents argue may let companies claim ignorance of minors’ presence, and the lack of a proactive **“duty of care”** requirement; proposed amendments to strengthen protections were not adopted.

5 days ago
European Governments Move to Restrict Social Media Use by Minors

The **European Commission** issued preliminary findings that **TikTok’s product design**—including *infinite scroll*, *autoplay*, *push notifications*, and *personalized recommendations*—may breach the EU **Digital Services Act (DSA)** by failing to adequately assess and mitigate risks to users’ physical and mental well-being, particularly for **minors and vulnerable users**. If confirmed, the Commission said the violations could result in penalties of up to **6% of TikTok’s global annual turnover**, and it signaled expected design changes such as **screen-time breaks**, adjustments to recommendation systems, and disabling or reducing features deemed to drive compulsive use. Separately, **Spain** announced plans to **ban social media access for children under 16** and require **age verification** by platforms, aligning with a broader European trend toward statutory restrictions on minors’ social media use. The announcement follows similar initiatives across Europe, including Australia’s under-16 restriction (cited as precedent), the Netherlands’ push to bar under-15s, French legislation targeting under-14s, and the UK studying a ban for children 15 and under—indicating accelerating regulatory pressure on platforms to implement enforceable child-safety and access controls.

1 month ago

Government Pushes for Age Verification and Content Controls on Digital Platforms

The UK government is urging major technology companies such as Apple and Google to implement nudity-blocking systems on mobile devices, aiming to protect minors by requiring adult users to verify their age before accessing or sharing explicit images. This initiative, which currently stops short of a legal mandate, would leverage nudity-detection algorithms at the operating-system level and could be expanded to desktop platforms in the future. The proposed measures would also require child sex offenders to keep such blockers enabled, reflecting a broader governmental effort to enforce age-appropriate content controls across digital ecosystems.

Simultaneously, experts are raising concerns about the effectiveness of current age verification technologies, particularly those relying on consumer-grade cameras and AI-powered facial recognition. Research highlights that these systems may provide a false sense of security, as they are susceptible to spoofing and may not reliably authenticate minors. The debate underscores the technical and policy challenges in balancing child safety, privacy, and the practical limitations of available authentication methods on popular platforms like Roblox and other social media or gaming services.

2 months ago
