Regulatory Push to Strengthen Child Online Safety and Age Assurance on Social Platforms
UK regulators Ofcom and the Information Commissioner’s Office (ICO) issued warnings to major social media and video platforms (including Facebook, Instagram, Snapchat, TikTok, and YouTube) demanding “urgent steps” to implement more robust age assurance controls to prevent access by children under 13. The regulators signaled potential enforcement if platforms continue to rely primarily on easily bypassed self-declared ages, arguing that this practice enables unlawful collection and use of children’s data and exposes under-13s to services not designed for them. Ofcom asked the companies to report back on their plans by the end of April.
In the US, the House Energy and Commerce Committee advanced the Kids Internet and Digital Safety (KIDS) Act on a party-line vote, over Democratic objections that the bill could reduce platform accountability for harms to minors. Criticisms focused on provisions that could preempt certain state laws, a knowledge requirement that opponents argue may let companies claim ignorance of minors’ presence on their services, and the absence of a proactive “duty of care”; proposed amendments to strengthen protections were not adopted.
Related Stories

Debate Over Kids Online Safety Act and Age-Verification Requirements for Minors
Policymakers in multiple jurisdictions are advancing **child online safety** rules that would restrict minors’ access to social media, “addictive” product features, and certain content (including pornography), increasing pressure on platforms to implement **age assurance/age verification** to determine users’ ages before allowing access. The Lawfare analysis highlights that while protecting children online is a widely shared goal, enforcing age-based restrictions at scale effectively requires collecting and validating age signals for *all* users—raising significant implementation, privacy, and governance challenges as governments consider measures such as the **Kids Online Safety Act (KOSA)**, the **Kids Off Social Media Act**, and the **App Store Accountability Act**.
1 month ago
UK Considers Social Media Ban and Stronger Age Assurance for Children
The UK government said it is considering restricting or banning social media access for children, with Prime Minister **Keir Starmer** stating that “no option is off the table.” Reported options include improving **age assurance** technology, raising the digital age of consent, imposing phone curfews, and limiting platform design practices associated with compulsive use (including “infinite scrolling”), alongside publishing evidence-based screen-time guidance for parents and tightening school enforcement around phone use. In Parliament, momentum is being driven in part by a proposed amendment to the **Children’s Wellbeing and Schools Bill** that would require regulated user-to-user services to deploy “highly-effective” age assurance measures to prevent under-16s from becoming users, and would also task the UK’s chief medical officers with publishing age-based advice on children’s social media use. UK officials also indicated they plan to engage with Australia to learn from its under-16 social media restrictions, which Australian authorities said led to millions of account deactivations shortly after implementation.
1 month ago
Regulatory Push for Online Age Verification and Adult-Site Access Restrictions
A growing regulatory push to require **online age verification**—particularly for access to pornography and other age-restricted content—is accelerating in the U.S. and U.K., with policymakers framing it as a child-safety measure and critics warning of privacy and free-speech risks. An **FTC** commissioner publicly endorsed age verification as a tool to protect children online, pointing to widespread state-level adoption in the U.S. and noting that court outcomes have been mixed, including a **U.S. Supreme Court** decision upholding a Texas law requiring pornography sites to verify users’ ages. In the U.K., the **Online Safety Act (OSA)** is driving direct service changes: **Aylo** (parent company of Pornhub and other tube sites) said it will **restrict access in the United Kingdom** rather than implement the OSA’s age-checking approach for all visitors, while allowing continued access for users who have already verified their identity. Aylo argued the framework diverts traffic to unregulated sites and creates privacy risks, while **Ofcom** countered that services can either implement compliant age checks or block U.K. access and urged development of effective device-level solutions.
1 month ago