Mallory

Government Pushes for Age Verification and Content Controls on Digital Platforms

age verification, digital content, content controls, privacy concerns, digital ecosystems, social media, policy challenges, age-appropriate, government initiative, nudity-blocking, gaming services, child sex offenders, child safety, minors protection, technology companies
Updated December 18, 2025 at 10:01 AM · 2 sources

The UK government is urging major technology companies such as Apple and Google to implement nudity-blocking systems on mobile devices, aiming to protect minors by requiring adult users to verify their age before accessing or sharing explicit images. This initiative, which currently stops short of a legal mandate, would leverage nudity-detection algorithms at the operating system level and could be expanded to desktop platforms in the future. The proposed measures would also require child sex offenders to keep such blockers enabled, reflecting a broader governmental effort to enforce age-appropriate content controls across digital ecosystems.
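The gating logic described above can be illustrated with a minimal sketch. Everything here is hypothetical: the classifier stub, the threshold, and the function names are invented for illustration and do not reflect any vendor's actual API or the proposal's technical details.

```python
from dataclasses import dataclass

# Hypothetical stand-in for an on-device nudity classifier; a real system
# would run a trained model over image pixels. Here the score is supplied
# directly so the gating logic can be shown in isolation.
@dataclass
class ScanResult:
    nudity_score: float  # 0.0 (benign) .. 1.0 (explicit)

NUDITY_THRESHOLD = 0.8  # illustrative cutoff, not taken from the proposal

def allow_image(scan: ScanResult, age_verified: bool, blocker_locked: bool) -> bool:
    """Decide whether an image may be displayed or shared.

    blocker_locked models the proposal's requirement that some accounts
    (e.g. registered offenders) cannot disable the filter.
    """
    if scan.nudity_score < NUDITY_THRESHOLD:
        return True  # not flagged as explicit
    if blocker_locked:
        return False  # filter is mandatory for this account
    return age_verified  # explicit content only for verified adults

# Usage: the same flagged image is blocked for unverified users,
# allowed for verified adults, and always blocked on locked accounts.
print(allow_image(ScanResult(0.95), age_verified=False, blocker_locked=False))  # False
print(allow_image(ScanResult(0.95), age_verified=True, blocker_locked=False))   # True
print(allow_image(ScanResult(0.95), age_verified=True, blocker_locked=True))    # False
```

The key design point the sketch captures is that detection and enforcement live on the device, below individual apps, so the same policy applies to any app that displays or shares images.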

At the same time, experts are questioning the effectiveness of current age verification technologies, particularly those that rely on consumer-grade cameras and AI-powered facial analysis. Research suggests these systems can provide a false sense of security: they are susceptible to spoofing and may not reliably distinguish minors from adults. The debate underscores the technical and policy challenges of balancing child safety, privacy, and the practical limits of available authentication methods on popular platforms such as Roblox and other social media and gaming services.

Related Stories

Regulatory Push for Online Age Verification and Adult-Site Access Restrictions


A growing regulatory push to require **online age verification**—particularly for access to pornography and other age-restricted content—is accelerating in the U.S. and U.K., with policymakers framing it as a child-safety measure and critics warning of privacy and free-speech risks. An **FTC** commissioner publicly endorsed age verification as a tool to protect children online, pointing to widespread state-level adoption in the U.S. and noting that court outcomes have been mixed, including a **U.S. Supreme Court** decision upholding a Texas law requiring pornography sites to verify users’ ages. In the U.K., the **Online Safety Act (OSA)** is driving direct service changes: **Aylo** (parent company of Pornhub and other tube sites) said it will **restrict access in the United Kingdom** rather than implement the OSA’s age-checking approach for all visitors, while allowing continued access for users who have already verified their identity. Aylo argued the framework diverts traffic to unregulated sites and creates privacy risks, while **Ofcom** countered that services can either implement compliant age checks or block U.K. access and urged development of effective device-level solutions.

1 month ago
Regulatory Push to Strengthen Child Online Safety and Age Assurance on Social Platforms


UK regulators **Ofcom** and the **Information Commissioner’s Office (ICO)** issued warnings to major social media and video platforms (including **Facebook, Instagram, Snapchat, TikTok, and YouTube**) demanding “urgent steps” to implement more robust **age assurance** controls to prevent access by children under 13. Regulators signaled potential enforcement if platforms continue relying primarily on easily bypassed self-declared ages, arguing this enables unlawful collection and use of children’s data and exposes under-13s to services not designed for them; Ofcom requested companies report back on their plans by the end of April. In the US, the House Energy and Commerce Committee advanced the **Kids Internet and Digital Safety (KIDS) Act** on a party-line vote amid Democratic objections that the bill could reduce platform accountability for harms to minors. Criticisms focused on provisions that could **preempt certain state laws**, a **knowledge requirement** that opponents argue may let companies claim ignorance of minors’ presence, and the lack of a proactive **“duty of care”** requirement; proposed amendments to strengthen protections were not adopted.

5 days ago
Debate Over Kids Online Safety Act and Age-Verification Requirements for Minors


Policymakers in multiple jurisdictions are advancing **child online safety** rules that would restrict minors’ access to social media, “addictive” product features, and certain content (including pornography), increasing pressure on platforms to implement **age assurance/age verification** to determine users’ ages before allowing access. The Lawfare analysis highlights that while protecting children online is a widely shared goal, enforcing age-based restrictions at scale effectively requires collecting and validating age signals for *all* users—raising significant implementation, privacy, and governance challenges as governments consider measures such as the **Kids Online Safety Act (KOSA)**, the **Kids Off Social Media Act**, and the **App Store Accountability Act**.

1 month ago
