Regulatory Push for Online Age Verification and Adult-Site Access Restrictions
A growing regulatory push to require online age verification—particularly for access to pornography and other age-restricted content—is accelerating in the U.S. and U.K., with policymakers framing it as a child-safety measure and critics warning of privacy and free-speech risks. An FTC commissioner publicly endorsed age verification as a tool to protect children online, pointing to widespread state-level adoption in the U.S. and noting that court outcomes have been mixed, including a U.S. Supreme Court decision upholding a Texas law requiring pornography sites to verify users’ ages.
In the U.K., the Online Safety Act (OSA) is driving direct service changes: Aylo (parent company of Pornhub and other tube sites) said it will restrict access in the United Kingdom rather than implement the OSA’s age-checking approach for all visitors, while allowing continued access for users who have already verified their identity. Aylo argued the framework diverts traffic to unregulated sites and creates privacy risks, while Ofcom countered that services can either implement compliant age checks or block U.K. access and urged development of effective device-level solutions.
Related Stories

Regulatory Push to Strengthen Child Online Safety and Age Assurance on Social Platforms
UK regulators **Ofcom** and the **Information Commissioner’s Office (ICO)** issued warnings to major social media and video platforms (including **Facebook, Instagram, Snapchat, TikTok, and YouTube**) demanding “urgent steps” to implement more robust **age assurance** controls to prevent access by children under 13. Regulators signaled potential enforcement if platforms continue relying primarily on easily bypassed self-declared ages, arguing this enables unlawful collection and use of children’s data and exposes under-13s to services not designed for them; Ofcom requested companies report back on their plans by the end of April. In the US, the House Energy and Commerce Committee advanced the **Kids Internet and Digital Safety (KIDS) Act** on a party-line vote amid Democratic objections that the bill could reduce platform accountability for harms to minors. Criticisms focused on provisions that could **preempt certain state laws**, a **knowledge requirement** that opponents argue may let companies claim ignorance of minors’ presence, and the lack of a proactive **“duty of care”** requirement; proposed amendments to strengthen protections were not adopted.
5 days ago
Government Pushes for Age Verification and Content Controls on Digital Platforms
The UK government is urging major technology companies such as Apple and Google to implement nudity-blocking systems on mobile devices, aiming to protect minors by requiring adult users to verify their age before accessing or sharing explicit images. This initiative, which currently stops short of a legal mandate, would leverage nudity-detection algorithms at the operating-system level and could later be extended to desktop platforms. The proposed measures would also require convicted child sex offenders to keep such blockers enabled, reflecting a broader governmental effort to enforce age-appropriate content controls across digital ecosystems. Simultaneously, experts are raising concerns about the effectiveness of current age verification technologies, particularly those relying on consumer-grade cameras and AI-powered facial age estimation. Research highlights that these systems may provide a false sense of security, as they are susceptible to spoofing and may not reliably distinguish minors from adults. The debate underscores the technical and policy challenges of balancing child safety, privacy, and the practical limitations of available age-assurance methods on popular platforms such as Roblox and other social media or gaming services.
2 months ago
UK Regulators Fine Online Platforms for Failing to Implement Effective Age Assurance
UK regulators issued major penalties against online services for inadequate **age assurance** controls intended to protect children. The Information Commissioner’s Office (**ICO**) fined **Reddit £14.47 million** for unlawfully processing children’s data, alleging that despite a stated under-13 prohibition, Reddit did not introduce an age assurance mechanism until **July 2025** and had not completed a required **data protection impact assessment (DPIA)** before **January 2025**. The ICO said these failures potentially exposed minors to inappropriate content and left under-13 users’ personal data collected and used without a lawful basis; Reddit said it intends to appeal. Separately, communications regulator **Ofcom** fined porn operator **8579 LLC £1.35 million** under the UK **Online Safety Act** for failing to deploy “highly effective” age checks (e.g., photo ID matching or credit card checks) to prevent minors from accessing adult content. Ofcom also imposed an additional **£50,000** penalty for allegedly ignoring information requests and warned of an ongoing **£1,000/day** penalty until compliant age verification is implemented, amid broader concerns from civil liberties groups about the privacy and cybersecurity risks of stringent age-verification regimes.
2 weeks ago