Mallory

Disney Fined for COPPA Violations on YouTube

Tags: COPPA · Disney · children's privacy · FTC · YouTube · content providers · advertising practices · content creators · parental consent · Federal Trade Commission · online safety · privacy · Department of Justice · video labeling · targeted advertisements
Updated January 6, 2026 at 02:07 PM · 2 sources
Disney has agreed to pay a $10 million settlement following allegations that it violated the Children’s Online Privacy Protection Act (COPPA) by failing to properly label thousands of its YouTube videos as directed at children. This mislabeling allowed Disney and its partners to collect personal data from children under 13 and serve them targeted advertisements without obtaining parental consent, actions that COPPA explicitly prohibits. The Federal Trade Commission (FTC) initially investigated the case before referring it to the Department of Justice (DOJ), which announced the settlement and emphasized the importance of protecting children’s privacy online.

The settlement highlights the ongoing regulatory scrutiny of large content providers on platforms like YouTube, especially regarding compliance with child privacy laws. YouTube had previously updated its policies to require content creators to label videos as "made for kids" or not, following its own record $170 million COPPA settlement in 2019. The Disney case underscores the legal and financial risks for companies that fail to adhere to these requirements, reinforcing the government’s commitment to enforcing parental rights and safeguarding children’s data online.

Related Stories

Disney Settlement Over California Consumer Privacy Act Opt-Out Failures

Disney agreed to pay **$2.75 million** to settle allegations by the California Attorney General that it violated the **California Consumer Privacy Act (CCPA)** by making it difficult for consumers to opt out of the sale/sharing of their personal data. California alleged Disney’s opt-out mechanisms contained gaps that prevented users—including those logged into their accounts—from fully stopping data sharing across Disney’s services, devices, and platforms, and that data continued to be shared with **third-party ad-tech companies** whose code was embedded in Disney websites and apps. The settlement (pending court approval) requires Disney to implement a more comprehensive privacy program and provide California officials a **compliance update within 60 days** describing changes made to align with CCPA requirements. State officials characterized the penalty as the **largest fine to date under the CCPA**; Disney did not admit liability as part of the agreement and said it continues to invest in privacy protections across its streaming services.

1 month ago

Florida Lawsuit Against Roku for Alleged Sale of Children's Personal Data

Florida Attorney General James Uthmeier has initiated legal action against Roku, a leading smart TV company, alleging that the firm collected and sold sensitive personal data belonging to children without proper notice or parental consent. The lawsuit, filed in Collier County Circuit Court, accuses Roku of violating Florida's Deceptive and Unfair Trade Practices Act and the state's Digital Bill of Rights. According to the complaint, Roku gathered a range of sensitive information, including children's online activity, viewing histories, location data, and voice recordings. The data was allegedly sold to third-party data brokers such as Kochava, which is itself under federal scrutiny for its handling of geolocation data, as well as to advertisers.

The Attorney General's office claims that Roku failed to implement industry-standard user profiles or age verification mechanisms, despite knowing that a significant portion of its users are children. The complaint highlights that Roku's technology is present in about half of American households, reaching approximately 145 million people as of 2024, amplifying the potential scale of the alleged privacy violations. The state asserts that Roku continued to process and sell children's data even when users signaled their status as children by subscribing to kids-oriented programming and features. Furthermore, the lawsuit alleges that Roku misled consumers about the effectiveness of its privacy controls and opt-out tools, giving users a false sense of security regarding their data privacy. The Attorney General's Office of Parental Rights is seeking civil penalties, injunctive relief, and the implementation of stronger disclosure and parental-control mechanisms. The complaint also notes that four of the five most searched programs on Roku in 2024 were children's entertainment, yet the company did not take adequate steps to determine whether users whose data was being sold were minors.

The legal action underscores growing regulatory scrutiny over the handling of children's data by technology companies. Roku has not yet publicly responded to the allegations. The case brings attention to the broader issue of data privacy for minors in the digital age, especially as connected devices become more prevalent in households. The involvement of data brokers like Kochava, already facing federal action, adds another layer of complexity. The outcome of this lawsuit could set important precedents for how companies collect, process, and monetize children's data, and it highlights the challenges regulators face in enforcing privacy protections in rapidly evolving technology ecosystems. If successful, the lawsuit may prompt other states to pursue similar actions against technology firms. The proceedings will be closely watched by privacy advocates, industry stakeholders, and policymakers concerned with children's online safety.

5 months ago
FTC Policy Statement on COPPA Exemption for Age Verification Data Collection

The **U.S. Federal Trade Commission (FTC)** issued a policy statement clarifying it will **not pursue COPPA enforcement** against websites and online services that collect, use, or share personal data *solely* to perform **age verification**, addressing industry concerns that age-checking could itself trigger COPPA liability. The FTC said the exemption applies only when providers give clear notice to parents/children, limit use of the data to confirming age, avoid retaining the information after verification, and share it only with third parties they are confident will maintain confidentiality; the agency also emphasized the need to **employ reasonable security safeguards** and take reasonable steps to ensure age-verification methods and vendors provide **reasonably accurate** results. The FTC said it plans to **review the COPPA Rule** to further address age verification, following earlier agency remarks framing age verification as an important child-protection tool. Separately, **Discord** announced it is **postponing and modifying** a planned global age-verification policy after user backlash, delaying rollout to the second half of 2026 and adding options beyond government ID or video selfies (e.g., credit card verification), along with commitments to vendor transparency and a forthcoming technical explanation of its “age determination systems”; this reflects broader regulatory pressure for platforms to verify user ages but is distinct from the FTC’s COPPA enforcement posture.

2 weeks ago
