Yesterday, the Federal Trade Commission (“FTC”) and the New York Attorney General’s office (“NYAG”) announced a settlement with Google LLC and its subsidiary YouTube, LLC resolving allegations that the companies violated the Children’s Online Privacy Protection Act and its implementing rule (together, “COPPA”).  The settlement requires Google and YouTube to pay $136 million to the FTC and $34 million to the NYAG, for a total penalty almost 30 times higher than the largest COPPA penalty previously imposed.

Overview of the Complaint and Order

The joint FTC-NYAG complaint alleged that Google and YouTube collected personal information online from children under 13 and used that information to deliver online behavioral advertising, without first providing notice or obtaining verifiable parental consent as required by COPPA.  More specifically, the complaint alleged that Google and YouTube had actual knowledge that certain YouTube channels were child-directed but nevertheless collected persistent identifiers, in the form of cookies and advertising identifiers, to serve behavioral advertising to viewers of those channels.

In addition to requiring the $170 million total civil penalty and enjoining future COPPA violations, the settlement order requires “fencing-in” relief—relief in the form of injunctive provisions that go beyond what existing law requires.  Under the order, Google and YouTube must establish a system on YouTube through which channel owners self-designate whether the content they upload is child-directed.  For videos designated as child-directed, YouTube will not collect persistent identifiers for behavioral advertising.  The order further requires that Google and YouTube implement a training program for employees about the system and about COPPA’s requirements overall.  Finally, it imposes compliance reporting and recordkeeping requirements.
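For readers interested in the mechanics, the sketch below illustrates how a self-designation flag could gate the collection of persistent identifiers.  It is purely illustrative: the data model, the flag name, and the ad-serving function are assumptions made for this sketch, not YouTube’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Channel:
    """Hypothetical channel record; field names are illustrative only."""
    channel_id: str
    made_for_kids: bool  # channel owner's self-designation under the order

def collect_for_ad_serving(channel: Channel, cookies: dict) -> dict:
    """Sketch of the gating logic the order contemplates.

    If the owner has designated the channel child-directed, persistent
    identifiers are not collected, so only contextual ads can be served.
    """
    if channel.made_for_kids:
        # Child-directed path: drop persistent identifiers entirely.
        return {"ad_type": "contextual", "persistent_identifier": None}
    # General-audience path: behavioral ads may use persistent identifiers.
    return {"ad_type": "behavioral", "persistent_identifier": cookies.get("ad_id")}

# Example: a self-designated channel yields no identifier collection.
kids_channel = Channel(channel_id="UC_example", made_for_kids=True)
assert collect_for_ad_serving(kids_channel, {"ad_id": "abc-123"}) == {
    "ad_type": "contextual",
    "persistent_identifier": None,
}
```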

The settlement is notable both for what it does—and doesn’t—establish:

What the Settlement Does

Reaffirms the Actual Knowledge Standard.  The allegations against Google and YouTube were premised on the companies’ actual knowledge of specific child-directed content on the YouTube platform.  Under the COPPA statute, sites and services directed to a general audience are not subject to COPPA’s requirements unless and until they gain “actual knowledge” that personal information is collected online from children.  In 2013, the FTC also interpreted the statutory language to expand COPPA’s scope to cover operators of general audience sites and services that have actual knowledge that they collect personal information through other child-directed sites and services.  Here, the FTC alleged that Google and YouTube had actual knowledge based on:

  • Google and YouTube’s direct communications with companies that uploaded content to YouTube and specifically indicated to Google or YouTube that the content was directed to children
  • Content ratings that Google assigned to specific content on YouTube designating certain content as “generally intended for children ages 0-7”
  • Google and YouTube’s curation of specific YouTube content for its separate YouTube Kids app

The FTC also expressed concern that Google and YouTube marketed YouTube to advertisers as a top destination for kids.  For example, in a presentation to certain toy brands, Google and YouTube stated that “YouTube is today’s leader in reaching children age 6-11 against top TV channels”; that “YouTube was unanimously voted as the favorite website for kids 2-12”; that YouTube is “[t]he new ‘Saturday Morning Cartoons’”; and that YouTube was the “#1 website regularly visited by kids.”  The complaint noted that these statements contradicted representations that Google and YouTube were separately making to content providers that there were no users under 13 on YouTube.

Significantly, a separate statement by Chairman Simons and Commissioner Wilson, as well as statements by Andrew Smith, Director of the Bureau of Consumer Protection, during the press briefing announcing the settlement, emphasized that the FTC would bear the burden of establishing in court that Google and YouTube had actual knowledge of the child-directed status of each channel.  This reliance on the actual knowledge standard is consistent with the text of the COPPA statute, which explicitly requires actual knowledge, and with long-standing FTC precedent rejecting lower standards, such as constructive knowledge (including the AI-based predictive tool suggested by Commissioner Slaughter) or a “reason to know” standard, as inappropriate and unworkable.

Puts Pressure On Children’s Content Companies To Put Platforms on Notice.  The complaint alleges that individual channels on the YouTube platform are “websites or online services” under COPPA and that, accordingly, content companies that post child-directed content on YouTube are on notice that the FTC will consider them to be standalone “operators” under COPPA, subject to strict liability for COPPA violations involving data collected from children through those channels.  The FTC warned that it will be conducting a “sweep” of child-directed content on platforms following implementation of the order’s provisions.  This language is likely to motivate children’s content companies to notify their platform partners that their content is child-directed and to inquire further about COPPA compliance.

What the Settlement Doesn’t Do

No Legal Obligation for General Audience Platforms To Investigate.  The Simons/Wilson separate statement explicitly provides that, while Google will be required under the order’s fencing-in provisions to create a system for content providers to self-designate whether the content they post online is child-directed, this step goes beyond COPPA’s legal requirements, and no other platform or adtech company is bound by the order.  The case reflects the FTC’s long-held view that platforms and general audience services have no legal obligation to investigate whether third-party content on their platforms is directed to children.

No Requirement to Algorithmically Predict Whether Content Is Child-Directed.  While the order requires Google and YouTube to implement a mechanism for channel owners to identify child-directed content, it stops short of requiring YouTube to implement an algorithmic tool to predict whether content may be child-directed and to tag such content itself for child-directed treatment.  Commissioner Slaughter advocated for such a technological backstop as a way of policing channels that may mis-designate their content.  However, as explained in the Simons/Wilson statement and by Director Smith during the press conference, the FTC refrained from imposing such a requirement out of concern that the efficacy of an algorithm could be difficult to enforce, that the use of an algorithm could serve as a shield against enforcement, and that prescribing an algorithm that would keep pace with evolving technology could prove difficult.  In addition (and as noted above), such an approach would impose a constructive knowledge standard on platforms contrary to the text of the COPPA statute, because it would require these algorithms to sift through circumstantial evidence to assess a channel’s audience and could at best provide a prediction of child-directed status.
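To make the prediction-versus-knowledge distinction concrete, the sketch below shows the general shape of such a backstop: a model weighing circumstantial signals and emitting a score.  The signal names, weights, and threshold are all invented for illustration; the point is that any such tool outputs only a probability of child-directed status, never the actual knowledge the statute requires.

```python
# Hypothetical circumstantial signals and weights, invented for illustration.
CHILD_DIRECTED_SIGNALS = {
    "animated_characters": 0.25,
    "nursery_rhymes": 0.5,
    "toy_unboxing": 0.25,
    "simple_vocabulary": 0.125,
}

def predict_child_directed(features: set, threshold: float = 0.5) -> tuple:
    """Return (prediction, score) for a channel's child-directed status.

    The score is inferred from circumstantial evidence, so the output is
    a prediction, not actual knowledge of the channel's audience.
    """
    score = min(sum(w for s, w in CHILD_DIRECTED_SIGNALS.items() if s in features), 1.0)
    return score >= threshold, score

# Example: even strong signals yield only a probability, not certainty.
print(predict_child_directed({"animated_characters", "nursery_rhymes"}))
# -> (True, 0.75)
```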

 

Lindsey Tonsager

Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.