The FTC recently updated Complying with COPPA: Frequently Asked Questions, the set of FAQs meant to provide informal guidance for complying with the Children’s Online Privacy Protection Act and the Commission-issued COPPA Rule.  In an accompanying blog post, FTC staff emphasized that the revisions to the FAQs “don’t raise new policy issues” and that they were implemented primarily to streamline and reorganize the content “to make the document easier to use.”  While the new FAQs generally only reinforce concepts from recent key settlements, enforcement policy positions, and separately issued regulatory guidance, some of the updates also provide helpful additional context around specific issues such as mixed audience sites and services, age gates, and common consent mechanisms.

In the blog post, the FTC staff also acknowledged that the ongoing “COPPA Rule review continues,” but they did not provide a timeline for completion of the review.

  • “Directed to Children” Factors. In the wake of the record-breaking COPPA settlement against Google/YouTube, the FTC issued informal guidance intended to assist content creators in determining whether their content could be considered subject to COPPA.  FAQ D.3 incorporates much of this previous guidance and reinforces the notion that the “directed to children” factors (which help determine whether COPPA applies) should be considered on a holistic and case-by-case basis.  FAQ D.3 further reiterates that a site or service “will not be considered ‘directed to children’ just because some children visit.”  Similarly, FAQ D.4 now expressly states, “the ‘mixed audience’ category is a subset of the ‘directed to children’ category, and a general audience site does not become ‘mixed audience’ just because some children use the site or service.”  And FAQ F.5 further clarifies:

 “[A] website or online service that is appealing to all ages and not specifically directed at children is not deemed ‘mixed audience’ simply because some children may use the site or service.  In determining whether your site or service is mixed audience, you should consider your intended audience (are you marketing to under 13 users, such as through selling related toys, for example).  You should also determine whether your site or service involves child-oriented activities, such as a dress up game, and whether you have empirical evidence as to the actual users of your video game site.”

This language may be particularly helpful for operators of sites and services that may attract both adults and children, but that are not otherwise targeted to children.  With regard specifically to the use of bright colors and animated characters, the FAQs note that “the FTC recognizes that some animated characters are directed to a general audience.”

  • Age Gates. The FTC noted that mixed audience operators may not use, as a COPPA age gate, a question that a child is unlikely to be able to answer, such as a math problem.
  • Definition of “Personal Information”
    • Uploads by Adults. COPPA regulates only personal information collected “from” children.  The additions to FAQ F.4 clarify that this rule extends to photos uploaded by an adult in a non-child-directed portion of an otherwise child-directed site or service, as well as photos uploaded by an authenticated 13+ user on a mixed audience site or service.
    • Push Notifications. A deletion to FAQ J.9 suggests that information collected from a child’s device that is used to send push notifications within an app – and not just outside the app – falls within the definition of “online contact information” because it permits contacting the user.
  • Parental Notice and Consent
    • Privacy Policies on Apps. Revisions to FAQs C.6 and C.9 indicate that an app operator must post a link to its privacy policy “on the home page of the app.”
    • Non-Enumerated Methods of Verifiable Parental Consent. As previously approved by the FTC, FAQ I.4 now endorses the use of knowledge-based questions and facial recognition to verify a parent’s photo ID as acceptable methods for obtaining verifiable parental consent (VPC).
    • Common Consent Mechanisms. FAQ I.10 reiterates that “common consent mechanisms, such as one done through an app store or other platform” are acceptable methods of obtaining VPC.  It also adds a clarification that operators relying on a common VPC method remain responsible for ensuring that the direct notice they provide to parents “accurately and completely” reflects their information collection practices.  Relatedly, FAQ I.14 states that an app store is permitted to provide a VPC mechanism for developers that operate on its platform, and it will not be considered an “operator” under COPPA “[t]o the extent [it is] simply providing a verifiable parental consent mechanism.”
    • Support for Internal Operations. Consistent with the Statement of Basis and Purpose for the COPPA Rule, an addition to FAQ J.5 expressly reiterates that the “support for internal operations” exception to parental notice and consent includes “intellectual property protection, payment and delivery functions, spam protection, optimization, statistical reporting, and debugging.”
  • Schools. FAQ K.1 now expressly reiterates that, notwithstanding a school’s ability to provide parental consent under COPPA on a parent’s behalf, operators of an educational site or service “should not state in Terms of Service or anywhere else that the school is responsible for complying with COPPA, as it is the responsibility of the operator to comply with the Rule.”
  • Enforcement. FAQ A.6 adds language directing individual consumer complaints to State Attorneys General, who are empowered to enforce COPPA.
Lindsey Tonsager


Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.