On March 5, Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) introduced the Kids Internet Design and Safety (KIDS) Act.  The bill, which covers online platforms directed to children and teenagers under 16 years old, aims to curb the time spent by these minors on such platforms and could dramatically affect advertising and influencer content on kids’ channels.

The bill would prohibit platforms directed to minors from implementing features that encourage users to spend more time online, such as “auto-play” settings that automatically load a new video once the selected one finishes playing, push alerts that encourage users to engage with the platform, and the display of positive feedback received from other users.  It would also ban badges or other visual incentives and rewards based on engagement with the platform.
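For concreteness, the design restrictions above can be pictured as a simple feature gate.  The sketch below is a hypothetical illustration of the compliance logic, not language from the bill; the feature names and the `UserContext` fields are invented for the example.

```python
# Hypothetical sketch of gating engagement features for covered minors
# under the KIDS Act's design restrictions. Illustrative only.
from dataclasses import dataclass
from typing import Optional

# Features the bill would prohibit on platforms directed to minors under 16
# (names are invented for this example)
RESTRICTED_FEATURES = {"auto_play", "push_alerts", "engagement_badges"}

@dataclass
class UserContext:
    age: Optional[int]             # None if the user's age is unknown
    platform_child_directed: bool  # platform as a whole is directed to under-16s

def feature_enabled(feature: str, ctx: UserContext) -> bool:
    """Disable restricted engagement features when the user is covered."""
    if feature not in RESTRICTED_FEATURES:
        return True
    covered = ctx.platform_child_directed or (ctx.age is not None and ctx.age < 16)
    return not covered
```

In practice, a platform directed to minors would likely disable such features by default for all users rather than rely on per-user age signals.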

Additionally, the KIDS Act would prohibit platforms from recommending or amplifying certain content involving sexual, violent, or other adult material, including gambling or “other dangerous, abusive, exploitative, or wholly commercial content.”  The bill would require the implementation of a mechanism for users to report suspected violations of content requirements.
Continue Reading New Bill Seeks to Impose Design Restrictions on Kids’ Online Content and Marketing

On January 30, Representative Kathy Castor (D-FL) introduced the Protecting the Information of our Vulnerable Children and Youth (“PRIVCY”) Act, a bill that would significantly overhaul the Children’s Online Privacy Protection Act (“COPPA”).

Currently, COPPA applies only to personal information collected from children under 13 years old.  The PRIVCY Act would greatly expand COPPA’s scope by making any personal information – including biometric, geolocation, and inferred information, whether collected from the child or not – subject to the law’s requirements.  It also brings a new group of “young consumers” – individuals aged 12 to 18 years old – under the law’s umbrella.  The PRIVCY Act would obligate online sites and services that have actual or constructive knowledge that they “process” personal information about children or young consumers to provide notice to, and obtain consent from, those children’s parents or from those young consumers.  The bill also provides for rights to access, correction, and deletion of children’s and young consumers’ personal information, and it imposes limits on the ability of operators to disclose personal information to third parties.
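The bill’s tiered consent obligation can be summarized in a few lines.  This is a simplified sketch of the structure described above, not the statute’s exact mechanics, and the age cutoffs are approximations of the bill’s definitions:

```python
def required_consent(age: int) -> str:
    """Simplified, illustrative sketch of the PRIVCY Act's consent tiers."""
    if age < 13:
        # COPPA-style children: notice to and consent from a parent
        return "parental consent"
    if age <= 18:
        # "Young consumers": notice to and consent from the individual
        return "young consumer consent"
    return "no heightened consent requirement"
```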

Additionally, the privacy bill would completely repeal COPPA’s safe harbor provision, under which covered operators are deemed compliant with the law if their privacy practices have been certified by FTC-approved organizations.  To date, the FTC has approved seven such safe harbor organizations.
Continue Reading Kids’ Privacy Bill Allowing for Private Suits Introduced in House

On October 22, 2019, the Federal Trade Commission reached a proposed settlement with the developer of three so-called “stalking” apps that enabled purchasers to secretly monitor the mobile devices on which the apps were installed.  Developer Retina-X Studios, LLC and its owner James N. Johns marketed the three apps—MobileSpy, PhoneSheriff, and TeenShield—as a means to monitor children and employees by sharing detailed information about these individuals’ smartphone activities, including their text messages and GPS locations.  The FTC complaint alleges that the developer failed to ensure that the apps would be used for legitimate and lawful purposes, did not secure personal information collected from children and other users, and misrepresented the extent to which that information would be kept confidential.

While the settlement represents the FTC’s first case against developers of tracking apps, the complaint’s allegations rely on provisions of the FTC Act that are broadly applicable to companies that collect, store, and/or monitor users’ personal information, as well as on the Children’s Online Privacy Protection Act (“COPPA”):
Continue Reading FTC Reaches Settlement with Developer of Tracking Apps

Yesterday, the Federal Trade Commission (“FTC”) and the New York Attorney General’s office (“NYAG”) announced a settlement resolving allegations that Google LLC and its subsidiary YouTube, LLC violated the Children’s Online Privacy Protection Act and its implementing rule (together, “COPPA”).  The settlement requires Google and YouTube to pay $136 million to the FTC and $34 million to the NYAG, a total penalty almost 30 times higher than the largest COPPA penalty previously imposed.

Overview of the Complaint and Order

The joint FTC-NYAG complaint alleged that Google and YouTube collected personal information from children under 13 online and used that information to deliver online behavioral advertising, without first providing notice or obtaining verifiable parental consent as required by COPPA.  More specifically, the complaint alleged that Google and YouTube had actual knowledge that certain YouTube channels were child-directed but nevertheless collected persistent identifiers in the form of cookie and advertising identifiers to serve behavioral advertising to viewers of those channels.

In addition to requiring the $170 million total civil penalty and enjoining future COPPA violations, the settlement order requires “fencing-in” relief—which is relief in the form of injunctive provisions that go beyond what is required under existing law.  The order requires that YouTube and Google establish a system on YouTube that requires channel owners to self-designate whether the content they upload is child-directed.  For videos designated as child-directed, YouTube will not collect persistent identifiers for behavioral advertising.  The order further requires that Google and YouTube implement a training program for employees about the system and about COPPA’s requirements overall.  Finally, it imposes compliance reporting and recordkeeping requirements.
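The self-designation system described above can be pictured as a flag that gates identifier collection at ad-serving time.  The function and field names below are invented for illustration and do not reflect YouTube’s actual implementation:

```python
def build_ad_request(video: dict, viewer_id: str) -> dict:
    """Illustrative sketch: for child-directed videos, serve contextual ads
    and attach no persistent identifier."""
    if video.get("made_for_kids"):
        # Channel owner self-designated this upload as child-directed:
        # do not collect cookie or advertising identifiers.
        return {"targeting": "contextual", "identifier": None}
    return {"targeting": "behavioral", "identifier": viewer_id}
```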

The settlement is notable both for what it does—and doesn’t—establish:
Continue Reading FTC and New York Attorney General Reach $170 Million Settlement Against Google and YouTube for Alleged Children’s Privacy Violations

On May 31, 2019, the Cyberspace Administration of China (“CAC”) released the draft Regulation on the Protection of Children’s Personal Information Online (“Draft Regulation”) for public comment. (An official Chinese version is available here and an unofficial English translation of the Draft Regulation is available here.) The comment period ends on June 30, 2019.

As mentioned in our last blog post (available here), CAC issued the draft Measures for Data Security Management (“Draft Measures”) just last week, which set out the general regulatory framework that will govern the collection and use of personal information by network operators (broadly defined as “owners and managers of networks, as well as network service providers”). The release of this new Draft Regulation demonstrates CAC’s intention to set out more stringent requirements for network operators that collect, store, use, transfer, or disclose the personal information of minors under 14 years old. We discuss the key requirements of the Draft Regulation in greater detail below.


Continue Reading CAC Releases Draft Regulation on the Protection of Children’s Personal Information Online

Earlier this month, the UK’s Information Commissioner’s Office published a draft code of practice (“Code”) on designing online services for children. The Code is now open for public consultation until May 31, 2019. The Code sets out 16 standards of “age appropriate design” with which online service providers should comply when designing online services (such as apps, connected toys, social media platforms, online games, educational websites and streaming services) that children under the age of 18 are likely to access. The standards are based on data protection law principles, and are legally enforceable under the GDPR and UK Data Protection Act 2018. The Code also provides further guidance on collecting consent from children and the legal basis for processing children’s personal data (see Annex A and B of the Code). The Code should be read in conjunction with the ICO’s current guidance on children and the GDPR.
Continue Reading ICO issues draft code of practice on designing online services for children

The Article 29 Working Party (WP29) has published long-awaited draft guidance on transparency and consent under the General Data Protection Regulation (“GDPR”).  We are continuing to analyze the lengthy guidance documents, but wanted to highlight some immediate reactions and aspects of the guidance that we think will be of interest to clients and other readers of InsidePrivacy.  The draft guidance is open for consultation until 23 January 2018.

Continue Reading EU Regulators Provide Guidance on Notice and Consent under GDPR

Earlier this week, the Federal Trade Commission and Department of Education announced plans to hold a joint workshop on the application of the Children’s Online Privacy Protection Act (“COPPA”) and the Family Educational Rights and Privacy Act (“FERPA”) to educational technology products and services in the K-12 school environment.  In advance of the workshop, the FTC and Department of Education are soliciting comments on several key questions regarding COPPA and FERPA compliance for educational technology providers.  This is a valuable opportunity for Ed Tech providers to provide feedback to both agencies on the practical application of COPPA and FERPA in this arena.

Continue Reading FTC and Department of Education Announce Joint Workshop on FERPA and COPPA Compliance for Ed Tech

Today, FTC staff published a “Six-Step Compliance Plan” to help businesses comply with the Children’s Online Privacy Protection Act (COPPA).

The guidance, which provides a useful framework for businesses, states explicitly that COPPA applies to connected toys and other devices that collect personal information from children over the Internet.  The FTC’s 2013