On September 29, 2021, the Senate Committee on Commerce, Science, and Transportation held a hearing entitled “Protecting Consumer Privacy.”  The hearing centered on strengthening consumer privacy rights, including by increasing the FTC’s resources and creating a comprehensive federal privacy law.

To explore these issues, the Committee invited David Vladeck, Professor and Faculty Director of the Center on Privacy and Technology at Georgetown Law and former Director of the FTC Bureau of Consumer Protection; Morgan Reed, President of The App Association; Maureen Ohlhausen, Partner and Section Chair (Antitrust & Competition Law) at Baker Botts and former Acting Chairman of the FTC; and Ashkan Soltani, Independent Researcher and Technologist and former Chief Technologist of the FTC.
Continue Reading Consumer Privacy Hearing Focuses on Expanding FTC Resources, Creating Federal Privacy Law

To add to the growing number of bills that would amend or repeal Section 230 of the Communications Decency Act, last month Senator Amy Klobuchar (D-MN) introduced the Health Misinformation Act of 2021 (S. 2448).  Senator Ben Ray Luján (D-NM) cosponsored the bill.

The bill would amend Section 230 to revoke the Act’s liability shield for platforms that use algorithms to promote health misinformation during a declared public health emergency.

To add to the growing list of federal privacy frameworks introduced this year, Senator Amy Klobuchar (D-MN) has re-introduced the bipartisan Social Media Privacy Protection and Consumer Rights Act of 2021 (S. 1667).  Senator Klobuchar originally introduced the bill in 2018 and again in 2019, but it did not advance out of committee in either instance.  Senators Kennedy (R-LA), Burr (R-NC), and Manchin (D-WV) have co-sponsored the bill.

Key provisions in this bill include:
Continue Reading New Privacy Bill Provides Opt-Out Rights and New Data Security Requirements

As the push for Congress to pass comprehensive consumer privacy legislation intensifies, Rep. Suzan DelBene (D-WA) has re-introduced the Information Transparency & Personal Data Control Act, a compromise proposal that contains provisions sought by both parties.  The bill would create national data privacy standards and expand the enforcement authority of the Federal Trade Commission (FTC) and state attorneys general.
Continue Reading Bill Introduced Would Preempt State Laws and Strengthen FTC Enforcement 

A number of legislative proposals to amend Section 230 of the 1996 Communications Decency Act (“Section 230”) have already been introduced in the new Congress.  Section 230 provides immunity to an owner or user of an “interactive computer service” — generally understood to encompass internet platforms and websites — from liability for content posted by a third party.

On February 8, 2021, Senator Mark Warner (D-VA) introduced the Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms Act (“SAFE TECH Act”), cosponsored by Senators Amy Klobuchar (D-MN) and Mazie Hirono (D-HI).  The bill would narrow the scope of immunity that has been applied to online platforms.  Specifically, the SAFE TECH Act would amend Section 230 in the following ways:
Continue Reading SAFE TECH Act Would Limit Scope and Redesign Framework of Section 230 Immunity

Senators Lindsey Graham (R-SC), Tom Cotton (R-AR), and Marsha Blackburn (R-TN) have introduced the Lawful Access to Encrypted Data Act, a bill that would require tech companies to assist law enforcement in executing search warrants that seek encrypted data.  The bill would apply to law enforcement efforts to obtain data at rest as well as data in motion.  It would also apply to both criminal and national security legal process.  This proposal comes in the wake of the Senate Judiciary Committee’s December 2019 hearing on encryption and lawful access to data.  According to its sponsors, the purpose of the bill is to “end[] the use of ‘warrant-proof’ encrypted technology . . . to conceal illicit behavior.”

The bill has three main provisions:
Continue Reading Lawful Access to Encrypted Data Act Introduced

The Brazilian Senate unanimously approved a bill today that would delay implementation of Brazil’s General Data Protection Law (LGPD) until January 1, 2021, and enforcement of fines and penalties until August 1, 2021.  The LGPD is currently scheduled to take effect on August 15, 2020.

The draft bill — one of four pending in the Senate that propose to delay implementation of the LGPD — is broad in scope, encompassing not only the LGPD, but also statutes of limitations and sanctions for certain anti-competitive conduct.  Senator Antonio Anastasia, the sponsor of the bill, explained that the bill is intended to give businesses an opportunity to focus on other urgent matters arising from the COVID-19 pandemic.
Continue Reading Brazil Senate Approves Bill Delaying LGPD Enforcement

On March 5, Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) introduced the Kids Internet Design and Safety (KIDS) Act.  The bill, which covers online platforms directed to children and teenagers under 16 years old, aims to curb the time spent by these minors on such platforms and could dramatically affect advertising and influencer content on kids’ channels.

The bill would prohibit platforms directed to minors from implementing features that encourage users to spend more time online, such as “auto-play” settings that automatically load a new video once the selected one finishes playing, push alerts that encourage users to engage with the platform, and the display of positive feedback received from other users.  It would also ban badges or other visual incentives and rewards based on engagement with the platform.

Additionally, the KIDS Act would prohibit platforms from recommending or amplifying certain content involving sexual, violent, or other adult material, including gambling or “other dangerous, abusive, exploitative, or wholly commercial content.”  The bill would require the implementation of a mechanism for users to report suspected violations of content requirements.
Continue Reading New Bill Seeks to Impose Design Restrictions on Kids’ Online Content and Marketing

On February 14, 2020, California State Assembly Member Ed Chau introduced the Automated Decision Systems Accountability Act of 2020, which would require any business in California that provides a person with a program or device that uses an “automated decision system” (“ADS”) to establish processes to “continually test for biases during the development and usage of the ADS” and to conduct an impact assessment on that program or device.

ADS is defined broadly as “a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes a decision or facilitates human decision making, that impacts persons.”  The required ADS impact assessments would study the various aspects of the ADS and its development process, “including, but not limited to, the design and training data of the ADS, for impacts on accuracy, fairness, bias, discrimination, privacy, and security.”  At minimum, the assessments must include “[a] detailed description of the ADS, its design, training provided on its use, its data, and its purpose” and “[a]n assessment of the relative benefits and costs of the ADS in light of its purpose,” with certain factors such as data minimization and risk mitigation required in the cost-benefit analysis.

The provider of the ADS also must determine whether the ADS “has a disproportionate adverse impact on a protected class,” examine whether it serves “reasonable objectives and furthers a legitimate interest,” and consider alternatives or reasonable modifications that could be incorporated “to limit adverse consequences on protected classes.”
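The bill does not prescribe any particular methodology for detecting a “disproportionate adverse impact.”  Purely for illustration, the sketch below shows one common statistical screen a provider might run: comparing favorable-outcome rates across groups under the “four-fifths rule” drawn from employment-selection analysis.  The threshold, group labels, and function names are assumptions for this example, not anything taken from the bill.

```python
# Illustrative only: one common screen for "disproportionate adverse impact"
# is the four-fifths (80%) rule. The bill does not mandate this test or
# threshold; all names below are hypothetical.
from collections import defaultdict

def selection_rates(decisions):
    """Compute the favorable-outcome rate for each group.

    decisions: iterable of (group_label, favorable) pairs, where
    favorable is True if the ADS produced a favorable outcome.
    """
    totals = defaultdict(int)
    favorable = defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        if outcome:
            favorable[group] += 1
    return {g: favorable[g] / totals[g] for g in totals}

def disparate_impact_flags(decisions, threshold=0.8):
    """Flag groups whose favorable-outcome rate falls below `threshold`
    times the highest group's rate (the four-fifths rule)."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

if __name__ == "__main__":
    sample = ([("group_a", True)] * 60 + [("group_a", False)] * 40
              + [("group_b", True)] * 40 + [("group_b", False)] * 60)
    print(disparate_impact_flags(sample))
    # {'group_a': False, 'group_b': True} -> group_b warrants closer review
```

An impact assessment under the bill would, of course, need to go well beyond a single statistical screen of this kind, reaching the design and training data of the ADS and its effects on accuracy, fairness, bias, discrimination, privacy, and security, as the quoted language requires.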
Continue Reading California Introduces Bill to Regulate Automated Decision Systems