Continuing the flood of recent proposals to amend the scope of Section 230 of the 1996 Communications Decency Act, a bipartisan group of Senators unveiled the Platform Accountability and Consumer Transparency Act (“PACT Act”) last week.  Proposed by Senators Brian Schatz (D-HI) and John Thune (R-SD), the PACT Act comes on the heels of legislative proposals from Senate Republicans, a Department of Justice report with proposed amendments to the law—both of which we analyzed here—and the Trump Administration’s executive order on Section 230.

The PACT Act would amend Section 230 to include an “intermediary liability standard.”  Under this standard, Section 230’s immunity protections would no longer apply to any interactive computer service provider that “has knowledge” of any illegal content or illegal activity occurring on its service and that “does not remove the illegal content or stop the illegal activity within 24 hours of acquiring that knowledge.”  “Knowledge” of the illegal content requires written notification that identifies the illegal content or activity with “information reasonably sufficient to permit the provider to locate” it, as well as a copy of the Federal or State court order determining that such content violated Federal law or State defamation law.

Additionally, the proposal would impose a number of transparency and process requirements on providers of interactive computer services.  Chief among these is the requirement for providers to publish an acceptable use policy that “reasonably inform[s] users about the types of content that are allowed” on their sites; “explain[s] the steps the provider takes to ensure content complies with the acceptable use policy”; and explains how users “can notify the provider of potentially policy-violating content, illegal content, or illegal activity,” which shall include a toll-free number, an email address, and a “Complaint System.”

The Complaint System in the proposal would be a mechanism for users to submit complaints “in good faith” regarding illegal or violative content or any “decision of the interactive computer service provider to remove content posted.”  When a provider receives a notice about illegal content or illegal activities, it “shall remove the content or stop the activity within 24 hours of receiving that notice, subject to reasonable exceptions based on concerns about the legitimacy of the notice.”  Regarding policy-violating content or “removals based on moderation decisions,” the provider has up to 14 days to review the content and determine whether it complies with the acceptable use policy.  If the interactive computer service provider chooses to remove the content based on the user complaint, it must “notify the information content provider and the complainant of the removal and explain why the content was removed” and provide the opportunity to appeal the determination.  The interactive computer service provider is not required to take down the content if it is unable to contact the content provider “after taking reasonable steps to do so” or if it knows that the content “relates to an ongoing law enforcement investigation.”

Providers are also required to publish a “quarterly transparency report” detailing the number of instances in which content was flagged, whether through user complaints or internally by employees or automated detection tools.  This report must be categorized by “the category of rule violated,” “the source of the flag, including government, user, internal automated detection tool,” “the country of the information content provider,” and any coordinated campaign, if applicable.  The report must also include the number of appeals of decisions and how often such appeals resulted in restoration of content, as well as a description of the tools used in enforcing the acceptable use policy.

Small business providers—defined as having fewer than one million monthly active users or visitors and less than $25 million in accrued revenue over the most recent 24-month period—and internet infrastructure services are exempted from many of the Act’s requirements.  In terms of enforcement, the Act would provide the Federal Trade Commission with enforcement authority by treating violations of the Act as unfair or deceptive acts or practices, and the Act expressly applies to both for-profit and non-profit entities.  The Act also enlarges federal enforcement authority under the Communications Decency Act by providing the Attorney General the ability to enforce both federal criminal and federal civil statutes.  Further, state attorneys general would be given enforcement authority for violations premised on any civil law of their state.