The Department of Justice has released a draft bill to amend Section 230 of the Communications Decency Act of 1996, joining the chorus of voices seeking to limit the statute’s liability protections.  The DOJ’s draft bill incorporates recommendations from its June 2020 report analyzing Section 230, as well as President Trump’s Executive Order on Preventing Online Censorship.  According to Attorney General William Barr, DOJ’s proposal “recalibrates Section 230 immunity,” aiming to “incentivize online platforms to better address criminal content on their services and to be more transparent and accountable when removing lawful speech.”

Currently, Section 230 mandates that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”  In addition, providers and users of interactive computer services cannot be held civilly liable for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”  Section 230 also provides immunity from civil liability for “any action taken to enable or make available to information content providers or others the technical means to restrict access to such materials.”  Section 230 provides no immunity from federal criminal statutes.

In a letter to Congress, Attorney General Barr explains that the DOJ’s proposed amendments fall into two categories.  First, the draft bill purportedly intends to “clarif[y] the scope of immunity as applied to content moderation decisions to ensure that platforms cannot hide behind the shield of Section 230 to censor lawful speech in bad faith and inconsistent with their own terms of service.”  To that end, Section 230’s instruction that interactive computer services will not be considered publishers or speakers “shall not apply” to any action “restrict[ing] access to or availability of material provided by another information content provider”; rather, such actions are immune from civil liability only if the material in question falls under the enumerated categories of content that may be restricted in good faith.  However, the bill also adds that no provider or user will be considered a publisher or speaker solely due to actions taken in good faith against content that “the provider or user has an objective reasonable belief violates its terms of service or use.”

Further, the proposal alters the good faith provision of Section 230, requiring providers and users of interactive computer services to have “an objectively reasonable belief” that the content in question is “obscene, lewd, lascivious, filthy, excessively violent, promoting terrorism or violent extremism, harassing, promoting self-harm, or unlawful.”  This language adds three categories of content not currently listed in Section 230—“promoting terrorism or violent extremism,” “promoting self-harm,” and “unlawful”—and removes the important catch-all category of “otherwise objectionable” content, which currently permits removal of content that violates providers’ terms of use or community guidelines.  The bill also adds a definition of good faith, which requires providers to publish public terms of service, moderate content in a manner that is consistent with those terms, and provide notice to content creators regarding moderation decisions.

The proposal also expands the definition of “information content providers” not entitled to Section 230 immunity.  Currently, this category includes any entity that is “responsible, in whole or in part, for the creation or development of information.”  The draft bill adds that this includes “instances in which a person or entity solicits, comments upon, funds, or affirmatively and substantively contributes to, modifies, or alters information provided by another person or entity.”

The second category of amendments, as Attorney General Barr explains, “is aimed at incentivizing platforms to address . . . illicit content online.”  The proposal creates three exceptions to Section 230 immunity: (1) platforms acting “purposefully and with the conscious intent to promote, solicit, or facilitate” content that the provider knew violated federal criminal law, (2) platforms hosting content that violates federal criminal law, if they have actual knowledge of the content and fail to take certain actions enumerated in the statute, and (3) platforms that fail to remove content after receiving notice of a court judgment holding that the content is defamatory or otherwise unlawful.

Finally, the DOJ’s proposal creates several new carve-outs to the scope of Section 230 immunity: the statute’s protections would not apply to civil enforcement actions brought by the federal government, nor to claims of child exploitation and sexual abuse, terrorism, and cyber-stalking, in addition to the current carve-out for sex trafficking.

Kurt Wimmer

Kurt Wimmer is a partner concentrating in privacy, data protection and technology law.  He advises national and multinational companies on privacy, data security and technology issues, particularly in connection with online and mobile media, targeted advertising, and monetization strategies.  Mr. Wimmer is rated in the first tier by Legal 500, designated as a national leader in Chambers USA, and is included in Best Lawyers in America in four categories.  He represents companies and associations on public policy matters before the FTC, FCC, Congress and state attorneys general, as well as in privacy assessments and policies, strategic content ventures, copyright protection and strategy, content liability advice, and international matters.

Chloe Goodwin

Chloe Goodwin is a litigator and regulatory attorney focused on privacy and technology issues. She represents several leading technology companies in litigation and compliance matters relating to electronic surveillance, law enforcement access to digital evidence, cybersecurity, and data privacy.