Recently, an obscure provision of the Communications Act has been thrust into the limelight. Section 230 is now discussed nearly daily on talk shows and in newspapers, and it is drawing increasing criticism from the courts. As a result of this attention, Congress is now working on reforming Section 230, with a bevy of proposed bills that could dramatically affect the free and open Internet in ways that are difficult to predict.

First, as a bit of background, Section 230 has two main subsections. The first subsection is direct:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

47 U.S.C. § 230(c)(1). In the words of the Fourth Circuit, Section 230 “creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service. Specifically, § 230 precludes courts from entertaining claims that would place a computer service provider in a publisher’s role. Thus, lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content—are barred.” Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997).

The second subsection protects content moderation. Again, the relevant language is direct:

No provider or user of an interactive computer service shall be held liable on account of . . . any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected[.]

47 U.S.C. § 230(c)(2). In the words of the Second Circuit, “the provision explicitly provides protection from suit for restricting access to content that providers ‘consider[ ] … otherwise objectionable,’ even if the material would otherwise be constitutionally protected, granting some degree of subjective discretion to service providers who restrict the availability of content in good faith.” Domen v. Vimeo, Inc., No. 20-616, 2021 WL 3072778, at *4 (2d Cir. July 21, 2021).

These two subsections create powerful immunities for platforms: they cannot be held liable for the material their users post, and they are incentivized to take down objectionable material. For 22 years, courts interpreted Section 230 broadly and the provision drew little public attention. In 2018, though, Congress amended Section 230 through the Allow States and Victims to Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act (“FOSTA/SESTA”), which allow victims of sex trafficking to bring federal civil claims against their traffickers. To date, this has been the biggest change to Section 230, though there has been little case law on how FOSTA/SESTA will affect Section 230’s broad immunity.

But now Congress is getting ready to amend Section 230 again. Senators from both parties have introduced bills to reform Section 230, but only one has bipartisan support. Senator Rubio introduced the DISCOURSE Act, which would add a religious liberty exemption to Section 230’s immunity, clarify the “otherwise objectionable” standard, and remove immunity for algorithmic amplification. Senators Warner, Hirono, and Klobuchar introduced the SAFE TECH Act to modify Section 230’s interplay with civil rights, victims’ rights, and antitrust laws, among other reforms. Senator Klobuchar also recently introduced the Health Misinformation Act, which would strip Section 230 immunity from companies that use algorithms to promote health misinformation on their websites. Senators Schatz and Thune re-introduced the bipartisan PACT Act, which focuses on increasing transparency around platforms’ moderation choices and removing Section 230 immunity for material that a court has found illegal. To date, none of these bills has received a hearing or markup.

On July 28, 2021, the House Energy and Commerce Republicans released a slate of 32 bills to “hold Big Tech accountable.” The bills cover a wide variety of topics, ranging from ID verification of users to studying whether big tech firms should contribute to the universal service fund. The bulk of the bills, though, target Section 230 in one way or another. The bills apply to companies with revenue greater than $3,000,000,000 or more than 300,000,000 monthly active users. For those companies, the bills further target “Bad Samaritans”: companies that knowingly promote illegal content, that allegedly discriminate on the basis of race, sex, political affiliation, or ethnicity, or that allow doxxing. Intriguingly, one bill would potentially codify net neutrality protections. Additionally, many of the bills would allow FTC enforcement or mandate that covered companies submit reports to the FTC regarding their content moderation policies.

The slew of bills targeting Section 230 indicates that some sort of legislative action is likely to come, although it is nearly impossible to predict what that action will look like because the two parties disagree fundamentally about what the problem with Section 230 is. Republicans believe that internet platforms are taking down too much content, while Democrats believe that platforms are leaving up too much. A potential compromise, though, could focus on enhancing transparency in content moderation decisions, perhaps through mandatory disclosures of takedown actions and an opportunity to appeal them.

Congress should be wary before acting, though, as the legacy of FOSTA/SESTA shows that tinkering with Section 230 can lead to disastrous consequences. Recently, the GAO released a report finding that FOSTA/SESTA has rarely been used and may actually have made it more difficult to prosecute sex trafficking crimes. Even before that report, some in Congress had sought a study of FOSTA/SESTA and questioned its overall effect.