Section 230 Reform: What Websites Need to Know Now
Client Alert | 4 min read | 07.02.25
Section 230 of the Communications Decency Act of 1996 has been credited with “creating” the internet by immunizing websites and platforms from lawsuits arising from content posted by third-party users. Specifically, an internet company is not liable under conventional common law tort theories, such as defamation or slander, for publishing or posting content drafted by another person, however loathsome, violent, or otherwise hateful that content is. At the same time, Section 230 also immunizes a website or platform that engages in good-faith moderation of content it deems to violate its terms of use and conditions or community standards.
Politicians, courts, Congress, federal agencies, and state attorneys general have called for Section 230 reform, as follows:
- Republicans and Democrats on Capitol Hill seek reform or repeal of Section 230, but for different reasons. In general, Democrats complain that Section 230 has become too large a shield for “Big Tech” by immunizing platforms from lawsuits over harmful content that targets children. Republicans largely have been critical of the content-moderation provision of Section 230 because, in their view, it has been improperly exploited to enable social media platforms to deplatform users or remove content that expresses views they dislike for political or cultural reasons, giving rise to a “liberal,” left-leaning internet where conservative voices are disfavored or removed from the zeitgeist.
- Under President Trump, the U.S. Department of Justice, Federal Trade Commission (FTC), and Federal Communications Commission (FCC) have publicly criticized “Big Tech” for engaging in “censorship” of conservative voices and have said, in the words of FCC Chair Brendan Carr, that they will “smash the censorship cartel.” Key FTC and FCC officials have (1) indicated that future enforcement under antitrust and consumer protection laws is likely, and (2) openly expressed antipathy to the content-moderation provision of Section 230 and signaled that they would challenge it in court.
- Lawmakers on both sides of the aisle and the FTC have been vocal in condemning websites and platforms for failing to filter or moderate pornography online, or to remove (or otherwise make it more difficult to access) content that is dangerous to children.
- States have been enacting a variety of legislation, from requiring changes to the product design of apps and websites to requiring social media companies to disclose their content-moderation choices and decisions. California, New York, and Texas have been among the most active in this area.
- A few members of the U.S. Supreme Court have signaled a willingness to revisit Section 230; others have chosen to sidestep questions relating to it, leaving the issue to the political and legislative process. For example, last year, a majority of the Court declined to take up a case that called for an examination of the scope of Section 230, but, writing in dissent from that decision, Justices Clarence Thomas and Neil Gorsuch argued that social media platforms have wrongly used the provision as a “get-out-of-jail-free card.”
- In the meantime, the Court has ruled that social media companies’ algorithms and feeds may be expressive compilations deserving First Amendment protection. At the same time, however, the Court has cautioned that First Amendment protection is not unlimited—there is a point at which technology companies’ reliance on algorithms, artificial intelligence, and other technological advances may strip their online expressions and products of those protections. For example, the Court recently upheld, in Free Speech Coalition v. Paxton, a Texas law that required age verification for access to websites with sexually explicit content. And several justices, notably Justice Gorsuch, have said publicly that they believe Section 230’s protections do not extend to generative AI.
What does all this mean for your website or platform?
- For e-commerce and retail, Section 230 has traditionally protected platforms and sites from liability for defects in products or content posted or listed by third parties, and the platform or site has not been responsible for content required by regulation, such as California’s Prop 65 warnings. However, recent court decisions in California may make the platform or site liable for the content of, or omissions in, third-party-provided material.
- Courts (especially the U.S. Court of Appeals for the Ninth Circuit) have shown a willingness to enforce a company’s Terms of Service as independent duties. Therefore, if you promise to moderate content, you may be liable to private plaintiffs if you do not do so in accordance with your Terms. But there is still a healthy amount of skepticism toward that approach, even within the Ninth Circuit.
For its part, the FTC has indicated a willingness to enforce Terms of Service as well; if Section 230 is repealed or reformed, this is an area where you can expect to see enforcement actions. With that in mind, consider the following:
- Catalogue your user-generated content and understand how you moderate content. Make sure your content moderation decisions conform to your policies and terms.
- Understand what AI and technology you employ in your online and digital products. Conventional wisdom has been that the more AI your content moderation and handling of third-party content deploys, the more likely it is to be protected by Section 230 as a mere republication rather than an act of editing. Understand what your AI does: if it is creating entirely new content through a generative model, courts may conclude that it does not receive Section 230 protections.
While Section 230 reform or repeal may still seem far away in the United States, foreign countries are providing a blueprint that the U.S. may follow. In June 2025, Brazil’s Supreme Court ruled that social media platforms can be held accountable for illegal user-generated content, marking a major shift in that country’s regulation of online content. Moreover, six of Brazil’s 11 justices backed fines for non-removal, putting pressure on platforms to police their content. Will the U.S. follow suit? Time will tell.