In Massachusetts, Section 230 Does Not Immunize Meta From Claims That Instagram’s Design Features Injure Children
Client Alert | 4 min read | 04.15.26
Meta continues to face lawsuits around the country alleging that its platforms are designed to induce compulsive use by children. In March 2026, a California jury delivered a landmark verdict that Meta and YouTube were liable for allegedly addictive platform features that resulted in a child’s mental health distress.
On April 10, 2026, the Massachusetts Supreme Judicial Court, the Commonwealth’s highest court, allowed a similar suit brought by the Massachusetts attorney general against Meta to proceed. The court foreclosed one of Meta’s defenses against such claims when it affirmed a trial court’s decision rejecting Meta’s Section 230 immunity argument.
The Commonwealth has sued Meta, as owner of Instagram, alleging that Meta engaged in unfair business practices through Instagram design features that “exploit” a young user’s “neurological vulnerability to social media addiction.” The Commonwealth further alleges that Meta engaged in deceptive and unfair business practices by claiming to have age-gating technology that prevented younger users from using Instagram when, in fact, it did not and Meta refused to invest in effective methods to prevent underage use. Finally, the Commonwealth claims that Instagram is a public nuisance because it addicts children to its use.
In response, among other things, Meta moved to dismiss these claims, asserting that Section 230(c)(1) of the Communications Decency Act, 47 U.S.C. § 230(c)(1), immunizes it from them. Congress enacted Section 230(c)(1) to protect “interactive computer service providers”—platforms like Instagram, X, and Bluesky—from claims premised on content generated by a third party and posted on their platforms. Described as the “twenty-six words that created the internet,” Section 230(c)(1) has been routinely applied to provide immunity from civil liability based on content posted by a third party on a platform’s site.
To warrant Section 230(c)(1) protection, the court noted that a defendant must show (1) that it is a provider or user of an “interactive computer service,” (2) that the claim would “treat the defendant as the publisher or speaker of [the challenged] information,” and (3) that the information was provided by another “information content provider.” Put another way, as the court did, Meta needed to show that it operated a platform and that the lawsuit was based on content provided by a third party. In such a case, immunity applies regardless of “the form of the asserted cause of action [and] regardless of how a claim is pleaded …”
There was no dispute that Meta satisfied the first element. It was the latter two that the court found wanting.
Meta claimed that it met the second prong of the test because its choices were those of a “publisher,” which include how, when, for how long, and to whom to publish information. So, in effect, the Commonwealth was suing Meta “as the publisher.” The court recognized that this broad construction of Section 230’s immunity finds some support in case law. Courts, albeit not universally, have concluded that things like algorithm design intended to show content aligned with a user’s interest or anonymization policies that might allow for sex trafficking are the acts of a “publisher” and therefore fall within the ambit of Section 230(c)(1)’s protections.
Having recognized that some courts have applied the law’s immunity broadly, the court rejected that as a valid interpretation of Section 230(c)(1), pointing to another line of cases with a narrower interpretation of its scope. The court found that Section 230(c)(1) was not intended to protect “publishing activity” in the broad terms urged by Meta. Rather, relying on the plain meaning of Section 230(c)(1) and its legislative history, the court found that the purpose of Section 230(c)(1) was to disrupt common-law publisher liability, under which a publisher is liable for publishing the content of a third party. Narrowing the focus to common-law principles, a claim would “treat the defendant as a publisher,” as that phrase is used in Section 230(c)(1), only when “the publisher” (here, Meta) is being held liable for publishing the content of a third party. In the court’s words, “Congress intended to preserve liability against the original author … but to eliminate ‘the separate route of imposing tort liability on companies that serve as intermediaries …’”
Given this narrower reading of Section 230(c)(1), the court concluded that Section 230 does not immunize a provider that contributes to or creates, in whole or in part, the challenged content. Applying this interpretation, the court concluded that Meta contributed to the challenged features.
As for the third prong — that Meta was publishing third-party content — the court concluded that Meta’s showing fell short. The court found that the Commonwealth’s unfair business practice claim was seeking to hold Meta liable for Instagram design features that promoted or contributed to addiction. None of this was third-party content. In so holding, the court rejected Meta’s argument that third-party content was necessarily implicated because that is what the design elements facilitated. The court explained that Section 230 focuses on whether the harm is claimed to arise from third-party content, and that was not the case here.
Section 230 — its metes and bounds, its application to generative AI, and proposals to reform or repeal it — has been in the news on and off for years. It will continue to be newsworthy as Meta and other platforms face a rash of claims alleging that their platforms are addictive and harm children. While some courts across the country are taking a position similar to that of the Massachusetts Supreme Judicial Court, many are not. At some point, the law’s precise scope (and its application to evolving technologies) will require resolution by the courts or by Congress. Moreover, this decision, and the case more generally, illustrates a growing trend of state attorney general intervention with respect to online content, Section 230 immunity, and the enforcement and regulation of AI.