
Landmark Verdicts Against Meta and YouTube Signal New Era of Social Media Platform Liability

Client Alert | 4 min read | 03.30.26

In two recent pathbreaking judgments, juries in California and New Mexico held social media companies civilly liable for harming minors who used their products.

On Tuesday, March 24, 2026, a New Mexico jury found Meta liable for failing to protect children from exploitation on its platforms and ordered the company to pay $375 million in damages for consumer-protection violations. The next day, a California jury found Meta and YouTube liable for platform features that cause children to become addicted to their websites and applications, resulting in mental health distress.

These successful lawsuits, alleging personal injury and negligent consumer-product design, are two of dozens brought in recent years by students, schools, and state attorneys general that are pending across the country. These verdicts and the damages awards may well spur more lawsuits against a wider swath of online companies, and even result in new legislation.

New Mexico – Jury Fines Meta Millions for Enabling Child Exploitation

In December 2023, New Mexico Attorney General Raúl Torrez sued Meta in New Mexico state court after his office conducted an undercover investigation into child sex trafficking markets on Facebook and Instagram.

Over a six-week jury trial, law enforcement and the National Center for Missing and Exploited Children (NCMEC) testified that Meta’s reporting of crimes against children on its apps, including child sexual abuse materials, was “deficient.” The jury heard testimony from NCMEC that Meta “generated high volumes of ‘junk’ reports by overly relying on AI [artificial intelligence] to moderate its platforms,” rendering its reporting “useless” and making it more difficult for law enforcement to investigate crimes. The state presented internal messages and documents, along with testimony from child safety experts, to substantiate claims that Meta repeatedly ignored warnings and failed to take steps to protect children.

On March 24, 2026, after less than a day of deliberations, the jury found Meta liable for misleading consumers about the safety of its platforms and endangering children. The jury ordered Meta to pay the maximum statutory penalty of $5,000 per violation, totaling $375 million.

The case continues. A bench trial will commence on May 4 to consider whether Meta created a public nuisance and whether it must fund programs to address the harms caused. In addition, the state seeks an order requiring Meta to enact effective age verification, remove predators from the platform, and prevent minors from sending encrypted messages.

California – Jury Finds Meta and YouTube Negligent for Social Media Addiction

In July 2023, a now 20-year-old woman known in court papers as K.G.M. sued Meta and YouTube in Los Angeles County Superior Court, claiming that the design of their platforms caused her to use them compulsively beginning at the age of six. K.G.M. cited several features of Meta and YouTube as triggering addiction and compulsivity: infinite scroll, constant notifications, autoplay, algorithmic recommendations, and beauty filters. K.G.M. alleged that, as a result of her compulsive use, she was diagnosed with anxiety, depression, and body dysmorphia, and suffers from suicidal ideation.

At trial, K.G.M. presented evidence of Meta’s and YouTube’s internal documents demonstrating that technology executives knew about and discussed the addictive effects of their products on children, as well as evidence that the companies deliberately designed their products to entice and hook young users.

After a month-long jury trial and nine days of deliberation, a nonunanimous jury found that Meta and YouTube negligently designed their platforms and awarded K.G.M. $3 million in damages, with Meta to pay $2.1 million and YouTube to pay $900,000. The jury also found that Meta and YouTube acted with malice, fraud, and oppression, and on that basis awarded another $3 million in punitive damages.

Key Takeaways

These verdicts carry significant implications for consumer-facing digital platforms. Social media networks, gaming platforms, streaming services, online marketplaces, and messaging applications should take note of the following key takeaways:

1. Product Design Choices Need to Comply With User Safety Obligations

Companies should audit their product design choices for features that plaintiffs may characterize as compulsive or addictive and document the reasoning and user-safety considerations that motivated design decisions.

2. Age-Gating and Specific Protections for Minors Can Reduce Liability Risks

Platforms with minor users should review their age-verification mechanisms and consider whether current protections are adequate in light of the evolving litigation and regulatory environment. Companies should ensure that their terms and conditions require age verification, parental consent for underage users, and, where prudent to foreclose claims of negligent design, certain restrictions on personalized algorithm feeds.

In addition to the litigation discussed here, lawmakers in several states have sought to pass laws limiting minors’ access to social media or allowing parents to sue social media platforms for exposing children to certain harmful content, although with limited effect; many of these laws have been stayed or struck down on constitutional grounds. And, on February 4, 2026, California lawmakers introduced Assembly Bill 1709, which would prohibit social media platforms from allowing individuals under 16 years of age to create or maintain an account and would require platforms to implement reasonable measures to prevent users under 16 from accessing or using accounts. Companies should stay abreast of this evolving landscape, because, if these laws survive legal challenges, companies will need to adapt accordingly.

3. Document Encryption Design Choices

During the New Mexico trial, testimony revealed that Meta set chats as encrypted by default despite internal warnings that this design choice would make it harder to investigate child predators. Platforms offering end-to-end encryption or other privacy-protective defaults should carefully assess the child safety implications of those settings and document their decision-making process.

Crowell & Moring will continue to monitor these legal developments. For further information, please contact our team.
