John Roach, Esq. | February 23, 2026 | All Things Trial
The Meta Addiction Lawsuit Trial: A Landmark Battle Over Social Media’s Impact on Young Minds
A Los Angeles Superior Court jury is currently hearing evidence in what may be the most consequential product liability trial involving a technology company in American history. The case — a bellwether trial in the coordinated litigation against Meta Platforms and Google — tests whether social media platforms can be held legally accountable as defective products for allegedly engineering addictive features that harm children’s and teenagers’ mental health. The trial began in early February 2026 before Judge Carolyn B. Kuhl.
I follow cases like this closely because they define where product liability law is heading — and because the theory at the center of this litigation, that a product can cause serious psychological harm through deliberate design choices, has direct implications for how courts evaluate corporate accountability for harm more broadly. As a San Francisco personal injury attorney who handles traumatic brain injury cases and other serious injury claims, I believe the evidentiary framework being developed in this trial is worth understanding in detail.
The Litigation Landscape
This bellwether trial is the first major jury proceeding in a wave of thousands of coordinated cases — including over 2,000 in federal multidistrict litigation and additional claims in California’s JCCP 5255 proceedings. TikTok and Snap have already settled related individual claims for undisclosed sums. Meta and Google are now facing the courtroom test that the rest of the industry has been watching.
Bellwether trials in mass litigation serve the same function as the Jaylynn Dean v. Uber trial in the rideshare sexual assault MDL — they test key liability theories, produce jury verdicts that calibrate settlement expectations across thousands of pending cases, and often trigger industry-wide changes in how companies design and market their products. A plaintiff verdict here would substantially increase settlement pressure across the full docket.

The Core Theory: Defective Design Through Addictive Engineering
The plaintiffs’ theory is that platform features — infinite scroll, autoplay videos, push notifications, personalized recommendation algorithms, likes, streaks, and beauty filters — function like slot machines. They deliver unpredictable rewards that trigger dopamine responses and foster compulsive use. The argument is that these features were deliberately engineered to exploit the developing brains of children and teenagers, who lack fully formed impulse control and are especially vulnerable to behavioral addiction.
The suits allege that Meta and Google knew about these risks through internal research but prioritized engagement metrics and revenue over user safety. Specifically, the companies allegedly failed to enforce age restrictions, failed to warn parents and users about addiction risks, and failed to curb algorithmic amplification of harmful content including self-harm promotion, cyberbullying, unrealistic body image content, and sextortion material.
Meta’s defense rests on several arguments: there is no formal scientific consensus classifying social media use as a diagnosable addiction comparable to gambling or substance use; the alleged harms stem from user-generated content potentially protected by Section 230 immunity rather than platform design; and the company has implemented safety tools including time limits, parental controls, and improved underage detection — though plaintiffs characterize these measures as inadequate and largely ineffective.
The Plaintiff: K.G.M.
The bellwether plaintiff is a now-20-year-old California woman identified in court documents as K.G.M. She began using YouTube around age 6 and Instagram around age 9 or 10. Her lawsuit claims that compulsive use of these platforms led to addiction-like behavior, worsened depression and anxiety, body dysmorphia, self-harm, suicidal ideation, cyberbullying, and sextortion. Her mother has supported the case throughout the litigation.
K.G.M.’s case serves as the representative test for thousands of similar claims from families, individuals, school districts, and state attorneys general alleging broader public health harms from social media platform design.
Zuckerberg’s Testimony: Key Moments
The trial’s second week produced its most significant moments when Meta CEO Mark Zuckerberg testified on February 18 — his first jury appearance on child safety issues. Plaintiffs’ attorney Mark Lanier confronted Zuckerberg with a 2020 internal Meta document revealing that 11-year-olds were four times more likely to continue using the platforms than older users, raising the inference that the company was deliberately cultivating underage engagement.

Zuckerberg repeatedly denied that Meta allows or targets children under 13, insisting the company has strict policies and has improved detection mechanisms. He pushed back against accusations that he misled lawmakers, claiming plaintiffs’ counsel was mischaracterizing internal communications, and defended the company’s shift away from maximizing screen time toward what he described as “meaningful social interactions.” Plaintiffs countered that the algorithms still prioritize engagement patterns that harm youth mental health regardless of how that engagement is labeled internally.
Expert testimony and unsealed internal documents compared platform mechanics to gambling design, presenting evidence that the reward structures rewire developing brains in ways that make voluntary disengagement difficult. Internal Meta research indicated that parental controls have limited impact on compulsive teen use, and that children with prior trauma are disproportionately affected.
What This Trial Means for Personal Injury Law
From a product liability standpoint, this trial is testing whether “addiction by design” is a viable theory for holding technology companies accountable for psychological harm — in the same framework that holds pharmaceutical companies accountable for concealing known drug risks, or automakers accountable for knowingly defective vehicle designs. The legal infrastructure being built in this litigation — the expert frameworks, the internal document discovery, the damages models for psychological harm — will have applications well beyond social media.
For families dealing with children who have suffered serious mental health consequences from social media use, the practical advice is straightforward: document usage patterns and timelines, preserve mental health records, and consult an attorney promptly. The statute of limitations in California personal injury cases is generally two years from the date of injury — and in cases involving minors, the clock typically does not start until the child turns 18. Time to act is not unlimited.
The trial is expected to continue for several more weeks. I will post updates in the All Things Trial category as significant developments occur. If you have questions about a personal injury claim in San Francisco or the Bay Area, call me at (415) 851-4557 for a free consultation.
Frequently Asked Questions: Social Media Addiction Lawsuits
Can I sue a social media company if my child was harmed by platform addiction?
Potentially yes. The ongoing litigation against Meta and Google is testing the legal theory that social media platforms are defective products that cause foreseeable psychological harm through deliberately addictive design. If a plaintiff verdict is returned in the current bellwether trial, it would significantly strengthen the legal framework for individual family claims. Consult a personal injury attorney to evaluate the specific facts of your child’s situation, the timeline of harm, and applicable statutes of limitations.
What is a bellwether trial?
A bellwether trial is a test case selected from a large group of similar lawsuits to be tried first. The verdict provides both sides with a realistic assessment of how juries evaluate the key liability theories and damages evidence, which directly influences settlement negotiations across the broader case pool. In the Meta and Google litigation, this bellwether trial is the first major jury test of whether social media platforms can be held liable as defective products for harm to children.
Does Section 230 protect social media companies from these lawsuits?
Section 230 of the Communications Decency Act generally protects online platforms from liability for content posted by third-party users. Meta and Google have argued that the alleged harms stem from user-generated content protected by Section 230. Plaintiffs counter that the claims are based on platform design decisions — the algorithms, features, and mechanics the companies themselves built — not on the content users post. Whether platform design is protected by Section 230 is one of the central legal questions this litigation is testing.
How long do I have to file a social media harm claim in California?
California’s general personal injury statute of limitations is two years from the date of injury under Code of Civil Procedure Section 335.1. For claims involving minors, the statute is typically tolled until the child turns 18, at which point the two-year period begins. A child harmed at age 12 would generally have until age 20 to file a lawsuit. Contact an attorney promptly to evaluate the specific deadlines that apply to your situation.
What evidence matters in a social media addiction lawsuit?
Key evidence includes records of the child’s platform usage history and account activity, mental health records documenting the onset and progression of symptoms, school records showing academic and behavioral changes, communications within the platforms documenting bullying or harmful content exposure, and medical expert testimony linking the platform use to the specific psychological harm. Preserving this evidence promptly is important — platform accounts can be deleted, and records may become harder to obtain over time.
Disclaimer: This blog post is for informational purposes only and does not constitute legal advice. Consult a licensed attorney for advice specific to your situation.