For over a decade, critics, parents, psychologists, and even former tech executives have accused social media platforms of fueling addiction, anxiety, body-image crises, and declining mental health, especially among kids and teens. Until now, those accusations stayed in headlines, congressional hearings, documentaries, and app store reviews. But for the first time, the question of platform responsibility isn’t being debated online.
👉 It’s being decided in a courtroom.
A jury in California has signaled that tech giants like Meta and YouTube could be held financially responsible for designing platforms that keep users hooked—especially children.
And that could change everything.
In March 2026, those accusations finally landed inside a Los Angeles courtroom, before a jury of ordinary people deciding real liability and potential damages.
This isn’t abstract debate. It’s K.G.M. v. Meta Platforms & YouTube LLC—the first-ever jury trial accusing Big Tech of deliberately designing addictive products that harmed a child’s mental health. As deliberations stretch into their second week (with no verdict as of March 22, 2026), the case has already sent shockwaves through Silicon Valley.
The Jury’s Big Question: Damages
After weeks of testimony, closing arguments, and private deliberations starting mid-March, jurors sent a pivotal note to Judge Carolyn Kuhl: How should damages be calculated?
Legally, this is seismic.
To even ask about compensation, the 12-person jury must have already found (or be leaning toward finding) that:
- The platforms caused harm
- Their design choices were a substantial factor
- The companies failed to warn users adequately or acted negligently
In civil trials like this, reaching the damages phase signals the jury has likely crossed the liability threshold—at least on some counts. Only nine jurors need to agree for a verdict.
No final decision yet—the panel adjourned over the weekend and continues—but the question alone is being read as a strong signal against Meta (Instagram/Facebook) and Google (YouTube).
A Childhood Rewired: The Plaintiff’s Story
At the heart of the case is Kaley G.M. (K.G.M. in court documents), now 20, who testified emotionally about her decade-plus entanglement with Instagram and YouTube.
Key points from her testimony and supporting evidence:
- Exposure began around age 6 with YouTube videos
- Escalated to 10–15 hours daily by her teens
- Developed severe depression, self-harm, body dysmorphia, eating disorders, and suicidal ideation by age 10
- Felt compelled to chase likes, perfect her appearance, and compare herself endlessly
Her lawyers (led by Mark Lanier) called the platforms “digital casinos”—engineered for dopamine hits via:
- Infinite scroll
- Autoplay videos
- Algorithmic feeds pushing extreme/engaging content
- Notification loops and reward systems (likes, streaks)
Internal documents (leaked/whistleblower-sourced) allegedly show execs knew these features hooked young users but prioritized engagement metrics—and ad revenue—over safety.
The Core Allegation: Addiction by Design
Her lawyers presented internal documents suggesting the platforms deliberately engineered their products to maximize engagement among young users.
The features under scrutiny include:
- Infinite scrolling
- Autoplay videos
- Like and reward systems
- Algorithm-driven content loops
These aren’t bugs.
👉 They’re business models.
The argument is simple but powerful:
👉 The longer you stay, the more the platform earns.
And if that leads to compulsive use?
That may no longer be dismissed as “user choice.”
The Defense: It’s Not That Simple
Meta and YouTube argue fiercely:
- Mental health issues stem from many factors (family dynamics, bullying, pre-existing conditions)
- Platforms are tools—harm isn’t inevitable for most users
- Features serve legitimate purposes (discovery, connection)
- Plaintiff might have struggled regardless
Experts testified that correlation doesn’t prove causation, and billions use these apps without severe harm.
Yet the jury’s damages question suggests at least some jurors aren’t fully buying the “it’s not us” defense.
Why This Bellwether Trial Could Reshape Everything
This is the first bellwether in a massive consolidated proceeding (JCCP 5255 in California state court + related federal MDL):
- Over 1,600 plaintiffs (350+ families, 250+ school districts)
- Targets Meta and Google/YouTube; TikTok and Snap settled pre-trial for undisclosed sums
A plaintiff win could:
- Trigger settlements in hundreds/thousands of similar suits
- Force redesigns (limited autoplay, stricter age gates, less aggressive algorithms for minors)
- Accelerate global regulation (already building in EU, UK, some U.S. states)
- Challenge Section 230 protections in new ways (focus on product design vs. content moderation)
A defense win? It strengthens Big Tech’s argument that platforms aren’t liable for user choices or downstream harms—potentially slowing the wave.
Either way, this trial marks a generational pivot: from “user beware” to “platform accountable.”
The Deeper Battle: Attention Economy on Trial
Social media’s core business model—maximize time-on-site → maximize ads—is under direct scrutiny.
If courts decide engineered addiction equals negligence:
- Infinite scroll and autoplay could become regulated (or optional/opt-out)
- Algorithms may need “health” overrides for vulnerable users
- Profit incentives shift from pure engagement to balanced well-being
This isn’t just about one young woman’s pain—it’s about whether the internet’s dominant economic engine can survive legal limits on addictiveness.
What Happens Next—and What It Means for All of Us
Deliberations resume soon. A verdict could land any day.
Win or lose, the precedent will echo:
- For parents: Stronger tools/cultural pressure to limit exposure
- For teens: Potentially safer defaults
- For tech: Billions in liability risk + forced product changes
- For society: A reckoning with how much “free” platforms really cost our mental health
The internet gave us connection, information, creativity. But if staying isn’t always voluntary—if it’s engineered—what price do we pay?
This Los Angeles courtroom may finally force an answer.
And once answered, the internet we scroll tomorrow might look very different from the one we scrolled yesterday.
Final Thought: The Internet at a Crossroads
For decades, tech companies operated under a simple assumption:
👉 If users stay, the product works.
But now, that assumption is being challenged.
What if staying isn’t always a choice?
What if it’s the result of design?
This trial may answer that question—and in doing so, decide the future of social media itself.
Because if platforms can be held accountable for addiction…
👉 The internet will never be the same again.