They Knew

Section 230 held. The jury ruled against Meta and YouTube anyway. A Los Angeles jury found both companies liable for malice, oppression, and fraud — triggering punitive damages calculated against $632 billion in combined stockholder equity. This is the tobacco moment for Big Tech.

Episode 8 · 8 min read
meta · youtube · social-media · children · big-tech · section-230 · litigation · product-design · addiction
Compensatory Damages: $3M
New Mexico Penalty: $375M
Punitive Phase Basis: $632B
Pending Lawsuits: ~2,000

A Los Angeles jury today found Meta and YouTube liable for the mental health harm suffered by Kaley, a 20-year-old woman who began using Instagram at age 11 and YouTube at age 6. Compensatory damages: $3 million, with Meta responsible for 70% and YouTube for 30%. The jury also found both companies acted with "malice, oppression, or fraud" — triggering a separate punitive damages phase.

The judge confirmed Section 230 protections throughout the trial. Jurors were explicitly told not to consider the content Kaley saw on the platforms. The companies were not on trial for what their users posted. They were on trial for how their products were designed — and for what their own internal research told them about those designs.

Tobacco companies were not destroyed by government regulation. They were destroyed when their own internal research became public. The research showed they knew nicotine was addictive, knew their products caused cancer, and kept selling anyway. Today's verdict follows the same logic. Meta and YouTube ran the research. They found the harm. They kept optimizing for engagement.


The Brief

  • LA jury found Meta and YouTube liable on all counts — negligence, failure to warn, and malice. Compensatory damages $3M. Punitive damages phase now begins, with the jury able to reference Google's $415B and Meta's $217B in stockholder equity as the calculation basis. Both companies plan to appeal. ✓ CBS News · Mar 25, Fox Business · Mar 25

  • Section 230 did not protect them — because this was never a content case. Plaintiff's attorneys targeted product design features: infinite scroll, autoplay, algorithmic push notifications. The legal strategy: companies cannot hide behind content immunity when what they manufactured was the addiction mechanism itself. ✓ CNN · Mar 25, NPR · Mar 25

  • The internal research is the verdict's foundation. Evidence showed Meta understood how addictive its platforms were among teenagers, was actively researching the issue, and used its findings to increase engagement — not to protect young users. Jurors found Meta engaged in "unconscionable" trade practices that exploited the vulnerabilities of children. ✓ TechCrunch · Mar 25, WUFT · Mar 25

  • This is a bellwether trial. The court chose this case to help determine outcomes for approximately 2,000 connected lawsuits. A federal trial covering consolidated nationwide claims begins this summer. The $3M is the opening bid, not the final number. ✓ CNBC · Mar 25, ABC News · Mar 25


What Section 230 Does — and Does Not — Cover

Section 230 of the 1996 Communications Decency Act gives tech platforms immunity from liability for content posted by users. That protection held throughout this trial. Jurors were specifically instructed to ignore the content Kaley encountered on Instagram and YouTube.

What was on the table was the architecture underneath it. Infinite scroll — a feed with no natural stopping point, designed so users never encounter an end. Autoplay — the next video begins before you have decided to watch it. Algorithmic notifications — push alerts timed and calibrated to pull users back at the moment they are most likely to re-engage. These are not content decisions. They are product decisions. Made with full knowledge of what they do to the developing brain of a child.

The plaintiff's legal strategy was precise: do not fight Section 230, go around it. The product was the weapon, not the content. The jury agreed.


The Internal Research Is the Story

The most damaging element of this trial was not Kaley's testimony. It was Meta's own documents.

Evidence presented during the seven-week trial showed that Meta had internal research demonstrating Instagram's harmful effects on teenage girls — particularly around body image and mental health. The company knew. It ran the studies. It found the harm. And then it continued optimizing for engagement metrics rather than changing the design features the research identified as dangerous.

This is the tobacco parallel made explicit. Tobacco companies ran internal research on nicotine addiction and lung cancer for decades before those documents became public. The research was not used to make cigarettes safer. It was used to make cigarettes more addictive. When those internal documents came out in discovery, the litigation was not about whether smoking caused cancer. It was about whether the companies knew and concealed it.

"For years, social media companies have profited from targeting children while concealing their addictive and dangerous design features. Today's verdict is a referendum — from a jury, to an entire industry — that accountability has arrived."

— Plaintiff's attorneys · Los Angeles Superior Court · March 25, 2026 ✓ NPR · Mar 25

The jury found Meta acted with "malice, oppression, or fraud." That is not a finding of negligence. That is a finding of intent. The distinction matters enormously for the punitive damages phase: negligence produces compensatory awards calibrated to the harm. Fraud produces punitive awards calibrated to the wrongdoer's capacity to pay — $217 billion for Meta and $415 billion for Google.


YouTube's Defense — and Why It Did Not Work

Google's spokesperson said after the verdict: "This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site."

The argument has surface logic. YouTube is different from Instagram. You watch content rather than scroll through a social graph. A reasonable person might watch YouTube for hours, as they might watch television, without it constituting addiction.

The problem is YouTube Shorts. YouTube Shorts is an infinite-scroll vertical video feed algorithmically personalized to each user — indistinguishable in design from TikTok or Instagram Reels. It was a deliberate competitive response to those products, built with full knowledge of what infinite-scroll personalized feeds do to engagement, and to users.

You cannot argue that you are a television platform while simultaneously running a product designed to replicate the most addictive features of social media. The jury assigned YouTube 30% of the liability — not innocent, but less culpable.


New Mexico: The Other Verdict, The Other Theory

The Los Angeles verdict did not happen in isolation. The day before, a New Mexico jury ordered Meta to pay $375 million — a separate case, a separate legal theory, the same underlying conclusion.

New Mexico's case was brought by the state Attorney General. The theory: not design defect, but failure to protect children from sexual predators on the platform. The AG alleged Meta knew its platforms were being used to target and exploit children, had internal data documenting the scale of the problem, and concealed it while publicly claiming to prioritize child safety.

Two cases. Two states. Two legal theories. One finding: Meta ran the research, found the harm, and did not tell the people who needed to know.

"Juries in New Mexico and California have recognized that Meta's public deception and design features are putting children in harm's way."

— New Mexico Attorney General Raúl Torrez · March 25, 2026 ✓ WUFT · Mar 25

Two simultaneous verdicts are not additive — they are multiplicative. A single verdict can be dismissed as a rogue jury. Two verdicts in two jurisdictions under two different legal theories, within 24 hours, is a pattern. This is what the beginning of the tobacco reckoning looked like — not one case, but a cascade.


What Happens Next

The punitive damages phase is the real trial. The jury has found fraud. Legal experts note that punitive damages in fraud cases typically run significantly higher than compensatory awards — sometimes by multiples of ten or more. The jury can reference Google's $415B and Meta's $217B in stockholder equity. The $3M is not a ceiling. It is a floor.

The 2,000 pending cases now have a template. Every family and school district with a pending suit just received a verdict that says: a California jury, after seven weeks of evidence, found these companies committed fraud against children. The subsequent litigation will not relitigate the core question. It will argue about damages.

Congress has a new pressure point. The political environment has resisted federal online safety legislation — Section 230 has powerful defenders and Big Tech lobbying has been effective. But a jury verdict finding fraud is different political ammunition. The Master Settlement Agreement of 1998 — $206 billion paid to 46 states by tobacco companies — began with exactly this kind of verdict. The question is whether Congress acts before the litigation compels a settlement at scale.


The Read

This is not a story about government regulating the internet. It is a story about the market's accountability mechanism activating — late, imperfectly, but activating.

The free market's first requirement is that consumers can make informed choices. When a company hides its own safety research from the parents of children using its products, it removes the informed part of informed consent. That is not innovation. That is fraud. Parents cannot make good decisions about their children's Instagram use if Meta has buried the internal study showing Instagram damages teenage girls' mental health. The jury did not punish Meta for building a product people enjoy. It punished Meta for concealing what it knew about who that product was harming.

The tobacco parallel is structural, not rhetorical. Tobacco companies were not broken by regulators — they were broken by discovery. Their own documents, compelled in litigation, showed they had run the research, found the harm, and suppressed it. The Master Settlement Agreement came not from legislation but from juries in cases exactly like this one. Big Tech is now at the same inflection point.

The free-market insight: $3 million is the smallest number in this story. The punitive damages phase begins with $632 billion in combined stockholder equity on the table. The 2,000 pending cases have a fraud finding to build on — and they do not need to prove content harm, only design harm. Meta and YouTube did not lose because they built products children love. They lost because they ran the research, found the harm, and kept optimizing anyway. That is not a Silicon Valley innovation story. That is what accountability looks like when the market finally gets the information it was denied. ~ Framework


Market Truths covers finance, markets, and geopolitics three times weekly — Tuesday, Thursday, and Saturday. Available on GanjingWorld, Medium, and Substack. Originally published at markettruthspod.com.

Source Index

Fox Business / AEI · 2026-03-25 · www.foxbusiness.com