A California jury found that design flaws in Instagram and YouTube led a teenager into addiction, awarding $6 million in damages and potentially setting the tone for thousands of lawsuits nationwide.
A California jury recently reached a landmark decision: it unanimously agreed that social media platforms should bear responsibility for online addiction, found that Instagram and YouTube were negligent in their platform design, and ordered the companies to pay the plaintiff, identified as KGM, $6 million. The ruling could become a key reference point for similar lawsuits to come.
The plaintiff is a 20-year-old woman identified as KGM. In court, she testified that she became addicted to YouTube starting at age 6 and began using Instagram at age 9. Her legal team argued that the platforms built multiple design features intended to hook young users, including autoplay, real-time notifications, and an “infinite scrolling” feed that delivers content nonstop.
After more than 40 hours of deliberation, a majority of jurors found that the platforms were negligently designed, and that both companies knew their products posed potential danger to minors yet failed to provide adequate warnings. The jury awarded the plaintiff $3 million in compensatory damages and, finding that the platforms had acted with malice or fraud, recommended an additional $3 million in punitive damages, bringing the total to $6 million (about NT$190 million).
In response to the verdict, Meta and YouTube both said they disagree and plan to appeal. Google spokesperson Jose Castaneda argued that YouTube should be viewed as a “responsible streaming media platform,” not a social media website, while Meta emphasized that the factors behind teenagers’ mental health are extremely complex and should not be blamed on a single application. In court, Meta argued that the plaintiff’s mental health issues were tied to her family environment. Under the standard the jury applied, however, the plaintiff did not need to prove that social media was the sole direct cause of her mental health problems, only that it was a “substantial factor” in causing the harm. Jurors also found the inconsistent testimony of Meta founder Mark Zuckerberg hard to credit. In apportioning responsibility, the jury assigned 70% to Meta and the remaining 30% to YouTube, reflecting its assessment of how each platform’s interaction mechanisms contributed to the harm.
The proceedings deliberately steered clear of contentious questions about specific content. Under Section 230 of the U.S. Communications Decency Act of 1996, technology companies are generally shielded from legal liability for content published by third parties. The jury was therefore instructed not to consider the specific posts or videos the plaintiff viewed, and to focus instead on the platforms’ “structural and design framework” itself.
The plaintiff’s litigation strategy bypassed that legal firewall by treating the addictive nature of social media as a kind of “product defect.” Peter Ormerod, a professor of law and vice provost at Villanova University, said that while the ruling is significant, it is only one of a handful of successes among many long-running legal battles. Unless platforms repeatedly lose in court, he believes, companies will not make major changes to their existing operating models in the short term.
The impact of this Los Angeles ruling lies in its demonstration effect. Sarah Kreps, director of Cornell University’s Tech Policy Lab, noted that thousands of lawsuits targeting social media addiction are currently pending across the U.S., with hundreds in California alone. TikTok and Snap were originally defendants in this case as well, but both reached settlements before trial. With Meta and YouTube left as the remaining defendants, the outcome will directly affect negotiating leverage in thousands of future cases.
Once a bellwether case establishes legal causation between platform design and harm to teenagers, more victims are likely to file suit, forcing the tech industry to reevaluate the algorithmic logic behind the features it builds for minors.