For years, social media companies have insisted their platforms are neutral tools. What users do with them, they argued, is not the companies’ responsibility. On 25 March 2026, a jury in Los Angeles disagreed, and the implications stretch far beyond one courtroom.
A Los Angeles County Superior Court jury found Meta and YouTube negligent in the design and operation of their social media platforms, ruling that their negligence was a substantial factor in causing harm to the plaintiff, a 20-year-old woman identified only by her initials KGM. The jury awarded $3 million in compensatory damages, placing 70% of the liability on Meta and 30% on YouTube. Jurors later awarded an additional $3 million in punitive damages, bringing the total judgment to $6 million.
The money is almost beside the point. For companies each worth trillions of dollars, $6 million is negligible. What matters is what the verdict represents.
What The Case Was Actually About
This was not a case about harmful content. It was about harmful design.
For the first time, an American jury was asked to decide whether a platform's design itself can give rise to product liability: not because of what users post, but because of how the platform is built.
KGM testified that she began using YouTube at age six and created an Instagram account at age nine. She allegedly developed compulsive use patterns, including up to 16 hours in a single day on Instagram. The platforms’ design features, she argued, contributed to her anxiety, depression, body dysmorphia, and suicidal ideation.
Her lawyers argued that the platforms' specific design choices (infinite scroll, autoplay, algorithmic recommendation engines, and deliberately unpredictable rewards) were not accidents. They were engineering decisions made with full knowledge of their psychological effects on young users.
Internal documents from Meta shown to the jury included one in which CEO Mark Zuckerberg and other executives described efforts to attract and keep children and teenagers on the platform. One document read: “If we wanna win big with teens, we must bring them in as tweens.” Another internal memo showed that 11-year-olds were four times as likely to keep coming back to Instagram compared with competing apps.
The jury found that this internal awareness constituted the kind of corporate knowledge that supports liability.
Why This Is Being Called Big Tech’s Big Tobacco Moment
“This verdict is significant because it ruled that social media apps are addictive and deliberately designed that way,” Rosalind Gill, a professor at Goldsmiths University in London, told Al Jazeera. “Many experts are already calling the judgement Big Tech’s ‘big tobacco moment’ — the moment at which the tobacco industry had to accept not only that their product was harmful, but also that they had known this and had tried to cover it up.”
The tobacco comparison is not casual. In the 1990s, a wave of US lawsuits forced the tobacco industry to pay billions in damages, fund public health campaigns, and fundamentally change how it marketed its products. The legal strategy in this case was modelled directly on that playbook.
The verdict could influence the outcome of more than 2,000 other pending lawsuits against social media companies. KGM’s case was a bellwether trial, a test case chosen specifically to gauge how juries respond and set a legal precedent for the thousands that follow. Both TikTok and Snap settled before the trial began. Meta and Google have announced they plan to appeal.
What It Means For Africa
The verdict happened in Los Angeles. The platforms operate everywhere.
Nigeria has one of the highest rates of social media usage on the continent, with young people making up the overwhelming majority of users. The same infinite scroll, the same autoplay, the same algorithmic systems designed to maximise engagement at the expense of wellbeing are running on Nigerian phones, in Nigerian schools, affecting Nigerian teenagers.
Africa currently has no equivalent legal framework to bring this kind of case. Section 230, the US law that historically shielded social media companies from liability, does not apply here, but neither do the consumer protection mechanisms that made this lawsuit possible. African regulators have been largely absent from the conversation about platform design accountability.
That may change. Legal experts expect the decision to generate a powerful domino effect across jurisdictions worldwide. As this verdict sets precedent in the US and potentially triggers similar cases in Europe, pressure on African governments to develop their own regulatory frameworks for platform design will grow.
For now, the platforms will continue operating exactly as they were designed to. The jury's verdict does not change the algorithm on your phone today. But it establishes something that did not exist before Wednesday: the legal principle that how a platform is built is a product decision, and product decisions carry consequences.
Every Nigerian teenager on Instagram right now is using a platform a jury just found was engineered to keep them there.