Two juries. Two states. $381 million in damages — and 2,000 cases still loading
By Kehinde Adegoke | International Agencies
Los Angeles: In a pair of seismic courtroom verdicts that experts are already comparing to the collapse of Big Tobacco, Meta and YouTube have been found liable for knowingly designing addictive platforms that harmed children — delivering what may be the most consequential legal blow Silicon Valley has ever faced.
A jury in Los Angeles found both Meta and YouTube liable in a first-of-its-kind lawsuit that aimed to hold social media platforms responsible for harm to children, awarding the plaintiff a total of $6 million in damages — $3 million compensatory and $3 million punitive, with Meta bearing 70 per cent of the liability. The verdict landed just one day after a separate but equally damning decision in New Mexico.
Two Verdicts. Two States. One Reckoning.
In New Mexico, a jury found that Meta’s social media platforms harm children’s mental health and violate state consumer protection laws, ordering the company to pay $375 million in civil penalties, the first trial victory by an American state against a major tech company accused of harming young people. The jury imposed the maximum penalty available under the statute: $5,000 per violation.
New Mexico Attorney General Raúl Torrez was unsparing in his assessment. “The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” he said, adding that his office would seek additional financial penalties and court-mandated changes to Meta’s platforms.
The Girl At The Centre Of It All
At the heart of the Los Angeles trial is a 20-year-old woman identified as KGM — known as Kaley during proceedings. She says she began using YouTube at age 6 and Instagram at age 11, telling the jury she was on social media “all day long” as a child.
KGM testified that her nearly nonstop use of social media caused or contributed to depression, anxiety, and body dysmorphia. “It really affected my self-worth,” she told the court.
Her lawyers, led by attorney Mark Lanier, pointed to specific design features they said were built to hook young users — including infinite scroll feeds, autoplay functions, and notification systems engineered to maximise time on the platform. Lanier argued the platforms deployed a deliberate “Trojan horse” strategy, drawing children in with appealing content before trapping them in addictive loops.
Nine Days. 44 Hours. One Verdict.
Deliberations in Los Angeles wrapped up after nearly 44 hours across nine days. The jury, at one point, told Judge Carolyn B. Kuhl that it was having difficulty reaching a consensus on one defendant before ultimately finding both Meta and YouTube liable on all counts.
Notably, the verdict was not unanimous. Only nine of the 12 jurors had to agree on each claim, and the same two jurors consistently dissented from the other ten on liability. The ten-juror majority held firm on every count.
The California jury concluded that Meta and Google should pay KGM $3 million in compensatory damages, with Meta on the hook for 70 per cent of that amount. The jury further found that both companies’ conduct warranted punitive damages, awarding an additional $3 million, bringing the total to $6 million. The trial featured testimony from Meta executives, former employees turned whistleblowers, and details from an undercover investigation that led to three arrests.
Zuckerberg In The Dock
The trial carried a remarkable visual dimension. Meta CEO Mark Zuckerberg appeared in person at the downtown Los Angeles courthouse to defend the company. A recording of his deposition was also played for jurors during the New Mexico proceedings.
Meta argued throughout both cases that the plaintiff’s mental health struggles predated or were unconnected to social media use, citing her home life and the fact that none of her therapists had identified social media as the direct cause of her difficulties.
The jury disagreed.
Big Tobacco Moment For Big Tech
Social media companies have historically been shielded by Section 230 of the Communications Decency Act — a provision that protects internet platforms from liability for user-generated content. The Los Angeles case was the first civil action to reach a jury verdict holding platforms accountable for alleged addiction and mental health harm, by targeting the platforms’ own design choices rather than the content users post.
James Steyer, founder and CEO of Common Sense Media, delivered a damning post-verdict assessment — the companies, he said, buried their own internal research showing children were being harmed, and used young people as unwitting subjects in what he described as massive, uncontrolled, and wildly profitable experiments.
Experts have characterised the trials as social media’s “Big Tobacco moment” — drawing direct comparisons to the 1990s, when tobacco companies were forced to pay billions for concealing the dangers of their products.
The Los Angeles trial was the first bellwether case in a consolidated group of approximately 2,000 pending lawsuits brought against Meta and others by more than 1,600 plaintiffs, including over 350 families and over 250 school districts. TikTok and Snap, originally named as defendants in this specific case, reached undisclosed settlements before the trial began — but remain defendants in a series of similar lawsuits expected to go to trial later this year.
What Comes Next
The verdicts are not final. Meta and Google have both stated they plan to appeal. But the legal and financial exposure now facing Silicon Valley is without precedent.
With appeals pending in both states and roughly 2,000 cases queued behind this bellwether verdict, the $381 million combined total across just two concluded cases is almost certainly the floor, not the ceiling.
The Nigerian Angle
For Nigeria, the implications of these U.S. jury decisions are profound. With 13 million TikTok users and millions more on Instagram and YouTube, Nigerian children are exposed daily to the same addictive design features — infinite scroll, autoplay, and dopamine‑driven notifications — that two American juries have now found were deliberately engineered to hook young users, at the cost of their mental health.
Unlike the U.S., Nigeria has no equivalent legal framework to hold global tech platforms accountable for child safety. Regulators such as the Nigerian Communications Commission (NCC) and the National Information Technology Development Agency (NITDA) have yet to establish binding rules on algorithmic design, platform liability, or children’s digital rights.
This gap leaves Nigerian families vulnerable. Parents face the same struggles with screen addiction, anxiety, and body image issues among children, but without the legal recourse now opening up in the U.S. The verdicts, therefore, serve as a wake‑up call: if Silicon Valley can be held liable abroad, why not here?
Experts warn that Nigeria risks becoming a regulatory laggard unless policymakers act. The trials present an opportunity for lawmakers to begin drafting child‑focused digital safety legislation, aligning with global momentum to rein in Big Tech.
Editor’s Note: An earlier version of this report, published on March 25, 2026, stated total damages of $3 million; that figure covered compensatory damages only. The correct total, inclusive of punitive damages, is $6 million. The plaintiff’s age when she began using Instagram has also been corrected from 9 to 11. This report has been further updated to reflect that the Los Angeles jury verdict was not unanimous: nine of 12 jurors agreed on liability. TheDiggerNews.com is committed to accuracy and transparency in all reporting.
Meta and Google have stated they plan to appeal both verdicts. TheDiggerNews.com will continue to follow developments as proceedings advance.