www.optimistdaily.com
A $375 million verdict that could reshape how Big Tech treats children
BY THE OPTIMIST DAILY EDITORIAL TEAM
A New Mexico jury ruled last Tuesday that Meta knowingly harmed children’s mental health, made false or misleading statements about platform safety, and engaged in trade practices the jury called “unconscionable.” The trial ran nearly seven weeks. The verdict caps the first case of its kind to reach a jury in the United States.
The jury found thousands of individual violations and set a penalty of $375 million, less than a fifth of what prosecutors sought. Meta is valued at roughly $1.5 trillion. Its stock rose five percent in after-hours trading once the verdict came in.
What the jury actually established
The dollar amount is almost beside the point for a company of Meta’s size. What the evidence established over seven weeks in court is harder to brush off.
New Mexico prosecutors built their case around Meta’s own internal documents, testimony from company executives and whistleblowers, and an undercover investigation in which state agents created social media accounts posing as children to document sexual solicitations and Meta’s response. Jurors also heard from psychiatric experts and local educators describing sextortion schemes targeting students.
The jury reviewed specific statements made by Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta’s global head of safety, Antigone Davis. It also examined the company’s failure to enforce its ban on users under 13, the role of its algorithms in pushing harmful content, and the prevalence of material about teen suicide on its platforms.
“We know the output is meant to be engagement and time spent for kids,” prosecution attorney Linda Singer told jurors. “That choice that Meta made has profound negative impacts on kids.”
Juror Linda Payton, 38, said the panel compromised on the estimated number of teenagers affected but chose the maximum $5,000 penalty per violation. She said she believed each child was worth the maximum amount.
What this does to Section 230
For 30 years, tech companies have sheltered behind Section 230 of the Communications Decency Act, which protects platforms from liability for user-posted content. New Mexico prosecutors argued that protection does not cover Meta’s own algorithmic choices: the systems the company built to decide what to show users and when. A jury agreed.
“Meta’s house of cards is beginning to fall,” said Sacha Haworth, executive director of watchdog group The Tech Oversight Project. “For years, it’s been glaringly obvious that Meta has failed to stop sexual predators from turning online interactions into real world harm.” Haworth pointed to whistleblowers, including Arturo Béjar, and unsealed documents as evidence.
Meta’s legal team pushed back. “Evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business,” Meta attorney Kevin Huff told jurors. “Meta designs its apps to help people connect with friends and family, not to try to connect predators.” A company spokesperson said Meta disagrees with the verdict and will appeal, adding: “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content.”
What changes, and when
The verdict does not force Meta to change anything yet. A second trial phase in May will determine whether the company’s platforms created a public nuisance and whether Meta should fund public programs to address the documented harms.
New Mexico’s case was among the first in a much larger legal wave. More than 40 state attorneys general have filed suits against Meta over harms to children’s mental health.
For the parents who pushed for this outcome, the number on the verdict form was not the main point. ParentsSOS, a coalition of families who have lost children to social media-related harm, called it “a watershed moment.” “We parents who have experienced the unimaginable — the death of a child because of social media harms — applaud this rare and momentous milestone in the years-long fight to hold Big Tech accountable for the dangers their products pose to our kids,” the group said. The question now is whether the May trial phase translates that accountability into something that actually reaches the platforms.