For years, parents, teenagers, pediatricians, educators and whistleblowers have pushed the idea that social media is detrimental to young people's mental health and can lead to addiction, eating disorders, sexual exploitation and suicide.
For the first time, juries in two states took their side.
In Los Angeles on Wednesday, a jury found both Meta and Google's YouTube liable for harms to children using their services. In New Mexico, a jury determined that Meta knowingly concealed what it knew about child sexual exploitation on its platforms.
Tech watchdog groups and children's advocates cheered the jury decisions.
"The era of Big Tech invincibility is over," said Sacha Haworth, executive director of The Tech Oversight Project. "After years of gaslighting from companies like Google and Meta, new evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years."
While it's too soon to tell whether this week's outcomes will lead to fundamental changes in how social media platforms treat their young users, the dual verdicts signal a changing tide of public perception against tech companies that is likely to invite more lawsuits and regulation. For years, the companies have argued that the harms their platforms cause to children are a mere byproduct: unintentional and inevitable consequences of broader societal issues or of bad actors circumventing safeguards. They pushed back against the notion that psychological harms could result from social media use and downplayed research that showed otherwise.
Asked during his testimony in the Los Angeles trial whether people tend to use a platform or product more if it's addictive, Meta CEO Mark Zuckerberg said, "I'm not sure what to say to that. I don't think that applies here."
The verdicts show the public's growing willingness to hold the companies responsible for harms and demand meaningful changes in how they operate. What's not apparent, at least not yet, is whether the companies will take heed. Both Meta and Google said they disagree with the verdicts and are exploring legal options, including appeals.
Arturo Béjar, a former Meta engineering director who raised alarms about Instagram's harms inside the company for years, said jury trials "level the playing field" against these trillion-dollar companies. But he cautioned that it will take actual regulation to rein them in.
"One thing that I saw working inside the company that effectively led to behavior change was when an attorney general or the FTC stepped in and required things of the company," he said. "Both New Mexico and Los Angeles and all the attorneys general that are part of this process have really an extraordinary opportunity and the ability to ask for meaningful change."
While both cases focused on harms to children, there are key differences between the two. New Mexico's lawsuit was filed by state Attorney General Raúl Torrez in 2023. State investigators built their case by posing as children on social media, then documenting the sexual solicitations they received as well as Meta's response. The jury was asked to determine if Meta violated New Mexico's consumer protection law.
The Los Angeles case had a single plaintiff, who goes by the initials KGM, against Meta, Google's YouTube, TikTok and Snap. TikTok and Snap settled before trial. The plaintiff argued that the two remaining defendants, Meta and YouTube, designed their platforms' features to be addictive, especially for young users. Because thousands of families have filed similar lawsuits, KGM and a handful of other plaintiffs have been selected for bellwether trials: essentially test cases for both sides to see how their arguments play out before a jury, potentially leading to a broader settlement reminiscent of the Big Tobacco and opioid litigation.
By focusing on deliberate design choices and product liability, the lawsuits were able to sidestep Section 230, which generally exempts internet companies from liability for the material users post on their services. Past lawsuits, which focused on how the platforms distributed content, often failed on those grounds.
"For the first time, courts have held social media platforms accountable for how their product design can harm users," said Nikolas Guggenberger, an assistant professor of law at the University of Houston Law Center. "This is a new legal territory that could reshape an industry long shielded by Section 230. Platforms will have to rethink their focus on engagement at any cost, which has outlived itself."
The cases could take years to fully resolve amid appeals and settlement negotiations, but experts say the shift in public sentiment and understanding of social media's dangers is already happening. In a 2025 Pew Research Center poll, for instance, 48% of teens said social media harms people their age, up from 32% in 2022.
Amid social media's reckoning, however, artificial intelligence chatbots are emerging as the next frontier in the fight to make technology safer for young people.
"You can ban today's harm, but how do you know what tomorrow is going to bring?" said Sarah Kreps, a professor and director of Cornell University's Tech Policy Institute. Whether it's another social media app, AI or some other new technology, she added, new things will crop up.
"And people will flock to those because where there's demand you will see a supply come to meet that demand," she said.
