By Clare Morell
Published January 30, 2026
For over a decade, child victims and their parents have been denied the chance to seek justice for the harms they have suffered from social media's products: anxiety, depression, eating disorders, substance use disorders, suicide, sextortion and, in the starkest cases, death. Starting this week, they will finally have their day in court.
On Tuesday, the first bellwether trial against social media companies began in Los Angeles, serving as the initial test case for many more pending lawsuits. Meta, TikTok, Snap and YouTube face more than 3,000 lawsuits in California alone, along with more than 2,000 additional cases in federal court.
Startling evidence is already coming to light. One internal Meta message exchange compares Instagram to drugs and slot machines: "Oh my gosh yall IG is a drug," one employee writes. "Lol, I mean, all social media. We're basically pushers," another replies.
This is a landmark case. Never before has litigation against social media companies reached this stage, where courts, and the public, can finally see for themselves the choices these companies have made when it comes to minors.
For years, lawsuits brought against social media companies were dismissed outright because of Section 230 of the Communications Decency Act of 1996, which gives internet platforms immunity for harm caused by third-party content they host. But this new wave of cases takes a novel approach: rather than alleging that victims were harmed by the content they were exposed to, they argue that the harm was caused by the companies' own product design features.
These lawsuits do not blame bad content or "too much screen time," rejecting the idea that parents are at fault for letting kids spend too much time online. Instead, they claim the defendants engineered their platforms to be addictive and failed to warn users about their addictive potential.
The features at issue include infinite scroll, autoplay, recommendation algorithms that send minors down rabbit holes, push notifications and "likes," all of which create addictive dopamine-driven feedback loops to keep users engaged for as long as possible.
As with the massive litigation against tobacco companies in the late 1990s and opioid manufacturers more recently, the key question for the jury in this social media trial is straightforward: Did these companies negligently design and market a highly addictive product to children, and did they know, yet fail to warn users, that their products harm minors?
Critics of the social media lawsuits argue that these cases don't belong in court, claiming that it is too difficult for victims to prove that social media caused their harms, given the complex interplay of personal experience, personality and online exposure.
Similar arguments, however, were made in the past against suing Big Tobacco and opioid producers. Critics argued that people fall into addiction for all kinds of reasons, and therefore the companies weren't to blame. But we know how those massive litigations turned out: multibillion-dollar settlements from Big Tobacco and the pharmaceutical companies to hundreds of thousands of plaintiffs harmed by their products. These social media suits appear to be on the same path.
The evidence speaks for itself. Newly unsealed documents provide smoking-gun proof that Meta, Google, Snap and TikTok all purposefully designed their social media products to addict children and teens, and that youth addiction was an intentional part of their business models.
The documents include internal discussions among company employees, presentations from internal meetings and the companies' own research studies. One exhibit, an internal Meta report, states that "the lifetime value of a 13 y/o teen is roughly $270 per teen." Another Meta report says "the young ones are the best ones," explaining that young users show greater long-term retention on the company's products.
These companies put a dollar value on our children's attention to maximize profit, all while knowing their products were harming minor users. A Meta internal study on teen mental health found that "Teens can’t switch off from Instagram even if they want to" and that "Teens talk of Instagram in terms of an ‘addict’s narrative’ spending too much time indulging in a compulsive behavior that they know is negative but feel powerless to resist."
Meta and these other platforms allowed our children to be harmed and said nothing. Now the public will finally know.
Unsurprisingly, two of the four companies in this first trial, Snap and TikTok, settled before the proceedings began. These companies do not want damning internal evidence to come to light showing that they knew their products were harming children and did nothing to change the design or warn users. Nevertheless, parents and teens will finally have their day in court.
Parents are fighting back because Washington didn’t. Congress hasn’t passed a child online safety law since 1998, nearly a decade before social media even existed. Australia has recently passed a social media ban for minors under 16, and France and the UK are proposing the same. In the United States, it is parents and states who are stepping into the void to hold the social media companies accountable through the courts. If Congress won't do it, parents will.
This trial is the tobacco moment for Big Tech.