For more than a decade, child victims and their parents have been denied the opportunity to seek justice for the harms inflicted by social media products: anxiety, depression, eating disorders, substance abuse, extortion and, in the worst cases, suicide and death. Starting this week, they will finally have their day in court.
The first trial against social media companies began in Los Angeles on Tuesday, a test case for thousands of pending lawsuits. Meta, TikTok, Snap and YouTube face more than 3,000 lawsuits in California alone, along with more than 2,000 additional cases in federal court.
Surprising evidence is already coming to light. One internal Meta message exchange compares Instagram to drugs and slot machines: “Oh my god, IG is a drug.” “Lol, I mean, all social media. We’re basically pushers.”
This is a landmark case. Never before has litigation against social media companies reached this stage, where courts and the public can finally see for themselves the choices these companies have made when it comes to minors.
For years, lawsuits against social media companies were dismissed outright because of a 1996 law known as Section 230, which gives internet platforms immunity from liability for harm caused by third-party content they host. But this new wave of cases takes a different approach: instead of claiming that the harm was caused by content the victims were exposed to, plaintiffs claim it was caused by the companies’ own product design features.
These lawsuits don’t blame harmful content or “too much screen time,” rejecting the idea that parents are at fault for letting kids spend too long online. Instead, they claim that the defendants designed their platforms to be addictive and failed to warn users of that addictive potential.
The allegedly addictive features in question include infinite scrolling, autoplay, recommendation algorithms that send minors down rabbit holes, push notifications and “likes,” all of which create dopamine-driven feedback loops designed to keep users engaged for as long as possible.
Parents are fighting back because Washington didn’t. Congress hasn’t passed a law addressing children’s online safety since 1998, nearly a decade before social media even existed.
As with the massive lawsuits against tobacco companies in the late 1990s and against opioid makers more recently, the key question the jury must answer in this trial is simple: Did these companies negligently design and market a highly addictive product to children, and did they know, yet fail to warn users, that their products were harming minors?
Critics of the social media lawsuits argue that these cases do not belong in court, contending that it is too difficult for victims to prove social media caused their harm, given the complex interplay of personal experience, personality and online exposure.
However, similar arguments were once made against suing Big Tobacco and opioid manufacturers. Critics argued that people become addicted for all kinds of reasons, and that the companies were therefore not to blame. But we know how those massive lawsuits turned out: multibillion-dollar settlements paid by Big Tobacco and the pharmaceutical companies to hundreds of thousands of plaintiffs harmed by their products. These social media suits appear to be following the same path.
The evidence speaks for itself. Newly unsealed documents provide compelling evidence that Meta, Google, Snap and TikTok all purposefully designed their social media products to addict children and teens, and that youth addiction was an intentional part of their business models.
The documents include internal discussions among company employees, presentations from internal meetings and the companies’ own research. One exhibit from an internal Meta report states that “the lifetime value of a 13 year old teen is approximately $270 per teen.” Another Meta report says “the young are the best” in explaining that young users show greater long-term retention on the company’s products.
These companies reduced our children and their attention to dollar figures to maximize profits, all while knowing their products were harming underage users. An internal Meta study on teen mental health found that “Teens can’t turn off Instagram even if they want to” and that “Teens talk about Instagram in terms of an ‘addict’s story’ and spend too much time engaging in compulsive behaviors that they know are negative but feel powerless to resist.”
Meta and these other platforms allowed our children to be harmed and said nothing. Now the public will finally know.
Unsurprisingly, two of the four companies in this first trial, Snap and TikTok, settled before proceedings began. These companies don’t want damning internal evidence coming to light showing they knew their products were harmful to children and did nothing to change the design or warn users. Nevertheless, parents and teens will finally have their day in court.
Australia recently passed a ban on social media for minors under the age of 16, and France and Britain have proposed the same. In the United States, it is parents and states who are stepping into the void to hold social media companies accountable through the courts. If Congress doesn’t do it, parents will.
This trial is the tobacco moment for Big Tech.


