The tide is finally turning on social media. In the past two days, two separate juries, in New Mexico and California, have held social media companies responsible for harming children for the first time in our history. That’s right, never before has a court in the United States held social media platforms liable for the harm they cause to children. This is a historic moment.
The verdict from the Los Angeles trial is notable because this case was brought using tort law to hold social media companies liable for the mental health harm suffered by an individual plaintiff, referred to in the lawsuit as “Kaley.” The jury found Meta and YouTube liable for negligence in designing and operating an addictive product that was harmful to children and for failing to warn users of that harm. Never before has a case like this even made it to trial, let alone produced a verdict against the platforms.
For more than a decade, child, teen, and parent victims have suffered countless harms from social media, including suicide, self-harm, eating disorders, anxiety, and depression, but have been unable to get justice. If another product, such as defective toys or poisonous food, harmed children, the parents would already have had their day in court.
Not so with social media. Tech companies have hidden behind the massive immunity shield of a law called Section 230, which says online platforms are not liable for damages arising from the third-party speech they host. That’s why, for years, lawsuits filed against platforms over social media’s harm to children were dismissed outright under Section 230.
Family members of victims spoke to reporters outside Los Angeles Superior Court on March 25 in Los Angeles after a jury found Meta and YouTube negligent in a lawsuit alleging their platforms contributed to harmful behavior among young users. (Kayla Bartkowski/Los Angeles Times via Getty Images)
Not anymore. This tort case took a new legal approach, focusing solely on product design features that are addictive and harmful to children – recommendation algorithms, likes, autoplay, infinite scrolling and notifications – regardless of the content they deliver.
Their strategy paid off. The jury saw the evidence for what it is. For example, when Facebook co-founder and CEO Mark Zuckerberg took the stand at the trial, he was asked about his decision to allow beauty filters that mimicked plastic surgery on Instagram after 18 of Meta’s internal experts warned that they were harmful to teenage girls and could contribute to body dysmorphia.
He tried to brush it off, saying, “I think it’s often presumptuous to tell people they can’t express themselves that way.”
The jurors saw the internal emails and presentations that said, “the little ones are the best,” “oh my god, IG is a drug,” and “we’re basically pushers.” They could clearly see that these platforms were designed to be addictive, that these companies knowingly harmed children, and that they failed to warn users. As Mark Lanier, the lead attorney for the plaintiff, said at a news conference after the verdict: “We have sent a message that you will be held accountable for the very features that encourage addiction.”
Thousands of other cases are currently waiting in the wings to go to trial, 3,000 in the state of California alone, and with this initial plaintiff’s victory, companies will be incentivized to settle those cases rather than go to trial again. Meta, YouTube and the other platforms named in ongoing lawsuits, such as TikTok and Snapchat, should all prepare to pay up. Take the $6 million in damages awarded in this one verdict and multiply it by thousands. This is the Big Tobacco moment for Big Tech.
Big Tech’s allies and sympathizers are trying to argue that this ruling diminishes parents’ responsibility to raise healthy children. They quote the plaintiff, as FIRE Executive Vice President Nico Perrino tweeted: “Kaley says she started using YouTube at age 6 and Instagram at age 9 and told jury she was on social media ‘all day’ as a child.” He added, “Where were the parents?”
They ask the wrong question. The problem is not absent parents, but addictive products without meaningful parental controls or robust age verification. As I explain in my book, “The Tech Exit,” social media platforms actively bypass parents to reach their children. They recruit young users and, as was evident in this trial in LA, they neither enforce an effective age limit on their platforms nor require parental consent.
So the best outcome for these pending cases is not simply massive payouts to victims, but a restructuring of the way social media companies do business. One of the most significant pending lawsuits, a multi-district lawsuit by 40 attorneys general that goes to trial this summer, could do just that.
In 1998, 52 state and territory attorneys general signed the Master Settlement Agreement (MSA) with the four largest tobacco companies in the U.S. to settle dozens of lawsuits filed to recover billions of dollars in health care costs associated with the treatment of smoking-related diseases.
That agreement changed the industry forever: it banned tobacco companies from targeting young people in their advertising, banned the use of cartoons (which appeal to children) in advertising or packaging, banned payments to promote tobacco in media such as movies, TV, music and video games, provided money to the states to fund smoking prevention campaigns, and more.
As part of a potential settlement deal, attorneys general could similarly require robust age verification measures to keep out underage users, require parental consent for social media accounts, or even require the platforms to raise the minimum age for accounts from 13 to 16.
A settlement agreement could also require companies to disable certain addictive features for minors under a certain age, such as recommendation algorithms, infinite scrolling, autoplay, likes or other features. Social media doesn’t have to be addictive. This first positive ruling is significant because it suggests that the pending lawsuits in multiple districts could lead to a massive settlement like Big Tobacco that will change the social media industry forever.
CLICK HERE TO READ MORE FROM CLARE MORELL