Wednesday was a historic day when Mark Zuckerberg took the stand and had to answer under oath before a jury to charges that Meta knowingly designed and promoted products that addicted young users – including children – despite internal warnings about the risks. This was the first time he had testified before a jury in such a case.
While Zuckerberg’s testimony has often been characterized by sidestepping and dodging questions — to the point where the judge ordered him to answer directly — he can’t back away from this. The evidence in this social media trial speaks for itself.
The plaintiff’s attorney, Mark Lanier, focused his questioning on three central themes: 1) addicting users; 2) allowing underage users onto the platform; and 3) making business decisions that prioritized profit over safety.
Lanier presented a 2015 email in which Zuckerberg stated that his goal for 2016 was to increase the time users spent on the platform by 12%. Zuckerberg argued that Meta’s growth targets reflect a goal of giving users something useful, not of addicting them, and said the company is not trying to attract children as users.
When asked whether he believes people tend to use a product more when it is addictive, he dismissed the premise. “I don’t think that applies here,” he said.
But it absolutely applies. Meta’s entire business model is based on user engagement. Social media may seem “free,” but a child’s time, attention, and data are the product being sold. More hours with eyes on the screen means more advertisements to be sold. The user is the product. The incentive is to keep users involved as much as possible.
Earlier in the trial, addiction expert Dr. Anna Lembke of Stanford University testified that social media use can meet the clinical criteria for addiction.
Lanier also questioned Zuckerberg at length about Meta’s age verification policy. He showed an internal Meta email from 2015 estimating that four million children under the age of thirteen were using Instagram – about 30% of American children aged ten to twelve. Nearly one in three children in that age group.
Zuckerberg said the company removes identified underage users and includes age requirements in the terms presented during the sign-up process. Lanier responded, “Do you expect a nine-year-old to read all the fine print? Is that your basis for swearing under oath that children under the age of thirteen are not allowed?”
Zuckerberg added that some children “lie about their age to use the services.” During this conversation, he also said, “I don’t understand why this is so complicated… we have rules, and people broadly understand that.”
Waving his hand and saying “we have rules” is not an adequate defense. These are minors. It is the company’s responsibility to ensure that the platform is effectively age-restricted; otherwise the stated age policy is meaningless.
In practice, age verification on most social media platforms relies largely on self-reported dates of birth. A child can enter a false age, click to accept the terms and conditions, and gain access in minutes. Critics argue that without meaningful safeguards, age restrictions are little more than an honor system.
Age of access is a central issue in this trial. The plaintiff, KGM, who joined Instagram at age 9, claims that her social media use as a child and teenager led to body dysmorphia, suicidal ideation, anxiety, addiction, and depression. The age at which she began using the app – during a period of significant brain development between ages 10 and 12 – is central to the harm she claims.
Instagram should never have allowed her on the platform at the age of nine, the plaintiff’s attorney argues. Whether the jury ultimately agrees remains to be seen, but the case places the responsibility for those decisions squarely on Meta’s leadership.
Lanier ended his questioning by — with the help of six others — rolling out a 50-foot collage of every selfie KGM posted to Instagram, many with beauty filters. He asked Zuckerberg if Meta had ever investigated her story about unhealthy behavior. Zuckerberg did not respond.
Meta CEO Mark Zuckerberg on the stand of the Los Angeles Superior Court in Los Angeles, California, USA, February 18, 2026. Zuckerberg faces a jury in the landmark trial alleging that social media platforms intentionally addict and harm children. (Mona Edwards)
Previously, Lanier pressed Zuckerberg over his decision to allow beauty filters that mimic plastic surgery after 18 internal experts warned that they are harmful to teenage girls and could contribute to body dysmorphia, internal documents show. Zuckerberg and Adam Mosseri, head of Instagram, eventually reversed a temporary ban and allowed the filters back on the platform. Plaintiffs allege that this decision exposed vulnerable young users to harms related to body dysmorphia and other mental health issues.
Zuckerberg defended the decision, saying that Instagram did not create its own filters or recommend them to users after the ban was lifted. He added: “I think it’s often an exaggeration to tell people they can’t express themselves that way.”
What a twist. Removing plastic surgery filters that harm young girls is, in his words, “an exaggeration.” Many parents would call it taking reasonable precautions.
While Zuckerberg has publicly said Meta cares about children’s safety — telling Congress in 2024 that “our job is to make sure we develop tools to keep people safe” and that “we stand with parents everywhere who are working hard to raise their children” — the internal evidence presented at trial suggests otherwise.
Although he would not admit in court that he knew his products were addictive or targeted teenagers, he did not have to. The jury – and the public – can weigh his answers against the internal documents and decide for themselves.


