Dawn Chmielewski, Courtney Rozen, Kaitlyn Huamani and Barbara Ortutay
Los Angeles: Meta and YouTube must pay millions in damages to a 20-year-old woman after a jury decided the tech giants designed their platforms to hook young users without concern for their well-being.
The California jury’s decision in a first-of-its-kind lawsuit could influence the outcome of thousands of similar lawsuits accusing social media companies of deliberately causing harm.
The plaintiff, known by her initials KGM, testified at trial that she became addicted to social media as a child and that this addiction exacerbated her mental health struggles. After 40 hours of deliberations, a majority of jurors agreed and awarded her $US3 million ($4.3 million) in damages.
Jurors later recommended an additional $US3 million in punitive damages after deciding the companies acted with malice, oppression or fraud in harming children with their platforms. The judge has final say over the damages.
It’s the second verdict against Meta this week, after a jury in New Mexico determined the company harms children’s mental health and safety, in violation of state law.
Meta and Google-owned YouTube issued statements disagreeing with the verdict and vowed to explore their legal options, which include appeals.
“We will continue to defend ourselves vigorously as every case is different,” Meta spokesperson Andy Stone said. “Teen mental health is profoundly complex and cannot be linked to a single app.”
Google spokesperson Jose Castañeda said the verdict misrepresented YouTube, “which is a responsibly built streaming platform, not a social media site”.
The jury found that Meta and YouTube knew the design or operation of their platforms was dangerous or was likely to be dangerous when used by a minor. They also agreed that the platforms failed to adequately warn of that danger, further contributing to the plaintiff’s harm.
“Today’s verdict is a referendum – from a jury, to an entire industry – that accountability has arrived,” the plaintiff’s lead counsel said in a statement.
Shares of Meta were up 1 per cent and Alphabet shares were up 0.2 per cent, little changed after the verdict.
The Los Angeles case focused on platform design rather than content, making it harder for the companies to avert liability.
Jurors listened to about a month of lawyers’ arguments, testimony and evidence, and they heard from KGM, or Kaley as her lawyers called her during the trial, as well as Meta leaders Mark Zuckerberg and Adam Mosseri. YouTube’s chief executive, Neal Mohan, was not called to testify.
‘Infinite scroll’ questioned
Kaley said she began using YouTube at age 6 and Instagram at age 9 and told the jury she was on social media “all day long” as a child.
Lawyers representing Kaley, led by Mark Lanier, were tasked with proving that the respective defendants’ negligence was a substantial factor in causing Kaley’s harm. They pointed to specific design features they said were designed to “hook” young users, such as the “infinite” nature of feeds that provided an endless supply of content, autoplay, and even notifications.
The jurors were told not to consider the content of the posts and videos that Kaley saw on the platforms. That’s because tech companies are shielded from legal responsibility for content posted on their sites thanks to Section 230 of the 1996 Communications Decency Act.
Meta consistently argued that Kaley had struggled with her mental health separate from her social media use, often pointing to her turbulent home life.
In a statement following closing arguments, the company also said “not one of her therapists identified social media as the cause” of her mental health issues. But the plaintiffs did not have to prove that social media caused Kaley’s struggles – only that it was a “substantial factor” in causing her harm.
YouTube focused less on Kaley’s medical records and mental health history and more on her use of YouTube and the nature of the platform.
Its lawyers argued that YouTube is not a form of social media but rather a video platform akin to television, and pointed to her declining YouTube use as she got older.
According to their data, she spent about one minute a day on average watching YouTube Shorts since its inception. YouTube Shorts, which launched in 2020, is the platform’s section of short-form, vertical videos with the “infinite scroll” feature that the plaintiffs argued was addictive.
Lawyers representing both platforms consistently pointed to the safety features and guardrails they each have available for people to monitor and customise their use.
Snap and TikTok were also defendants in the trial. Both settled with the plaintiff before it began. Terms of the agreements were not disclosed.
States tighten social media laws
Large American technology companies have faced mounting criticism in the last decade over child and teen safety. The debate has now shifted to courts and state governments. Congress has declined to pass comprehensive legislation regulating social media.
At least 20 states enacted laws last year on social media usage and children, according to the nonpartisan National Conference of State Legislatures, an organisation that tracks state laws.
The legislation includes bills that regulate the use of phones in schools and require users to verify their ages to open a social media account. NetChoice, a trade association backed by tech companies such as Meta and Google, is seeking to invalidate age verification requirements in court.
A separate social media addiction case brought by several states and school districts against technology companies is expected to go to trial this summer in federal court in Oakland, California.
Another state trial is slated to begin in Los Angeles in July, said Matthew Bergman, one of the attorneys leading the plaintiffs’ cases. It will involve Instagram, YouTube, TikTok and Snapchat.
Separately, a New Mexico jury on Tuesday found that Meta violated state law in a case brought by the state’s attorney-general, who accused the company of misleading users about the safety of Facebook, Instagram and WhatsApp and enabling child sexual exploitation on those platforms.

