Meta and Google Just Lost a Landmark Social Media Addiction Lawsuit—Find Out What This Means for Your Kids!

A California jury delivered a landmark verdict on March 25, finding Meta and Google liable for the mental health harms suffered by a young woman who began compulsively using social media as a child. The jury awarded her $6 million, setting a significant legal precedent for the tech industry. The decision reflects growing concern about social media's impact on youth mental health and about tech companies' responsibility to safeguard vulnerable users.
The jury determined that the companies should pay $3 million in compensatory damages and $3 million in punitive damages, with Meta, the parent company of Instagram and Facebook, responsible for 70% of the total amount. While the financial award may seem modest compared to the trillions in market value held by these tech giants, legal experts view it as a watershed moment. This verdict is the first instance of a jury declaring social media apps as “defective products” specifically designed to exploit the developing brains of minors.
In this case, the plaintiff—identified as KGM or “Kaley”—testified about her experiences, stating that she began using YouTube at the age of six and Instagram at 11. She described how her compulsive use of these platforms led to body dysmorphia and depression, often retreating to school bathrooms to check her “likes.” Lead attorney Mark Lanier argued that the companies engineered their platforms with features like infinite scroll and constant notifications, effectively creating a “digital casino.” Lanier highlighted this point by displaying a 35-foot collage of selfies Kaley posted to Instagram while under the platform's minimum age requirement of 13. “How do you make a child never put down the phone? That’s called the engineering of addiction,” he stated during the trial.
The trial served as a crucial bellwether for about 2,000 similar lawsuits filed by parents and school districts, drawing parallels to the legal battles against Big Tobacco in the 1990s that ultimately forced the industry to change its practices regarding marketing to minors. Joseph VanZandt, co-lead attorney for the plaintiff, asserted, “Today’s verdict is a referendum—from a jury, to an entire industry—that accountability has arrived.”
In defense of their platforms, Meta CEO Mark Zuckerberg testified that the company maintained a strong safety record. However, internal documents revealed discussions among executives about the necessity to “win big with teens” and attract “tweens.” One memo indicated that 11-year-olds were four times more likely to return to Instagram than to competing apps. Google also contested the allegations, with spokesperson José Castañeda describing YouTube as a “responsibly built streaming platform” rather than a social media site.
Both companies have announced plans to appeal the verdict. Throughout the trial, they contended that there is no scientific evidence establishing a causal link between social media use and mental health harms, suggesting instead that Kaley's distress stemmed primarily from her difficult home life. This argument reflects a long-standing defense strategy among tech companies, which have typically been shielded by Section 230 of the Communications Decency Act from liability for content posted by users. This ruling, however, treats the apps themselves as potentially defectively designed products, opening a new avenue for legal accountability over their influence on youth mental health.
Adding to Meta's legal troubles, a New Mexico jury recently ordered the company to pay $375 million for failing to protect children from predators on its platforms. This case underscored the company’s misleading claims about safety and violations of state protection laws.
As the implications of this verdict unfold, it is clear that the conversation around the responsibilities of tech firms in shaping the mental health landscape of younger generations is intensifying. “We wanted them to feel it,” remarked Victoria, one of the jurors, expressing the collective sentiment that such practices are unacceptable.