Meta Allegedly Buried Internal Research Showing Its Platforms Harm Mental Health, Lawsuit Claims

SAN FRANCISCO — Internal research conducted by Meta, the parent company of Facebook and Instagram, found causal evidence that its platforms harm users' mental health, according to unredacted documents filed in a class action lawsuit brought by U.S. school districts against Meta and other social media companies. The company reportedly halted the research after discovering that users who deactivated Facebook for just one week experienced significant reductions in depression, anxiety, loneliness, and social comparison.
The internal study, known as "Project Mercury," was initiated in 2020, employing survey firm Nielsen to assess the mental health impact of deactivating Facebook and Instagram. To Meta's dismay, findings indicated that participants who stepped away from the platforms experienced an improvement in their mental well-being. However, instead of pursuing further research or making these findings public, Meta decided to cease the project, claiming that the negative results were influenced by the "existing media narrative" surrounding the company.
Privately, Meta staff remained concerned about the implications of the research. One unnamed researcher worried that burying the findings would resemble the tobacco industry's historical practice of downplaying the dangers of smoking. Staff reportedly assured Nick Clegg, then Meta's head of global public policy, that the research conclusions were valid.
Despite acknowledging the negative impacts of its own products, Meta has publicly claimed to Congress that it lacks the ability to quantify whether its platforms are harmful to teenage girls. In response to the ongoing lawsuit, a Meta spokesperson, Andy Stone, stated that the study was halted due to "flawed methodology" and emphasized that the company has been committed to enhancing user safety.
Plaintiffs allege product risk was hidden
The accusations against Meta regarding the suppression of research on social media's harms are part of a broader legal action led by the law firm Motley Rice. This lawsuit includes claims against other major platforms like Google, TikTok, and Snapchat, asserting that these companies have deliberately concealed the risks associated with their products from users, parents, and educators.
Allegations outlined by the plaintiffs suggest that these platforms not only encourage children under the age of 13 to use their services but also fail to adequately address child sexual abuse content. Furthermore, the lawsuit contends that these companies sought to expand their user base among teenagers during school hours and attempted to financially influence child advocacy organizations to publicly defend the safety of their products.
For instance, TikTok reportedly sponsored the National PTA, boasting internally about its ability to influence the organization. Documents from the lawsuit indicate that TikTok officials expressed confidence that the PTA would comply with the company's future agenda, including public announcements and press statements.
The allegations against Meta are particularly severe, highlighting several concerning internal practices: the company allegedly designed its youth safety features to be ineffective and rarely utilized, required 17 instances of attempted sex trafficking before removing users from its platform, and acknowledged that optimizing its products for greater teen engagement often resulted in exposure to harmful content. Additionally, Meta allegedly delayed internal initiatives aimed at preventing child predators from contacting minors, prioritizing growth over safety.
In a 2021 text message, Mark Zuckerberg indicated that child safety was not his primary focus, stating that he had "a number of other areas I'm more focused on, like building the metaverse." Meta’s Stone, however, refuted these allegations, asserting that the company's efforts to ensure teen safety are effective and that they actively remove accounts flagged for sex trafficking.
The original internal documents cited in the lawsuit remain sealed, and Meta is opposing a motion to unseal them. A hearing on these allegations is set for January 26 in the U.S. District Court for the Northern District of California.
This unfolding saga raises significant questions about the responsibility that tech giants hold in protecting their youngest users and the transparency needed in the research surrounding their platforms. As more details emerge from the ongoing lawsuits, both the public and policymakers will be watching closely to better understand the implications for mental health and child safety in the digital age.