Instagram Head Denies Social Media Is "Clinically Addictive," but Courtroom Data and Grieving Parents Push Back

The relationship between digital platforms and mental health has increasingly drawn public interest and legal scrutiny. Concerns continue to mount over the compulsive nature of modern apps, yet the companies behind these platforms often present a contrasting narrative about how their technology affects users. Recently, during a court appearance, Adam Mosseri, the head of Instagram, defended his platform against accusations that it contributes to mental health problems among minors, asserting that social media does not meet the definition of being "clinically addictive."
In his testimony in Los Angeles, Mosseri directly addressed claims that Meta, Instagram's parent company, intentionally designs features to keep children and adults engaged, potentially leading to adverse effects on their lives. The plaintiffs argue that such design choices prioritize user engagement over the well-being of young people.
Mosseri argued for a distinction between “clinical addiction” and what he termed “problematic use,” stating, “While some people spend more time on the app than they feel good about, this does not constitute a medical addiction.” He likened this behavior to binge-watching a Netflix series, suggesting that while one may feel "addicted" to a show, it does not equate to a clinical condition.
During the proceedings, the lead attorney presented data indicating that approximately 60% of users reported experiencing bullying on Meta's platforms within a single week. In response, Mosseri said he was unaware of the survey results. This exchange reinforces a troubling perception that Meta may prioritize growth over its social responsibilities, a stance that could severely damage its reputation.
The trial, which is expected to last six weeks, seeks to hold major tech firms accountable for the negative impact their platforms might have on young users. Outside the courtroom, the atmosphere differed starkly from the legal discourse. Mosseri faced a large crowd of onlookers, protestors, and concerned parents—many of whom, while not directly involved in the lawsuit, were determined to voice their concerns about social media’s ramifications on children.
One parent, Mariano Janin from London, brought a photo of his daughter, Mia, who tragically took her own life in 2021 at the age of 14. Janin traveled to Los Angeles specifically to advocate for stricter regulations on social media usage for young people, stating, “If they changed their business model, it would be different. They should protect kids. They have the technology; they have the funds.” His statements emphasize a widespread belief that companies should prioritize the safety of users, especially vulnerable populations like youth, over profits.
Online reactions also reflected skepticism towards Mosseri’s assertions. On Instagram, one user commented, “Big Tobacco said the same thing decades ago,” drawing parallels between the social media and tobacco industries—both of which rely on habitual use to drive revenue. Another user provocatively asked, “What if it’s your own kid, though? How about then Mosseri? Still not an addiction?” This sentiment was echoed by many who suspect that tech executives understand the risks better than they convey.
Reddit discussions further underscored the public’s concerns. One user noted, “Hard to take that seriously when the platforms are literally designed to keep people scrolling.” Features like infinite scrolling, often referred to as “dopamine-scrolling,” are specifically engineered to eliminate natural breaks that signal our brains to stop. Another Redditor highlighted the financial incentives, asserting that algorithms are legally required to maximize addictive potential in pursuit of shareholder profits, raising ethical questions about the business models of these companies.
Critics pointedly remarked, “It is difficult to get a man to understand something when his salary depends on his not understanding it,” suggesting that Mosseri’s position may be compromised by financial interests. Questions about his qualifications in behavioral psychology also arose, challenging his authority to make claims regarding social media’s effects on mental health.
The most poignant reactions came from those personally affected by social media’s impact on young lives. A user shared, “Tell that to the teenagers who threaten to kill themselves if they lose access to social media or their phones.” A report from the U.S. Surgeon General noted that teenagers who spend more than three hours a day on social media are twice as likely to experience mental health issues, including sadness and anxiety. For many families, the nuances of clinical definitions mean little against the real-life crises they face daily.
As parents navigate the challenges of managing their children’s screen time, mental health, and technology use, statements from tech leaders about product safety can complicate matters. When a platform’s head claims that a product is not addictive, it places the responsibility solely on families to regulate their usage of systems designed to capture their attention. This lack of transparency can leave families feeling helpless, pushing them to adopt their own safety measures, such as monitoring the content their children consume.
In a digital age where social media significantly shapes self-perception and interpersonal connections, leaders' refusal to acknowledge inherent risks may influence societal norms surrounding self-worth and dependency on technology. If executives like Mosseri continue to deny the addictive nature of their platforms, users must remain vigilant in establishing boundaries that protect their mental well-being from algorithms focused solely on maximizing engagement.