Google’s AI Overviews Cite YouTube More Than Any Medical Website for Health Queries, Study Finds

In a striking revelation about the reliability of health information surfaced by Google’s AI-generated search summaries, a recent study found that the platform's AI Overviews cite YouTube more often than any medical website. This finding raises significant concerns about the quality of health information available to millions of users worldwide.
According to research conducted by SE Ranking, a search engine optimization platform, YouTube accounted for 4.43% of all citations in AI Overviews for health-related queries. This analysis was based on results from more than 50,000 health queries collected from Google searches in Germany, a country known for its stringent healthcare regulations. In stark contrast, traditional medical sources such as hospital networks and government health portals were far less frequently cited.
Google maintains that its AI Overviews are designed to highlight high-quality content from reputable sources, including organizations like the Centers for Disease Control and Prevention and the Mayo Clinic. However, the study's findings indicate that YouTube, a platform where anyone can upload videos—including wellness influencers and creators lacking formal medical training—was the most cited source, with 20,621 citations out of a total of 465,823.
Researchers pointed out the potential dangers inherent in this trend, stating, “This matters because YouTube is not a medical publisher.” They emphasized that the platform's general-purpose nature could expose users to misleading health information. This concern is underscored by a previous investigation by the Guardian, which found that some AI Overviews responses contained false or misleading health information, potentially putting individuals at risk of serious health issues.
In one alarming case, Google provided erroneous information about crucial liver function tests, which could have misled individuals with serious liver disease into believing they were healthy. Following such findings, Google removed AI Overviews for certain medical searches, although this change was not applied uniformly across all queries.
The SE Ranking study specifically analyzed the sources referenced by AI Overviews in December 2025, focusing on German-language health queries. The results indicated that AI Overviews were present in over 82% of all health-related searches. Beyond YouTube, the next most cited sources included NDR.de, a German public broadcaster, with 14,158 citations, and Msdmanuals.com, a medical reference site, with 9,711 citations.
Despite the study's limitations—such as its focus on a single language and country—the implications of its findings raise alarms about the structural risks posed by AI Overviews. Hannah van Kolfschooten, a researcher at the University of Basel, commented on the significance of the study, noting, “This study provides empirical evidence that the risks posed by AI Overviews for health are structural, not anecdotal.” She highlighted that the heavy reliance on YouTube suggests that popularity rather than medical credibility drives health information visibility on Google.
In response to the findings, a Google spokesperson stated that the report's data does not support claims of unreliability, noting that the majority (96%) of the most cited YouTube videos were from medical channels. However, the researchers cautioned that this small subset does not represent the broader landscape of citations, as it comprises less than 1% of all YouTube links used by AI Overviews.
This situation raises important questions about how AI is shaping our access to health information. As reliance on AI tools like Google’s AI Overviews continues to grow, it becomes imperative to scrutinize the sources they prioritize. If these platforms primarily cite content from general-purpose sites like YouTube instead of established medical institutions, users may be unwittingly exposed to unverified and potentially harmful health advice.