YouTube announced that it's cracking down on medical misinformation by removing videos that violate its policies, including those recommending "harmful or ineffective" cancer treatments and "cures." The video platform is making its medical misinformation policy more robust, having already added new rules covering abortion-related misinformation last month.
YouTube will follow a framework targeting videos that promote unproven, harmful, or ineffective methods of preventing or treating health ailments, or that deny the existence of those ailments altogether. It's also taking down videos that directly contradict health authorities on topics prone to misinformation, like cancer, Covid-19, and vaccines.
“While specific medical guidance can change over time as we learn more, our goal is to ensure that when it comes to areas of well-studied scientific consensus, YouTube is not a platform for distributing information that could harm people,” YouTube shared in a blog post written by Dr. Garth Graham, Director and Global Head of Healthcare and Public Health Partnerships; and Matt Halprin, VP and Global Head of Trust and Safety.
The updated misinformation policy includes removing content that offers dangerous medical advice, discourages seeking professional care or the use of medically necessary treatment, denies the existence of well-established conditions, contradicts the guidance of local health authorities or the World Health Organization (WHO), and makes unproven treatment claims.
For example, this would include videos asserting that Type 1 diabetes is reversible through diet changes alone, without the use of monitoring or medications like insulin, as this has no scientific basis and discourages the use of medically necessary treatment.
According to the blog post, the policy will be applied when content relates to a condition that poses a high public health risk, when health authorities around the world have published guidance on it, and when the topic is generally prone to misinformation.
The policy will include some exceptions, such as videos of an educational nature, content presented in a scientific context, and documentaries. However, YouTube is adamant that even these videos cannot actively discourage viewers from seeking professional care.
"This means that we may allow content that is sufficiently in the public interest to remain on YouTube, even if it otherwise violates our policies," the YouTube blog post explained. "For example, a video of a public hearing or comments made by national political candidates on the campaign trail that disputes health authority guidance, or graphic footage from active warzones or humanitarian crises."
YouTube's battle with medical misinformation isn't new; over the past three years, the platform has repeatedly been in the spotlight for removing videos touting Covid-19 misinformation.
The platform also rolled out changes to its elections misinformation policies in June, when it announced it would “stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections.”