YouTube announces new policies to target medical misinformation

YouTube on Tuesday announced it is creating a new framework to crack down on medical misinformation on the platform.

“In the years since we began our efforts to make YouTube a destination for high-quality health content, we’ve learned critical lessons about developing Community Guidelines in line with local and global health authority guidance on topics that pose serious real-world risks, such as misinformation on COVID-19, vaccines, reproductive health, harmful substances, and more,” a blog post from the video-sharing site read. “We’re taking what we’ve learned so far about the most effective ways to tackle medical misinformation to simplify our approach for creators, viewers, and partners.”

The platform has faced controversy in recent years over its algorithm and the way it can direct viewers to misleading and extremist content. In 2021, YouTube said it had removed more than 1 million videos containing “dangerous coronavirus information” since the beginning of the outbreak in the U.S.

YouTube said it will sort medical misinformation on the platform into three categories: “Prevention,” “Treatment” and “Denial.”

It will remove content that contradicts health authority guidance on the prevention and transmission of certain conditions, including guidance on vaccines. It will also take down content that contradicts guidance on treatments, including videos that tout unproven remedies in place of seeking care, and content that denies the existence of specific conditions, such as COVID-19, according to YouTube.

The platform said its new policies “will apply to specific health conditions, treatments, and substances where content contradicts local health authorities or the World Health Organization (WHO).”

“To determine if a condition, treatment or substance is in scope of our medical misinformation policies, we’ll evaluate whether it’s associated with a high public health risk, publicly available guidance from health authorities around the world, and whether it’s generally prone to misinformation,” the post read. 

The post also highlighted new policies targeting cancer treatment misinformation. The company said it will begin to remove “content that promotes cancer treatments proven to be harmful or ineffective, or content that discourages viewers from seeking professional medical treatment.”

“This includes content that promotes unproven treatments in place of approved care or as a guaranteed cure, and treatments that have been specifically deemed harmful by health authorities,” the post continued. “For instance, a video that claims ‘garlic cures cancer,’ or ‘take vitamin C instead of radiation therapy’ would be removed.”
