Health & Wellness

YouTube Bans All Content That Spreads Vaccine Misinformation

October 1, 2021


YouTube is cracking down on the spread of misinformation by banning misleading and inaccurate content about vaccines.

The platform announced the change in a blog post Wednesday, explaining that its current community guidelines, which already prohibit the sharing of medical misinformation, have been extended to cover “currently administered” vaccines that have been proven safe by the World Health Organization and other health officials.

The site had previously banned content containing false claims about COVID-19 vaccines under its COVID-19 misinformation policy. The change extends that policy to a far wider number of vaccines. “We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company said.

YouTube said it now bans videos claiming that vaccines are unsafe or ineffective, or that they cause other health issues such as cancer and infertility. In its announcement, the company pointed specifically to videos that inaccurately describe the ingredients used in vaccines, as well as allegations that vaccines contain properties that can be used to “track” those who receive them.

There are some exceptions: Users are still allowed to share content related to their personal experiences with the vaccine, but only if those videos adhere to the site’s community guidelines and the channel in question doesn’t routinely encourage “vaccine hesitancy.”

The new policy goes into effect immediately, and YouTube has already removed pages known for sharing anti-vaccination sentiments, including those belonging to prominent vaccine opponents Joseph Mercola, Erin Elizabeth, Sherri Tenpenny and Robert F. Kennedy Jr.’s Children’s Health Defense organization, CNBC reported. But the company, which is owned by Google, warned that more widespread removal of videos may take some time as it works to enforce the policy.

As big tech companies such as YouTube and Facebook have tightened their restrictions on vaccine misinformation over the last year, many conspiracy theorists have migrated to other, less-regulated platforms. Rumble, another video-sharing site, has become a popular choice for far-right groups and others who are vaccine-resistant, Slate reported in March.

But many conservative pages that spread vaccine misinformation are still active on YouTube, and their videos continue to attract millions of views.


Photo Credit: d8nn / Shutterstock.com

Written by: Vipology Staff Writer
