YouTube has removed more than 137,000 videos because of COVID-19 vaccine misinformation since last year
YouTube is going on the offensive against the spread of dangerous misinformation surrounding approved vaccines, including those for COVID-19.
The company announced in a blog post on Wednesday that it will be removing any videos posted on its platform that spread misinformation about any approved vaccines, COVID-19 or otherwise.
YouTube says it has already removed more than 137,000 videos for COVID-19 vaccine misinformation since last year. Its guidelines state that channels that receive three strikes within 90 days will be deactivated.
“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board in the policies and products that bring high quality information to our viewers and the entire YouTube community,” the company said.
YouTube’s vaccine misinformation policy is aimed at users who post harmful misinformation related to vaccine safety, the ingredients in vaccines, and vaccine efficacy.
Among the content that YouTube says will not be tolerated are false claims that vaccines can cause side effects such as cancer, diabetes, and infertility, or alter a person’s genetic makeup.
“Working closely with health authorities, we looked to balance our commitment to an open platform with the need to remove egregious harmful content,” YouTube said.
YouTube says it became clear the company needed to expand its guidelines after it saw misinformation about COVID-19 vaccines begin spreading to vaccines in general.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company said.
YouTube similarly instituted guidelines to help curb the spread of misinformation surrounding COVID-19 after the virus began to spread across the globe in early 2020.
The company does not allow users to post COVID-19 content that spreads medical misinformation related to the treatment, prevention, diagnosis, and transmission of the virus, among other things.