YouTube Has Removed Accounts that Misrepresent Vaccinations
YouTube on Wednesday banned several popular accounts that spread false information about vaccines, as part of its broader campaign against medical misinformation. Videos suggesting that approved vaccines are harmful or cause chronic health issues will be removed.
To protect users from medical misinformation, YouTube already prohibits material about COVID-19 that contradicts the recommendations of local health authorities or the World Health Organization (WHO).
The ban on well-known accounts that disseminate misleading information about vaccinations takes effect Wednesday, as the company continues to broaden its rules against medical disinformation.
Under the new rules, any videos on the Google-owned site that claim authorized vaccines are hazardous or cause chronic adverse health effects will be removed.
This means videos claiming that vaccines cause autism, cancer, or infertility will be taken down.
Videos discussing vaccination policy, such as those arguing against mandates, will still be permitted under the new standards.
As part of the policy rollout, many high-profile accounts have been suspended or terminated.
Accounts belonging to Robert F. Kennedy Jr.'s Children's Health Defense, alternative medicine influencer Joseph Mercola, and vaccine skeptic and physician Sherri Tenpenny, among other prominent figures, will no longer exist on the platform.
Experts have identified these figures as partly responsible for the vaccine skepticism that has slowed efforts to inoculate Americans against COVID-19.
According to YouTube Vice President of Global Trust and Safety Matt Halprin, the platform did not take action on these well-known accounts sooner because it was focused at the time on disinformation about coronavirus vaccines specifically.
"We sought to extend what we'd begun with COVID-19 to other vaccinations as we saw misleading claims about the coronavirus vaccines spill over into disinformation about vaccines in general," he said in a statement.
YouTube, along with Facebook and Twitter, prohibited the dissemination of false material about the coronavirus at the start of the pandemic.
Despite these rules, misleading information about the danger posed by the virus and the effectiveness of vaccines has continued to thrive online.