
Under New ‘Medical Misinformation’ Policy, YouTube Will Delete Content That Contradicts WHO Guidance
YouTube on Tuesday announced updates to its medical misinformation policy, tightening restrictions on what it described as “harmful” claims about COVID-19, vaccines and cancer treatments, but critics said the tech giant lacks the expertise to make these judgments and its plans to restrict such content could violate people’s civil rights and stifle scientific debate.
In what one critic described as a “substantial escalation” in YouTube’s “crusade against … medical misinformation,” the social media video platform on Tuesday announced updates to its medical misinformation policy, tightening restrictions on what it described as “harmful” claims about COVID-19, vaccines and cancer treatments.
According to Reclaim the Net, YouTube’s new policy is an expansion of the platform’s existing COVID-19 misinformation policy and is intended to cover what it calls “all forms of medical misinformation.”
Under the new policy, YouTube said it “will streamline dozens of our existing medical misinformation guidelines to fall under three categories — Prevention, Treatment, and Denial.”
“These policies will apply to specific health conditions, treatments, and substances where content contradicts local health authorities or the World Health Organization (WHO),” YouTube stated.