YouTube is taking a tougher stance against misinformation about COVID vaccines with updated rules

It’s been a long time coming, but YouTube announced today that it will take a stronger stance against misinformation about COVID-19, and in particular, misleading content regarding COVID vaccines, which can help fuel global anti-vaccine movements.

As YouTube explained:

“Creating a policy on medical misinformation brings with it inherent challenges and compromises. Scientific understanding develops as new research emerges, and first-hand personal experience regularly plays a powerful role in online discourse. Vaccines in particular have provoked heated debate over the years, despite consistent guidance from health authorities on their effectiveness. Today, we are expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that have been approved and confirmed to be safe and effective by local health authorities and the WHO.”

The updated policy states:

“Do not post content on YouTube if it contains harmful misinformation about currently approved and administered vaccines, including any of the following:

  • Vaccine safety – Content claiming that vaccines cause chronic side effects, outside of rare side effects that are recognized by health authorities
  • Vaccine efficacy – Content claiming that vaccines do not reduce the transmission or spread of disease
  • Ingredients in vaccines – Content that misrepresents the substances contained in vaccines”

Videos containing any of these claims will now be subject to removal from YouTube, and channels that post them will first receive a warning, then strikes. Any channel that receives three strikes within 90 days will be terminated.

To be clear, YouTube’s policies already prohibit certain types of medical misinformation, including content that promotes harmful remedies, with YouTube further noting that it has removed more than 130,000 videos for violating its COVID-19 vaccine rules over the past year.

The platform has developed its approach to this over time, amid an ongoing pandemic, but it has also been identified as a key source of medical misinformation, helping to fuel harmful movements that run contrary to the goals of health authorities.

Last year, the conspiracy-laden ‘Plandemic’ video was viewed more than 7 million times on YouTube before it was eventually removed. The video sought to amplify rumors that the National Institute of Allergy and Infectious Diseases had buried research into how vaccines can damage people’s immune systems.

Earlier this year, researchers from the universities of Oxford and Southampton found that people who seek out information on social media – especially YouTube – are less willing to be vaccinated against COVID-19, and called on governments and social media companies to take urgent action in response to the findings.

According to the report:

“Trust in health institutions and experts, along with perceived personal threat, are vital, with focus groups revealing that COVID-19 vaccine hesitancy is driven by a misunderstanding of herd immunity as providing protection, fear of rapid vaccine development and side effects, and belief that the virus is man-made and is being used to control the population. In particular, those who obtain information from relatively unregulated social media sources – such as YouTube – that offer recommendations tailored to their viewing history, and who hold general conspiratorial beliefs, are less willing to be vaccinated.”

As noted, YouTube’s tougher stance has been a long time coming, and it’s good to see the platform taking a clearer stand against demonstrably inaccurate medical information, which can have significant real-world impacts for both individuals and society at large.

But not everyone will welcome the move, and it could put YouTube on a collision course with some regional authorities.

This week, the Russian government threatened to block YouTube if the platform does not reinstate two German-language channels operated by Russia’s state-owned media outlet RT, which were removed for posting vaccine misinformation.

According to the Washington Post:

“Russia’s communications regulator, Roskomnadzor, said it had sent a letter to Google demanding that all restrictions on the YouTube channels RT DE and Der Fehlende Part, managed by the Russian media outlet Russia Today, be lifted as soon as possible, Interfax reported. The regulator threatened to fully or partially restrict YouTube in Russia, or to penalize Google, if the channels are not reinstated.”

Indeed, there are many politicians and civic leaders who support those opposing vaccine mandates, and you can bet that a major platform like YouTube hardening its stance will once again raise concerns that private companies control the media narrative, and have too much power over what can and cannot be shared.

That is a valid concern, but as YouTube notes, it is working with official health bodies, basing its rules on the guidance of both local health authorities and the World Health Organization. As such, its stance is grounded in the same guidelines that govern our broader health procedures and approaches, which helps shield it from such criticism.

Many opposing views are based on misunderstandings, and their spread is dangerous, as it risks keeping us in the grip of COVID even longer, which could have catastrophic long-term consequences. It’s therefore good to see YouTube taking a stronger stance and seeking to apply the findings of official health authorities within its moderation approach.


Naveen Kumar

