Facebook is expanding its policy against misinformation about the Covid-19 vaccine to include children
In a rare preventive move to combat misinformation before it goes viral, Meta, the technology giant formerly known as Facebook, is partnering with the Centers for Disease Control and Prevention and the World Health Organization to remove harmful content related to the coronavirus vaccine and its effects on children. The announcement coincided with the U.S. Food and Drug Administration's authorization of the first covid-19 vaccine for children between 5 and 11 years of age.
In the coming weeks, Facebook users will begin to see reminders in their feeds that the vaccine has been approved for children, along with information on where it is available. Meta is publishing both English and Spanish versions of the reminder.
It is also expanding its anti-vaccine misinformation policy to remove false claims that specifically concern the vaccine and children. This includes misinformation about the vaccine's availability, efficacy, and scientific verification, such as claims that the covid-19 vaccine can kill or seriously harm children. Posts claiming that any treatment other than the covid-19 vaccine can inoculate children against the virus will also be removed.
“This is not an update, but part of an ongoing effort in partnership with health authorities such as the CDC and WHO and others, both in the United States and around the world,” Meta's head of health, Kang-Xing Jin, wrote in a blog post announcing the partnership on Friday. “We will continue to clarify our rules and add new claims about the COVID-19 vaccine for children that we are removing from our applications.”
The FDA authorization covers the Pfizer/BioNTech mRNA vaccine, which is estimated to be 90.7% effective at preventing covid-19 in children aged 5 to 11. The children's version of the vaccine, which comes in a smaller dose than the version available for adults, will be administered as two injections given three weeks apart. No serious vaccine-related adverse events were reported in the trials.
Over the years, Meta has often been the target of criticism for failing to curb the rapid spread of misinformation on its platform, a problem thrown into sharp relief by the covid-19 pandemic. Right-wing conspiracy theorists, anti-vaxxers, and pseudoscience fanatics have long had a stronghold on Facebook, and it wasn't long before they started flooding feeds with falsehoods about covid-19 and the effectiveness of wearing masks, later adding propaganda about the coronavirus vaccine to the list.
Amid growing critical pressure, Facebook has implemented several new policies to remove or limit the reach of harmful content. But many argue that its response was too little, too late. President Joe Biden accused the company of “killing people” by allowing false claims about the vaccine to remain unchecked on its platform. And Facebook is not done performing damage control yet: the hashtag #VaccinesKill, which would seem an obvious candidate for Facebook's ban hammer, remained active until as recently as July, when Facebook finally blocked it.
Since the start of the pandemic, Meta has removed a total of 20 million pieces of content and 3,000 accounts, pages, and groups from Facebook and Instagram for misinformation about covid-19 and the vaccine, the company said on Friday.