
Facebook intensifies vaccine misinformation crackdown


Facebook is rolling out stricter policies on vaccine misinformation

The social media giant, now owned by newly renamed parent company Meta, announced it is stepping up its crackdown on vaccine misinformation following a recent approval by the FDA.

The U.S. Food and Drug Administration approved the Pfizer vaccine for children ages 5 to 11 last week, opening the shots to a new age group and ending an anxious wait for families of children in that bracket.

With the new approval in place, social media companies are gearing up to deal with posts containing misinformation about the shot.

The platform first restricted COVID-19 vaccine misinformation in late 2020 but did not have policies specific to children.

The big changes to misinformation policies:

Meta says in a new blog post that it’s partnering with the Centers for Disease Control and Prevention and the World Health Organization to take down harmful content related to children and the COVID-19 vaccine.

This includes any posts that imply the COVID-19 vaccine is unsafe, untested, or ineffective for children.

Additionally, Meta will provide in-feed reminders in English and Spanish that the vaccine has been approved for kids, and will also provide information about where it’s available.

Facebook’s parent company revealed that it has taken down a total of 20 million pieces of COVID-19 and vaccine misinformation from both Facebook and Instagram since the start of the pandemic.
