Facebook announced this week that it will reduce the reach of anti-vaccine content on the world’s largest social network. Such content can no longer be promoted through recommendations or advertisements and will be made less prominent in search results. The company is stopping short of taking down anti-vaccine posts entirely.
Social media platforms like Facebook and YouTube have drawn criticism lately for allowing vaccine misinformation to spread. Facebook said last month that it was looking into ways to address the issue.
While it won’t remove anti-vaccine posts outright, Facebook says it’s exploring ways to give users more context about vaccines from “expert organizations.” Monika Bickert, Facebook’s vice president of global policy management, wrote in a blog post that the social network will no longer accept ads containing false information about vaccinations. It has also removed targeting categories such as “vaccine controversies” from its advertising tools.
The social network will also lower the ranking of groups and pages that spread vaccine misinformation in News Feed and in search results. The effort extends to Instagram, which Facebook owns, where such content will no longer be recommended on the Explore page.