Facebook has taken considerable steps over the past few years to address the flow of misinformation on its platform. Its strategy has involved removing content that violates its policies, reducing the spread of problematic content that doesn’t violate them, and informing people with additional context about what they click, read or share. It will now also reduce the reach of groups that repeatedly share misinformation.
The company met with a small group of journalists in Menlo Park today to discuss its latest “remove, reduce and inform” updates, which are aimed at protecting the integrity of information on Facebook. To that end, it will roll out a new section on its Community Standards site where people can track the updates it makes each month.
It’s also starting a collaborative process with outside experts to find new ways to fight fake news on Facebook more quickly. Chief among today’s changes, Facebook will reduce the reach of Groups that repeatedly share misinformation, meaning their posts will appear less often in the News Feed. Facebook is also expanding the range of content that the Associated Press will review as a third-party fact-checker.
The reduction kicks in when people in a group repeatedly share content that independent fact-checkers have rated false; Facebook will then reduce the group’s overall News Feed distribution. The change goes into effect globally starting today.