Facebook is making a major change to the way it handles the removal of content from Pages and Groups on the world’s largest social network. It already removes content found to violate its Community Standards or rated as false by a third-party fact-checking service. The new changes are designed to better prevent people whose Pages have been removed from using duplicate Pages to spread the same content.
A couple of other changes have been made with regard to content quality. A new Page Quality tab will now be visible to people who manage Pages, helping them understand how the Pages they manage comply with Facebook’s guidelines. The tab has two sections: one details content recently removed for violating Community Standards, and the other shows content from the Page that was recently rated “False,” “Mixture” or “False Headline” by third-party fact-checkers.
Initially, the tab will include content removed under policies covering graphic violence, hate speech, harassment, bullying, regulated goods, nudity or sexual activity, and support or praise of people and events that are not allowed on Facebook. Facebook does note that the tab won’t provide a comprehensive accounting of all policy violations, so for now it won’t show removals for things like IP violations, spam, and clickbait.
Facebook has long prevented people from creating Pages, Groups, events or accounts similar to ones previously removed for violating Community Standards. People have been working around this restriction by repurposing existing Pages they already manage to serve the same function as the one that was removed.
To plug this loophole, Facebook will now proactively remove other Pages and Groups after it removes one for violations, even if those Pages or Groups have not individually met the threshold to be unpublished. It will enforce this policy by checking whether a Page is administered by the same people as, or has a similar name to, the one being removed.