In an attempt to curb the spread of misinformation, Facebook will provide context for fact-checked content, expand penalties for users who share misinformation, and launch redesigned notifications for posts that have been debunked.
The platform is introducing new ways to inform people when they interact with content that fact-checkers have rated as misinformation. It will also take action against people who repeatedly share misinformation on Facebook.
False or misleading content about COVID-19 and vaccines, climate change, elections, and other topics will be covered, Facebook said in its blog post.
More context for pages that repeatedly share false claims
Users will see a pop-up when they go to like a Page that Facebook has found to repeatedly share content rated false by fact-checkers. They can click to learn more, including a link to further information about Facebook's fact-checking program.
Expanding penalties for individual Facebook accounts
The platform is expanding penalties against accounts that repeatedly share misinformation. It will reduce the distribution of all posts in News Feed from an individual's Facebook account if that person repeatedly shares content that has been rated false by one of Facebook's fact-checking partners.
Redesigned notifications when people share fact-checked content
Facebook currently notifies people when they share content that a fact-checker later rates as false. The platform has now redesigned these notifications to make it easier to understand when this happens. Each notification includes the fact-checker's article debunking the claim, along with a prompt to share that article with their followers. It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed, so other people are less likely to see them.