What's the news?
Facebook Inc said on Wednesday it would take "stronger" action against people who repeatedly share misinformation on the platform. Facebook will reduce the news feed distribution of all posts from accounts that frequently share content flagged as false by one of the company's fact-checking partners, the social media giant said in a blog post.
More to know
Facebook added that it was also launching ways to inform people when they are interacting with content that has been rated by a fact-checker. "Whether it's false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we're making sure fewer people see misinformation on our apps," the company said in a statement.