Facebook Warning: Action Against Users Who Repeatedly Share Misinformation

Facebook will take “stronger” action against users who repeatedly share misinformation on the social media platform, Reuters reported.

In a blog post, Facebook stated that it will reduce the distribution of all posts in its News Feed from a user account if that account is found to be frequently sharing content flagged as false by one of the company’s fact-checking partners.

Facebook will also introduce ways to inform people when they are interacting with content that has been rated by a fact-checker.

Various social media platforms have been used to spread false news and claims, and the volume of such content has multiplied since the outbreak of the COVID-19 pandemic last year, with false claims and conspiracy theories circulating on platforms like Facebook and Twitter.

“Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” Facebook Inc said in a statement.

The social media giant took down 1.3 billion fake accounts between October and December 2020, ahead of a hearing by the US House Committee on Energy and Commerce into how technology platforms are tackling misinformation.
