Facebook has banned (only) 3,000 accounts spreading false information about covid-19

Facebook revealed that more than 18 million users have used filters displaying their support for vaccines on their profile pictures.

Is Facebook effectively fighting misinformation about covid-19? The American giant has revealed the results of an internal analysis of its actions since the beginning of the health crisis. While it has banned only 3,000 accounts, pages, and groups, Facebook has nevertheless observed a general increase in confidence in vaccines in several countries. We take stock of the situation.

The fight against misinformation remains a major issue for online platforms. Facebook is constantly developing new features to curb the spread of false and misleading information on its platform. Often criticized for a lack of effectiveness, the American giant clearly wishes to demonstrate the opposite by publishing the results of its tenth "Community Standards Enforcement Report" on August 18.

20 million pieces of content deleted on Instagram and Facebook
More than a year after the beginning of the pandemic, Mark Zuckerberg's social network is revealing some results of its moderation policy. Facebook said that from the beginning of the health crisis until June 2021, it deleted more than 3,000 accounts, pages, and groups that violated its policy against the dissemination of misinformation about covid-19.

The number may seem paltry when compared with the amount of false information circulating online. However, Facebook points out that more than 20 million pieces of content have been removed from its platform, as well as from Instagram, for violating its covid-19 policies and protections since the global crisis began.

In December 2020, Facebook had already launched a tool to notify users who had accessed false information on its platform. "We posted warnings on more than 190 million pieces of covid-related content on Facebook that our third-party fact-checking partners classified as false, partially false, altered or missing context, collaborating with 80 fact-checking organizations in more than 60 languages around the world," the platform revealed.

More trust in vaccines
Facebook highlighted its effectiveness in changing its users' attitudes toward vaccines. While a study unveiled in March 2021 found that even small groups of users could make larger audiences doubt the efficacy of vaccines, it seems the platform's efforts have eventually paid off. Facebook noted an increase in the trust index regarding vaccines. According to a collaborative study between Facebook, Carnegie Mellon University, and the University of Maryland, vaccine acceptance has increased by 35% in France, 25% in Indonesia, and 20% in Nigeria since the beginning of the year. In the United States, distrust of vaccines dropped by a drastic 50%.

In April 2021, Facebook set up a notification system to alert users to vaccination campaigns based on their location. The company credited this tool with connecting 4 million users in the United States to information on vaccine access. Nearly 2 billion people have consulted expert sources via its "Covid-19 - Information Center" page and through pop-ups on Instagram. More than 7.6 million users worldwide have used covid-19 pro-vaccine stickers on Instagram.

Sabrina Alili
