Facebook is expanding the list of false claims it will remove from its platforms related to COVID-19, COVID-19 vaccines, and vaccines in general, starting today. The company began removing debunked COVID-19 claims in December of last year and, that same month, started notifying users when they had interacted with a post containing false information. Now the list of claims that can get a post removed has grown.
Highlights from the new expanded list of false COVID-19 and vaccine-related claims that will be removed include:
Facebook says it will start enforcing this policy immediately, focusing on groups, pages, and accounts that share content from its new list of debunked claims. The company also says it will consider removing the sources of such posts entirely if they become repeat offenders.
Notably, the company says that it will only be enforcing this change during the “COVID health emergency,” so while tamping down on such claims could be a major blow to the anti-vaccine movement on Facebook, it might not last long. Even if it remains brief, it’s an important change: Facebook was a major source of vaccine misinformation even before the pandemic, and addressing it more directly could have a meaningful impact on people who might otherwise have become anti-vaxxers.
Expanding what counts as COVID-19 and vaccine misinformation is a smart move for Facebook, but some people worry about what posts might get caught in the company’s new, larger misinformation net. Studies into the effectiveness of certain masks, vaccines, and tests are still ongoing. As written, Facebook’s new guidelines might prevent conversations around new research results, as UNC professor Zeynep Tufekci notes.
“Looking at the list, Facebook may have to take down some current real news and public health statements, too. We have ongoing clinical trials with no placebo, for example. (UK heterologous prime boost trial) Also today’s reports on ChAdOx1? May need to go under these guidelines.”
What’s more, Tufekci points out, recommendations from public health agencies have changed over the course of the pandemic, meaning older posts from organizations like the World Health Organization could also be removed. The Verge has contacted Facebook with these concerns and will update if we learn more.
Beyond those policy changes, Facebook is also adjusting how factual COVID-19 information gets delivered on Facebook and Instagram. The company will feature links to vaccine information and vaccination sign-ups in its COVID-19 Information Center, and it plans to bring the feature to Instagram as well.
Facebook also says that it’s continuing to improve search on both platforms to surface more “relevant, authoritative results” when a user searches for something related to COVID-19, including ranking accounts that discourage vaccination lower in Instagram search results. Finally, Facebook is extending $120 million in ad credits to “help health ministries, NGOs, and UN agencies” spread COVID-19 vaccine information to Facebook’s billions of users.