Facebook might stop removing so much COVID-19 misinformation

Meta is asking the Facebook Oversight Board to make the call. Is "free expression" more important than public health?


Facebook’s large-scale operation to curb COVID-19 misinformation may soon be coming to an end. Meta today published a blog post asking the Oversight Board to weigh in on whether removing all COVID-19 misinformation posts is still necessary, proposing instead to label and demote this content rather than delete it.

Before jumping into the meat of Meta’s request, president of global affairs Nick Clegg takes a moment to pat his company on its proverbial back. “Meta has removed COVID-19 misinformation on an unprecedented scale,” he writes. “Globally, more than 25 million pieces of content have been removed since the start of the pandemic.”

Given the program’s unprecedented success, Clegg wonders whether it may be time to bring it to an end. Meta is “fundamentally committed to free expression,” he writes, and he wonders whether the COVID-19 misinformation removal policy is hurting that mission. But Clegg doesn’t want to make that call himself; he hopes the Oversight Board can do that dirty work for him.

To remove or not to remove? — Posturing aside, the focal point of Clegg’s post can be summed up in one question: Should Facebook stop removing COVID-19 misinformation? Clegg says that the question is being raised now because the pandemic has “evolved,” with life returning to some normalcy in countries with high vaccination rates.

Clegg cites Facebook’s dedication to “free expression” as the counterargument to continuing to remove all COVID-19 misinformation. This comes as no surprise, given that Facebook leans on this excuse any time it’s called out for how easy it is to spread misinformation on the platform.

In most cases of misinformation, Facebook simply labels the post with a link to fact-checking information. Clegg thinks taking that route with COVID-19 misinformation could make sense now, too.

The misinfo hasn’t become safer — While Clegg’s point about the pandemic changing over the last two years is valid on its own, it doesn’t do much to justify relaxing enforcement. The pandemic has changed, but misinformation about COVID-19 is no less dangerous than it was in March 2020.

“Resolving the inherent tensions between free expression and safety isn’t easy,” Clegg writes, “especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic.” (Free expression is in Meta’s best interest, of course, as it allows for more engagement.)

The Oversight Board, as Clegg points out, is meant to provide “independent judgment” to counteract Facebook’s own biases. Thus far it hasn’t proven itself capable of doing so. If the Oversight Board does help Meta make a decision here, it will be a telling one.