Meta asks Oversight Board if it should stop removing COVID misinformation
Meta, the parent company of Facebook, asked the board tasked with overseeing policy debates on the platform to weigh in on whether it should ease its COVID misinformation policy.
The company asked the Oversight Board, a body funded by Meta that operates independently, whether its policy, which includes removing posts containing false claims, is still appropriate now that the pandemic “has evolved,” Meta’s President of Global Affairs Nick Clegg wrote Tuesday in a blog post.
Facebook is asking the board to issue an advisory opinion on whether the current measures are appropriate, or if the company should address misinformation through other means such as labeling or demoting content.
Facebook broadened its misinformation policies in January 2020 to include removing posts with misinformation related to the COVID-19 pandemic. The company expanded the policy to remove false claims about the coronavirus vaccine in late 2020, as vaccines became available.
Before the updates, Facebook removed misinformation only when local partners with “relevant expertise” flagged to the company a particular post that could contribute to a risk of imminent physical harm.
The company is aiming to create a policy that addresses concerns globally, since vaccination rates and the return to normal life vary from country to country.
Clegg said the change led Meta to remove COVID-19 misinformation on an “unprecedented scale.” He said the company has removed more than 25 million pieces of content under the policy since the start of the pandemic.
As Meta considers easing its COVID-19 misinformation policies, the company has faced criticism from advocacy groups and Democrats who say it has not done enough to combat misinformation about the virus or vaccines.