Meta, the parent company of Facebook and Instagram, is updating its controversial cross-check program that uses different content moderation measures for certain politicians, celebrities and other high-profile users, the company said Friday.
The tweaks, which include refining the criteria for the program and aiming to reduce delays in content review, are part of the company’s response to recommendations from the Meta Oversight Board.
Meta’s President of Global Affairs Nick Clegg said in an updated blog post that the company will make cross-check “more transparent through regular reporting and fine-tune our criteria for inclusion to the list to better account for human rights interests and equity.”
“We will also change cross-check’s operational systems to help reduce the backlog of review requests and decrease the time it takes to review cases,” he wrote.
The Meta Oversight Board said in a Twitter thread that “several aspects of Meta’s response haven’t gone as far as we recommended to achieve a more transparent and equitable system,” and that the board will “continue to react to Meta’s specific responses in the days and weeks to come.”
The oversight board, which operates independently from Meta, is made up of a panel of global academics, experts and civic leaders. It is funded through an independent trust established by Meta.
In December, the oversight board asked Meta to overhaul the program, following a roughly 13-month review prompted by the revelation of the cross-check program in a Wall Street Journal report that included disclosures from Facebook whistleblower Frances Haugen.
Unlike the board’s decisions about whether specific pieces of content should remain online or be removed, its policy recommendations are non-binding, meaning Meta can choose whether or not to implement those decisions.
In the case of cross-check, Meta said it would fully implement some recommendations, such as improving workflow and reducing any backlog of content under review, while implementing others only partially.
For example, the board recommended that Meta establish clear and public criteria distinguishing users who merit additional protection from a human rights perspective from those included for business reasons. Meta said it will refine its criteria to draw those distinctions, but it does not currently plan to publish the specific criteria that determine whether a user qualifies, arguing that doing so could make the system vulnerable to manipulation.