The Meta Oversight Board is urging the owner of Facebook and Instagram to overhaul its “cross-check” program that delays enforcing content moderation measures for politicians, celebrities and other high-profile users.
The program allows for posts that would otherwise be “quickly removed” to remain posted and potentially cause harm based on a system that appears “structured to satisfy business concerns,” the board said in a 57-page advisory published Tuesday.
The board, made up of academics, experts and civic leaders, calls for Facebook parent company Meta to make the program more transparent and prioritize expression that is important for human rights.
The recommendation followed a roughly 13-month review spawned after the program was revealed as part of a Wall Street Journal report that included disclosures from Facebook whistleblower Frances Haugen.
The cross-check system provides an additional layer of review for content on Facebook and Instagram for certain accounts, guaranteeing a human review. The content remains fully accessible on the platform before it is subject to the additional review process, known as the Early Response Secondary Review, even if a first assessment is that it violates platform rules.
Meta told the board it can take more than five days on average to reach a decision on content from users on its cross-check lists. The board found Meta has taken as long as seven months to reach a final decision on content subject to cross-check review.
The “delayed enforcement of violating content is a significant source of harm” under the program, according to the board’s recommendation.
“In sum, the Board finds that while Meta characterizes the cross-check as a program to protect vulnerable and important voices, it appears to be more directly structured and calibrated to satisfy business concerns,” the board said.
Meta launched a “general secondary review” process, fully implemented in early 2022, for content posted by any users. Meta uses an algorithm called cross-check ranker to decide what content to send for a general secondary review.
Julie Owono, executive director of Internet Without Borders and a member of the Meta Oversight Board, said the change showcases how the program is already evolving.
“The board is also making recommendations with regards to that program to help the company strike a better balance between business priorities, which are important, and human rights priorities, too,” Owono said.
The board is urging Meta to create a more transparent process — including setting out public criteria for inclusion in its cross-check lists and allowing users who meet the criteria to apply to be added.
Some categories of entities protected by cross-check, including state actors, political candidates and business partners, should also have their accounts “publicly marked,” the board recommended.
And if users included “due to their commercial importance” frequently post violating content, they should no longer “benefit from special protection,” the board said.
Meta should also measure, audit and publish key metrics about the cross-check program to determine whether it is “working effectively,” the board said.
The board also calls for Meta to “prioritize expression that is important for human rights, including expression which is of special public importance.” Posts from these users should be reviewed in a “separate workflow” to not compete with Meta’s business partners for limited resources, the board said.
“Content that comes from Afghanistan, for instance, shouldn’t take 17 days, as we’ve seen, to be reviewed by the moderation systems of Meta,” Owono said.
Additionally, the board said content identified as “high severity” violations during Meta’s first assessment should be removed or hidden while further review is taking place.
Unlike decisions on specific content or accounts, the Oversight Board’s policy recommendations are nonbinding, meaning Meta does not have to apply the changes the board is recommending.
The company does, however, have to publicly respond. Even if Meta were to say it will not or cannot implement certain recommendations, the board and general public will glean information about the limitations that are keeping the company from implementing them, Owono said.
The recommendation was also crafted after receiving input from the public, she said. The board received nearly 90 public comments on the policy advisory, including from a group of roughly a dozen Democrats led by Rep. Adam Schiff (D-Calif.) who said the program should “treat all major candidates for office and office holders equally.”
The recommendation comes as Meta is set to decide whether to reinstate the account of former President Trump, who has announced a new bid for the nation’s highest office. The company said it would reevaluate the decision in January, two years after Trump’s suspension over content found to incite violence around the Jan. 6, 2021, riot at the Capitol.
If Trump is allowed back on and placed on the cross-check list again, the board hopes Meta will take the recommendations about prioritizing human rights and potential risks into consideration, Owono said.
She also said the board’s advisory could serve the broader social media industry, beyond the platforms Meta controls. The recommendation comes as Twitter, now under the control of new CEO Elon Musk, is peeling back certain content moderation decisions, including reinstating Trump’s account.