Facebook is asking the quasi-independent Oversight Board for guidance regarding the platform’s “cross-check” content moderation system for high-profile users after a recent report claimed the system lets some of those users break the platform’s rules.
Facebook requested the Oversight Board’s guidance on Tuesday in the form of a Policy Advisory Opinion — about a week after the board requested Facebook provide it with “further clarity” about information relating to the cross-check system that was previously shared with board members.
Facebook says its cross-check system was created to prevent “potential over-enforcement mistakes.” But a recent Wall Street Journal report cited documents showing Facebook’s cross-check program included at least 5.8 million users in 2020, and at times has allegedly protected public figures whose posts contained harassment or incitement of violence.
Facebook is asking the board for guidance on the criteria it uses to determine what content is prioritized for a secondary review, as well as how the company should manage the program.
“We know the system isn’t perfect. We have new teams and resources in place, and we are continuing to make improvements. But more are needed. The Oversight Board’s recommendations will be a big part of this continued work,” Facebook’s Vice President of Global Affairs Nick Clegg said in a blog post.
The Oversight Board last week requested further information from Facebook about the program in light of the recent Journal report. The board underscored its request by noting that Facebook in the past has withheld some information the board has asked for on the topic, particularly when reviewing the case regarding whether to uphold a ban on former President Trump’s account.
“To address this, we asked Facebook to explain how its cross-check system works and urged the company to share the criteria for adding pages and accounts to cross-check as well as to report on relative error rates of determinations made through cross-check, compared with its ordinary enforcement procedures. In its response, Facebook provided an explanation of cross-check but did not elaborate criteria for adding pages and accounts to the system, and declined to provide reporting on error rates,” the board wrote.
The scrutiny of Facebook’s cross-check program is only part of the backlash the social media giant has faced since the Journal published a series of reports about Facebook, raising questions about the company’s content moderation practices and its products’ impact on teen mental health.
Facebook could choose to implement or ignore any recommendations the Oversight Board makes regarding the system. Unlike the board’s binding rulings on individual pieces of content, the company has not committed to putting in place all of the policy recommendations the board provides.