Meta urged to better track enforcement of Holocaust denial content 

FILE - This photo shows the mobile phone app logos for, from left, Facebook and Instagram in New York, Oct. 5, 2021. (AP Photo/Richard Drew, File)

The Meta Oversight Board is urging the tech company, which operates Facebook and Instagram, to update how it tracks Holocaust denial content and enforces its ban on such material, according to a decision announced Tuesday.

Along with the policy recommendation, the Oversight Board, which operates independently of Meta and is funded through a grant from the company, overturned the tech company's decision to leave up an Instagram post that spread false and distorted information about the Holocaust.

The post includes a meme of the "SpongeBob SquarePants" character Squidward and lists false claims under a speech bubble titled "Fun Facts About The Holocaust," according to the board. Meta took the post down in August after the board announced it had selected the case for review.

When the board took up the case, the post, which questioned the number of Holocaust victims and the existence of crematoria at Auschwitz, was still up on Instagram.

The content was originally posted on Instagram in September 2020, one month before Meta updated its hate speech guidelines to explicitly prohibit Holocaust denial.  

In addition to overturning the decision on the content, the Oversight Board recommended Meta build a system to label enforcement data, which would help the company track Holocaust denial posts and its enforcement of the ban.

When the board first asked Meta how effective its moderation systems are at removing Holocaust denial content, the company "was not able to provide the Board with data," according to the decision posted by the board.

As part of the review of the case, the board found the COVID-19 automation policy, which was created at the start of the pandemic due to Meta’s reduction in human reviewer capacity, was still in place as of May. The policy led to some reports of the Holocaust post being automatically closed, the board said.  

The Oversight Board is asking the company to "publicly confirm" whether it has fully ended the COVID-19 automation policy. The board questioned why a user's May appeal of Instagram's decision to leave the post up was auto-closed, shortly after the World Health Organization and the U.S. declared that COVID-19 was no longer a public health emergency.

The post at the center of the case was reported for hate speech six times, four of them before the policy update banning Holocaust denial, according to the board's case decision.

Two of the six reports led to human reviews; the others were handled by automated systems and either assessed as non-violating or "auto-closed" under Meta's COVID-19 automation policies.

The more recent of the two human-reviewed reports was filed in May. The user who reported the content appealed Meta's decision to leave the post up, but the appeal was auto-closed, according to the board.

Meta said it welcomed the board’s decision in a statement.

“The board overturned Meta’s original decision to leave this content up. Meta previously removed this content so no further action will be taken on it,” it said.

“In accordance with the bylaws, we will also initiate a review of identical content with parallel context. If we determine that we have the technical and operational capacity to take action on that content as well, we will do so promptly,” the company added. 


