Facebook failed to block ads containing death threats to election workers: report

(Photo by KIRILL KUDRYAVTSEV / AFP via Getty Images)

Facebook failed to block 15 of the 20 ads containing death threats to election workers that researchers submitted to test the tech giant’s enforcement, according to a report released Thursday. 

An investigation by Global Witness and the NYU Cybersecurity for Democracy team found that the Meta-owned platform approved most of the threatening ads the researchers submitted on the day of, or the day before, the midterm elections. 

The ads tested were based on real threats previously made against election workers, including statements “that people would be killed, hanged or executed, and that children would be molested,” according to the report. The content was submitted as ads so the team could schedule when it would run and remove it before it went live. 

Facebook approved nine of the 10 English-language ads and six of the 10 Spanish-language ads, according to the report. 

A spokesperson for Meta said in a statement that the “small sample of ads” is “not representative of what people see on our platforms.” 

“Content that incites violence against election workers or anyone else has no place on our apps and recent reporting has made clear that Meta’s ability to deal with these issues effectively exceeds that of other platforms. We remain committed to continuing to improve our systems,” the spokesperson said. 

Meta’s ad review process includes layers of analysis that can take place after an ad goes live, meaning the ads approved in the test could have been removed later had the research team not pulled them first.

The Global Witness and NYU Cybersecurity for Democracy team found that Google-owned YouTube and TikTok performed better at enforcing their policies in the same test of ads containing death threats. 

After the team submitted the ads to TikTok and YouTube, both platforms suspended the accounts used to place them for violating their policies, according to the report. 

Global Witness and the NYU Cybersecurity for Democracy team urged Meta to increase its content moderation capabilities and to properly resource content moderation in all countries in which it operates. They also called on Meta to disclose the full details about the intended target audience, actual audience, ad spend and ad buyers of ads in its ad library.


Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
