Facebook said Monday that no users reported the livestream of last week’s shooting at two mosques in Christchurch, New Zealand, as it occurred.
According to the social media giant, the 17-minute video was viewed fewer than 200 times while it was being livestreamed on its website Friday.
Facebook removed the video almost immediately after New Zealand police contacted the company about it, roughly 29 minutes after the video started and 12 minutes after the livestream ended.
Facebook and other platforms like YouTube, Twitter and Reddit have worked to remove the video, but have faced difficulty in implementing a blanket ban.
The company said Monday that it had removed around 1.5 million copies of the video, 1.2 million of which were blocked at upload and therefore never seen by anyone.
“We continue to work around the clock to prevent this content from appearing on our site, using a combination of technology and people,” Facebook vice president and deputy general counsel Chris Sonderby said in a statement.
Facebook has designated both shootings in the city of Christchurch as terror attacks, and the site's standards prohibit any representation of the events as well as any praise or support of them.
New Zealand Prime Minister Jacinda Ardern said Tuesday that she had reached out to Facebook Chief Operating Officer Sheryl Sandberg and expressed concerns that people could still see the footage.
“You can’t have something so graphic and it not [have an impact] … and that’s why it’s so important it’s removed,” she said, according to the Australian Broadcasting Corporation.