Advocates shine light on Facebook, Twitter takedowns

Two organizations are asking people around the world to help them understand how social networks like Facebook and Instagram police their users.

The Electronic Frontier Foundation (EFF) and Visualizing Impact are behind a website where users can report when a social network takes down their content or otherwise blocks activity on the platform.

An online form allows users to submit reports about six different websites, including Facebook, Twitter and Instagram. Users can tell the project when their accounts are suspended, for instance, or when everything from a Facebook event to a Twitter ad is deleted.

“The public service value, to me, is making users aware of just how much companies control our everyday speech,” said Jillian C. York, the director for international freedom of expression at EFF, who launched the site alongside Visualizing Impact co-founder Ramzi Jaber.

The genesis of the idea can be traced to a conference four years ago, where York met Jaber.

A month before they met, the British band Coldplay found itself mired in controversy over a video it posted on Facebook for a song called “Freedom for Palestine.” The link disappeared from the band’s page after some fans were angered by the post.

Some assumed the band had removed it. But it turned out Facebook had taken the video down because a number of people had reported it as abusive. York says it was clear the link had been flagged by an algorithm designed to vet content.

“If a staffer had seen that link, they would have known that it wasn’t abusive,” she said.

She and Jaber launched an early version of their website in 2012, hoping to document and understand the process that led to the social media takedowns. In 2014, the duo applied for and won a grant from the Knight Foundation, which allowed them to create the more ambitious website launched on Nov. 19.

Visitors to the website can use step-by-step guides on how to appeal a decision made by a social network. The site’s creators worked with the companies to make sure the information is accurate, York said.

“It’s basically trying to explain to users how they can help themselves to get their content back,” she said, “which is something that the companies, you know, they offer these appeals systems but they don’t advertise them, they don’t make it clear to users that this is a possibility.”

Some of the social networks do provide some information on how often — and sometimes why — they take down user-posted content.

Facebook, Google and Twitter all issue transparency reports showing how many requests they’ve received from governments to take content down. In some cases, the reports also include the number of requests claiming a post violated copyright law.

Lumen, a Harvard-affiliated project, also maintains a repository of takedown notices sent to many websites, including Google, Twitter and Reddit, so researchers, users and reporters can see exactly who is demanding that content be removed and for what reasons.

Those questions may become more relevant as the platforms coax users and publishers into posting more videos and, in the case of Facebook, news articles.

York said the project hopes to focus on instances where social networks take down content because they believe it violates their terms of service, the lengthy rules of the road that each user signs off on to stay on the platform, rather than because a government believes it violates the law.

“Nobody’s really looking at the terms of service takedowns,” she said.

The site’s backers also hope to translate its contents so that it will one day be accessible in languages other than English. The project’s ultimate goal is not just to understand how companies restrict what is posted on their platforms, according to York, but to push them to better serve their users.

York said that EFF and others have strong relationships with the companies, even if the project may create some tension between the two sides. She expressed hope that attention from the project could nudge management at some of the companies to improve the appeals process and potentially change their underlying policies on taking down content.

Though the companies are admittedly protective of their rules, they aren’t immovable.

York pointed to Facebook’s recent change to how it enforces its “real name” policy, which required individuals to use their “authentic identities” on their profiles. The company made changes after users who do not go by their legal names, including some transgender people and drag performers, found themselves shut out of the platform or forced to appear on Facebook under a name they did not use.

“We’re trying to just make users’ lives better,” York said. “We’re trying to improve the user experience in a way that the companies haven’t been willing to do, and if we can put pressure on companies to do that, then that’s great.”
