Tech companies crack down on hate speech after Charlottesville
Technology companies are confronting difficult questions about whether they should use their power to silence hate speech online.
Several tech companies have cracked down on white supremacist groups in the wake of a rally in Charlottesville, Va., where a woman was killed by a car driven into a crowd of counterprotesters.
Facebook and Twitter removed the accounts of hate groups from their platforms. GoDaddy and Google banned the neo-Nazi website the Daily Stormer from their domain hosting services, effectively taking it off the internet.
But many in the tech world wonder where they should draw the line when policing content, fearing the crackdown could lead to censorship.
In a letter to employees published by Gizmodo, the CEO of Cloudflare, a cybersecurity service provider, mulled the implications of effectively keeping certain voices off the internet after his company also banned the Daily Stormer.
“Literally, I woke up in a bad mood and decided someone shouldn’t be allowed on the Internet,” Cloudflare’s CEO Matthew Prince wrote. “No one should have that power.”
“[An employee] asked after I told him what we were going to do: ‘Is this the day the Internet dies?'” Prince wrote in the letter. “He was half joking, but I actually think it’s an important question.”
Prince’s email addresses the tension between free speech and hate speech that internet companies are grappling with. Aside from Prince’s acknowledgment of the tough spot technology companies are in, the industry has largely avoided talking about it publicly. Internally, though, companies have long worked to toe the delicate line between offensive speech and dangerous speech.
Employees at major technology firms say they have felt pressure to take down certain types of content but in some cases have been hesitant. When GoDaddy and Google banned the Daily Stormer from using their services, Twitter waited two days before banning Daily Stormer accounts.
“No one wants to be the last company who banned a neo-Nazi,” said one employee of a major technology firm. The employee noted that at the same time, social media platforms often play wait-and-see, watching who gets banned from other websites.
“On most issues, [social media companies are] enforcing their own rules,” the employee with knowledge of the matter said. “They don’t have conference calls about whether to suspend Richard Spencer’s account at the same time. But they watch and see what the other companies do and it affects their decisions.”
Decisions on what content stays and goes can become so fraught that they reach the top levels of the company. For example, at Twitter, CEO Jack Dorsey occasionally steps in to make the final call on removals and suspensions.
A Twitter spokesperson declined to comment.
At Facebook, The Wall Street Journal reported in October that CEO Mark Zuckerberg similarly made a judgment call on content when he decided not to remove posts made in 2016 by then-presidential candidate Donald Trump that may have violated the company’s community standards.
A Facebook spokesperson declined to comment on whether Zuckerberg gets involved in decisions over content removal.
In the days after the violence in Charlottesville, Twitter and Facebook, along with GoDaddy, Google and a list of other technology companies, banned white supremacists from their platforms. PayPal removed prominent white nationalists involved in the Charlottesville rally; OKCupid kicked off a high-profile white supremacist; and Spotify gave white supremacist artists the boot.
For some of these companies, the crackdown is a departure from their original ethos of being unrestricted platforms for ideas, even problematic ones.
In 2011, Twitter fashioned itself as the “free speech wing of the free speech party.”
“We don’t always agree with the things people choose to tweet, but we keep the information flowing irrespective of any view we may have about the content,” Twitter co-founder Biz Stone wrote that same year.
Twitter policed hateful content at the time but was considered far more hands-off than it is now. The company has since moved to curb rampant abuse and hate speech on its platform, particularly content that incites violence.
Reddit has dealt with similar growing pains, given its long commitment to being an open platform even for socially repugnant ideas.
Some think that this movement to police content, while lauded by progressive and nonpartisan groups alike, could raise concerns if it isn’t handled carefully.
“We push companies to use these algorithms, but they’re not perfect. You get false positives,” a tech firm employee said, noting that even reviewing the results leaves the process open to human error. “People think this is black and white, but we’re really dealing with shades of gray.”
Political heavyweights have also weighed in. Conservative Tucker Carlson argued during his Fox News show on Monday that “Google should be regulated like the public utility it is to make sure it doesn’t further distort the free flow of information.”
Sen. Ted Cruz (R-Texas) similarly expressed to Axios that he’s worried about “large tech companies putting their thumb on the scales and skewing political and public discourse.”
Sonny Sinha, a tech policy expert and former Obama administration staffer, argued that an inconsistent process of content removal without proper checks could have damaging effects.
“In some weird flipped reality, innocent groups could be targeted for their speech,” said Sinha.
Sinha stressed that he disagrees with white supremacist and neo-Nazi groups but worries that, without a consistent set of standards and a dialogue on removing content, companies could end up clamping down on less harmful speech on the internet in the future.
“We need to have a forum and mechanism to have these conversations,” Sinha added.
Ultimately, though, even opponents of white supremacists point out that these companies’ platforms aren’t owned by the government and note that the companies can do business with whomever they like.
“Spotify has the right to take [white supremacists] off their platform. They’re their own corporation,” hip-hop artist Luther Campbell, former frontman of 2 Live Crew, told The Hill. 2 Live Crew’s music faced calls for censorship in the 1980s over its suggestive content.
“If Spotify was owned by government it would be different,” Campbell said.