Elon Musk has shattered the myth that social media platforms are mere space providers

FILE – Twitter, now X Corp., and Tesla CEO Elon Musk poses before his talks with French President Emmanuel Macron on Monday, May 15, 2023, at the Elysee Palace in Paris. (AP Photo/Michel Euler, Pool)

Social media platforms have long argued that they are simply providers of a public forum in which others comment, and thus should not be held liable for what people say there. Elon Musk has single-handedly blown a massive hole in that bogus argument.

A new report suggests that Elon Musk exercises unprecedented control over content moderation and personally decided to restore Kanye West’s account on X, formerly known as Twitter. It also reveals that Musk ordered his team to make his own posts among the platform’s most visible. The report further details how he directed his engineers to tweak the feed of a top venture capitalist friend on request, a bespoke service not available to any of the platform’s millions of other users.

With these acts of direct editorial control, Musk has made clear that his platform is not a neutral “public forum” and should be held to the same rules as any newspaper, publisher or broadcast network.

For over two decades, social media companies have hidden behind the legal protections conferred by Section 230 of the Communications Decency Act of 1996. This legislation, passed before social media companies existed, was designed to make the early “interactive” web manageable. It conferred protections on early web and news sites so they did not have to bear legal responsibility for content posted on bulletin boards or in comment sections, even if they engaged in some content moderation. It was a specific law designed before anyone ever imagined a Facebook, Reddit or TikTok.

Nearly a decade later, social media took off in earnest, dispensing with original content and turning the aforementioned “comments” into a business. Social media companies aggregate these posts and repackage them into a tailored news feed designed to be as entertaining and addictive as possible. By interspersing advertisements among the comments, they monetize these endless feeds. This simple business model has proven hugely lucrative: 98 percent of Meta’s revenue comes from ads, and the model has made Mark Zuckerberg a hundred billion dollars in personal wealth.

Hate and disinformation enjoy a built-in advantage in this environment. The repackaging, done by artificial intelligence, is designed to benefit the company by being as addictive as possible, exploiting psychological triggers and favoring content that enrages us or goads us into reacting by posting more content ourselves.

The algorithms are also tailored to promote the owners of these companies and the values, politics and ideas that benefit them the most, as Musk has so explicitly demonstrated through his actions.  

Others have done the same. Mark Zuckerberg, for example, reportedly approved an internal initiative, Project Amplify, to promote positive stories about Facebook. These companies are therefore unequivocally publishers: what the user consumes is the result of decisions taken by executives for their own benefit, economic or political.

And yet, thanks to the “get out of jail free” card of Section 230, enacted eight years before Facebook was even founded, these companies cannot be held liable as publishers in any way for the hate, antisemitism and disinformation that they push to billions. No other person or company in America is shielded from accountability or responsibility for its core product in this way.

The research published by my organization, the Center for Countering Digital Hate, makes clear that social media can be harmful. Extremists openly proselytize, recruit, finance and plan operations on these platforms with little intervention. Algorithms promote dangerous eating disorder and self-harm content to teenagers. Algorithms cross-fertilize conspiracist movements, giving QAnon followers anti-vaxx content and vice versa. Trolling and abuse are rife on these platforms, forcing women, minorities and LGBTQ+ people to restrict their own posts to avoid a torrent of abuse.

In 2022, we gathered lawmakers from the U.S., UK, Canada, Australia, New Zealand, and the European Union to talk about how we might develop a set of laws that would allow us to hold these companies accountable.

We all agreed that social media companies quite clearly have a significant impact on our psychology (especially that of our children), our communities, our politics and even the values that underpin our democracy. At the end of the conference, we published our STAR Framework, a comprehensive set of minimum standards for an effective regulatory regime that balances free speech with human rights. The framework demands Transparency about algorithms, about how companies enforce their “community standards” and about how commercial advertising shapes the content platforms present, which would allow for meaningful Accountability to the public. It also requires companies to take Responsibility for avoidable, predictable harms they fail to address, which we hope will lead to a culture of Safety by Design.

Since that meeting, the European Union has passed the Digital Services Act, and the United Kingdom is shortly expected to pass an Online Safety Act that seeks to balance corporate rights with human and civil rights. The United States is unique in having failed to do so.

It is time for Congress to stop vacillating and start acting. Our kids’ mental health and body image, the safety of our communities, the rights of vulnerable and minority groups, and even our democracy itself demand better. Hiding behind the notion that these vast companies are simply “free speech” platforms, rather than publishers who shape public knowledge to their own agenda, is untenable in the face of reality. Elon Musk, through his brazen stewardship of X, has ironically made a better case for the STAR Framework than we ever could have made alone.

Imran Ahmed is founder and Chief Executive Officer of the Center for Countering Digital Hate (CCDH).  
