The views expressed by contributors are their own and not the view of The Hill

Are we headed toward a dystopian online world? Supreme Court must prevent it 


What if Yelp couldn’t lead you to reliable yet negative restaurant reviews because of potential defamation lawsuits? What if ZipRecruiter couldn’t use algorithms to effectively connect you to a prospective employer? And what would YouTube be like if it couldn’t suggest relevant “up next” or autoplay videos tapping directly into your past interests? Social media platforms would be incredibly inefficient, and sometimes useless, if this were to happen.   

These platforms are designed to helpfully link people to germane content by using algorithms that sort through, as The Wall Street Journal recently put it, “the deluge created by the online masses.” Should this change, a dystopian online world might well arise, though it certainly doesn’t need to. It all depends on how the U.S. Supreme Court interprets a federal statute called Section 230, the meaning of which was argued at length on Feb. 21 in Gonzalez v. Google.  

Adopted by Congress in 1996 “to promote the continued development of the internet and other interactive computer services,” Section 230 generally protects platforms from liability for harms caused by content that others –– not platforms –– create. In brief, it typically immunizes platforms from lawsuits stemming from third-party content. It also protects platforms when they remove, in good faith, illegal or “otherwise objectionable” content.  

The issue the Supreme Court faces in Gonzalez, however, does not focus on platforms hosting or removing others’ content; instead, it centers on whether Section 230 protects platforms when they make recommendations for content through algorithms. 

The plaintiffs claim the statute does not provide immunity for algorithmically generated recommendations. They assert “that YouTube had recommended ISIS” videos that “aided and abetted ISIS” terrorists in the killing of a relative in 2015. They sued Google, which owns YouTube, under a federal anti-terrorism law. A divided U.S. Court of Appeals for the Ninth Circuit, however, concluded that Section 230 protects algorithmic recommendations like those used by YouTube.

The Supreme Court should uphold the Ninth Circuit’s ruling when it hands down its decision, likely this summer. A ruling otherwise — despite the emotionally charged facts of Gonzalez — would destroy the functionality and efficiency of social media platforms in how they organize, display and present third-party content to users via algorithms.  

That’s partly because, as Justice Clarence Thomas observed during argument, Gonzalez involves “the neutral application of an algorithm” that works for all varieties of videos “across the board,” not just for ISIS videos. In other words, a ruling for Gonzalez will affect the delivery of all types of recommendations or, as Justice Thomas dubbed them, “suggestions.” 

Section 230 immunity is vital for social media platforms. As ZipRecruiter explains in a friend-of-the-court brief it filed in Gonzalez, without Section 230, companies like it “might regularly face lawsuits in connection with third-party job postings and resumes, whose contents they do not generate, and which they cannot possibly screen through human review.”  

Yelp concurs. The consumer-review platform notes in its brief filed in Gonzalez that it “deploys recommendation software on its platform” as “a key tool to cultivate useful content for the public.” Yelp has successfully relied on Section 230 for years to fend off expensive litigation brought by businesses upset with consumer reviews. Narrowing Section 230, Yelp avers, “would trigger an onslaught of suits that would bog down internet service providers in endless litigation.”

As Lisa Blatt, Google’s attorney, explained during oral argument, the internet would have suffered a “death by a thousand cuts” from lawsuits without Section 230. To avoid such lawsuits in the absence of Section 230 immunity, platforms would likely err on the side of caution and remove any controversial content. As attorney Robert Corn-Revere writes, “If there is the slightest chance you might shoulder legal accountability for what you let people post on your platform, you are not going to risk it.”   

Just how important will the Supreme Court’s interpretation of Section 230 be? One objective indicator is the large number of non-party briefs –– 79 –– that were filed to influence the court’s decision.  

Upholding the Ninth Circuit’s decision will keep social media platforms valuable for users. It also will let Congress –– the body that fashioned Section 230 in the first place — adopt any carefully calibrated tweaking to protect a useful online marketplace of ideas while recognizing that sometimes a recommendation can go wrong. That is how separation of powers in government is supposed to work. Or, as Justice Elena Kagan self-deprecatingly remarked about the Justices’ knowledge of internet algorithms, “We’re a court. We really don’t know about these things.”

Clay Calvert, J.D., Ph.D., is professor emeritus at the University of Florida (UF). He held a joint appointment as a professor of law at the Fredric G. Levin College of Law and a Brechner Eminent Scholar in Mass Communication in the College of Journalism and Communications. Specializing in First Amendment and media law, Calvert has published more than 150 law journal articles on topics affecting free expression, and he is lead author of “Mass Media Law” (22nd ed. 2023, McGraw Hill).


Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
