On social media companies’ First Amendment rights, Supreme Court should leave Congress leeway to regulate
For generations, the Supreme Court has interpreted how the First Amendment’s free speech clause — “Congress shall make no law…abridging the freedom of speech” — limits the government’s authority to restrict expression. One line of First Amendment cases addresses situations in which government has tried to force private organizations to host speech that the organizations would prefer not to sponsor. Now, the high court is poised to weigh in on whether government may dictate which speech social media companies must carry on their platforms.
The justices will rule on whether Florida and Texas violated the First Amendment when they separately enacted laws requiring large social media companies to carry political speech they otherwise might choose to exclude. In consolidated cases referred to as NetChoice, the industry makes a persuasive argument that the First Amendment protects its content decisions as a form of editorial judgment.
But while the Florida and Texas laws ought to be struck down as unconstitutional, the high court needs to refrain from embracing the industry’s sweeping theory of its free speech rights — a claim so broad that it might categorically preclude more modest, narrowly tailored laws regulating social media. The justices will hear oral arguments in the NetChoice cases this winter, with a ruling expected by the end of June.
It was inevitable that disputes concerning free speech on social media platforms eventually would reach the Supreme Court, and the 2023-2024 term boasts a trio of cases presenting variations on the theme.
On Oct. 31, the court heard oral arguments in a pair of cases posing the question of when a public official’s social media account is subject to First Amendment constraints. Those cases began with lawsuits filed by individuals claiming that their free speech rights were violated when officials blocked them from posting on the officials’ Facebook pages. In yet another case, which is expected to be argued in early 2024, the justices have agreed to decide whether the Biden administration’s attempts to talk social media companies into removing what officials saw as misinformation about COVID-19 constituted unconstitutional government coercion.
The NetChoice cases could produce a ruling of broad significance because of the continuing pressure from politicians at both the federal and state levels to rein in social media companies. In Florida and Texas, Republican-controlled legislatures pushed through laws explicitly aimed at protecting the speech of conservative users whom the lawmakers claimed have been the victims of liberal bias in Silicon Valley. The Texas statute prohibits platforms from taking down content because of its “viewpoint.” Florida’s law restricts platforms’ right to remove or suppress posts by or about political candidates. Both states also require platforms to provide individualized explanations to users whose posts have been removed.
The content moderation provisions of the two laws should be evaluated using principles the Supreme Court has articulated since the 1970s about when government exceeds its authority by forcing private organizations to carry certain speech. Under this line of precedent, the court has struck down official requirements that newspapers provide space to political candidates the publications have criticized, as well as attempts to force parade organizers to include participants whose views they oppose, and even to oblige utility companies to include material in their billing envelopes to which they object.
Like these other private entities, social media platforms exercise something akin to editorial judgment when they choose the messages they project to the public. This editorial judgment deserves First Amendment protection, especially because the Florida and Texas laws reflect lawmakers’ explicit aim of promoting particular communication based on its content.
The individualized explanation provisions do not directly stop platforms from shaping the content they host; instead, they demand that the platform reveal certain factual information. The Supreme Court has evaluated disclosure requirements under a relatively deferential First Amendment standard. But even that standard, spelled out in a 1985 case called Zauderer v. Office of Disciplinary Counsel, forbids disclosure mandates that are “unduly burdensome.” Given that the largest social media platforms — like Facebook, YouTube, and TikTok — each use automated filtering systems to remove hundreds of millions of pieces of content a day, it would be extremely burdensome, and perhaps impossible, for them to provide specific explanations in each instance. Thus, the justices should strike down the Florida and Texas individualized explanation provisions.
But when explaining why the First Amendment precludes the content-based and unduly burdensome aspects of the Florida and Texas laws, the Supreme Court should take care not to suggest that social media companies are constitutionally protected against all regulation. Specifically, the justices should not construe platforms’ protection in a manner that forecloses Congress’ ability to enact regulations that are more narrowly drawn and sensitive to First Amendment concerns.
Such permissible regulations might include obligations that platforms disclose general, aggregated, and anonymized data about how and why their algorithms promote certain content that goes viral, so that vetted outside researchers can study those questions. Government could also require disclosure of information about how many human content moderators platforms employ, how those workers are hired and supervised, and how they interact with automated filtering systems. Likewise, the court should leave Congress the leeway to continue to debate and refine bills such as the American Data Privacy and Protection Act, which has garnered bipartisan support and would not impinge upon platforms’ free-speech rights.
By carefully refraining in this manner from laying down First Amendment principles for social media companies that are so expansive that they inhibit any regulation whatsoever, the justices can protect both free speech and the democratic lawmaking process.
Paul M. Barrett is the deputy director of the NYU Stern Center for Business and Human Rights, which has filed an amicus brief in the NetChoice cases elaborating on the argument in this opinion piece.
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.