Battle over COVID-19 misinformation goes to courts

As big tech companies clamp down on coronavirus misinformation and online conspiracy theories, a growing number of groups that critics describe as anti-vaccine activists are asking the courts to intervene.

Tech giants like Facebook, Amazon and Google have been hit with lawsuits after taking bolder steps against dubious claims that have sown confusion and distrust around public health guidance as the death toll from COVID-19 has continued to climb.  

“The pandemic has marked a sea change with respect to the attitude major platforms take towards content moderation, including around misinformation,” said Michael Karanicolas, a Wikimedia fellow at Yale Law School’s Information Society Project.

Legal experts say that while the lawsuits appear far-fetched, they highlight the escalating stakes in the fight for truth and accuracy. That battle has taken on added urgency against the backdrop of a pandemic that has killed more than 450,000 people in the U.S. and depends on widespread vaccination for its eventual eradication.

The threat that misinformation poses to American democracy also spilled into public view in shocking fashion last month when supporters of former President Trump, many of whom had embraced his false claim that the election had been stolen from him, stormed the Capitol on Jan. 6.

“With the attack on the Capitol following weeks of misinformation on election fraud, this issue is becoming even more relevant,” said Wes Henricksen, a law professor at Barry University. “People are beginning to ask, ‘Is it really such a good idea to allow politicians, media, and corporations to disseminate misinformation without any limits whatsoever?’ ” 

But the roiling debate over what the U.S. can do about its national “infodemic” has left experts divided.

Rodney Smolla, a law professor at Widener University, said the First Amendment strikes the right balance between legitimate speech rights and illegal forms of expression.

“Existing First Amendment principles provide robust protection for freedom of speech,” he said. “But that protection is lost when speakers cross certain lines, including incitement, threats, defamation, and the aiding and abetting of crime.”

Others say the current mix of law and policy is inadequate and that tech platforms should step up their efforts.

“I do think the platforms can and should do a heck of a lot more in terms of using their editorial discretion to filter out blatantly and objectively false content and sources,” said Phil Napoli, a public policy professor at Duke University.

But tech companies are increasingly facing political and legal blowback for taking steps to moderate content on their platforms, as are the politicians who spur them on to take bolder measures.

In spring 2019, Rep. Adam Schiff (D-Calif.) sent letters to Amazon, Facebook and Google to request that they remove or downgrade anti-vaccine content on their platforms, citing concerns that its widespread dissemination could fuel unfounded fears about childhood vaccination. 

Schiff also asked during a June 2019 hearing whether Congress should consider narrowing the immunity from liability that tech companies enjoy under a 1996 federal statute as a means of combating misinformation.

Later, Amazon dropped the videos “Vaxxed” and “Shoot ‘Em Up: The Truth About Vaccines” from its streaming service and booted groups from its affiliate program. Facebook began affixing links to trusted sources like the World Health Organization and the National Institutes of Health on health content it found questionable. Twitter began employing health disclaimers around this time as well.   

As a result of these measures, a conservative doctors’ group sued Schiff. The group, the Association of American Physicians & Surgeons (AAPS), claimed that Schiff’s successful campaign to get Amazon, Facebook and Google to suppress anti-vaccine content on their platforms caused them to discriminate against the AAPS.

The group also has promoted discredited medical theories and opposes mandatory vaccines. Some of its top members, including its Executive Director Jane Orient, have challenged the overwhelming consensus view among scientists that vaccines do not cause autism, with Orient telling The Atlantic in an interview last year, “I think that the definitive research has not been done.”

A federal judge in Washington, D.C., this week dismissed the AAPS case, finding that the group lacked a legal right to sue. It also found that Schiff’s outreach to the tech companies was protected under the Constitution’s Speech or Debate Clause, which gives lawmakers immunity from most lawsuits related to their legislative duties. The AAPS, which insists that it is not “anti-vaccine” but rather supports enhanced vaccine awareness, vowed to appeal to the D.C. Circuit.

In a similar case, the anti-vaccine group Children’s Health Defense sued Facebook as well as third-party fact-checkers hired by Facebook to police its platform for health misinformation. The lawsuit accuses Schiff of “teaming up with Facebook to censor speech.”

That case is pending in a Northern California federal court. An attorney for Children’s Health Defense said the group would file its opposition on Friday to a motion to dismiss brought by Facebook and its CEO, Mark Zuckerberg. (Disclosure: The reporter of this story is a former employee of Poynter, one of the defendants in the lawsuit.)

Another anti-vaccine group, the Informed Consent Action Network, filed a lawsuit against YouTube and Facebook making similar allegations about the tech companies’ participation in Schiff’s “war on disfavored speech.”

Attorneys for the Informed Consent Action Network, Facebook and Google did not immediately respond to requests for comment.

Experts told The Hill that lawsuits filed against social media companies over their refusal to host content on their platforms were nearly certain to fail on various grounds. Perhaps most crucially, social media platforms are private companies and therefore are not bound by the Free Speech Clause of the First Amendment in the way government entities are.

“These cases have no realistic chance of winning in court,” Smolla said.

Henricksen, of Barry University, predicted that the U.S. courts would soon see more litigation coming from the other direction: lawsuits filed by those who have been harmed by coronavirus misinformation.

He cited a litany of historical examples in which misinformation created tangible harm: tobacco misinformation about the health dangers of smoking, opioid misinformation about the painkillers’ addictiveness, sugar misinformation about the health dangers of high-fructose corn syrup and fossil fuel misinformation about global warming. But the coronavirus “infodemic” is distinct from these prior examples in one critical respect, he said.

“In almost all cases, the harm caused by misinformation in the past was slow moving. It took, and takes, years or decades to manifest,” Henricksen said. “In the case of COVID-19 misinformation, however, the effect on public health is immediate. It can be seen and tracked statistically in real time.”

He added: “We are only seeing the very beginning of the legal fights on the coronavirus misinformation front.”
