Turning a blind eye to the vulnerable is what makes a truly ‘broken’ internet
For years internet juggernauts and their devotees have claimed that safe harbor laws like Sec. 230 of the Communications Decency Act and Sec. 512 of the Digital Millennium Copyright Act are “the most important law[s] in tech” and “bedrock principle[s] of American jurisprudence” — successfully convincing lawmakers that without the remarkably broad liability shields these remarkably stale laws provide, we wouldn’t have the internet we have today. But “today’s internet” is increasingly beset by Russian election meddling, fake news, cyber-bullying, hate speech, theft, contaminated products and much other socially damaging and illegal content that has rightly prompted lawmakers to question the scope and utility of these laws.
Some platform CEOs seem to have gotten the memo that they face political risk, opting to change their words by acknowledging the responsibility they bear for illegal and abhorrent content on their platforms. But will their deeds match their rhetoric? Given their longstanding intransigence, and on the heels of a recent congressional hearing about internet platform filtering practices, it is abundantly clear that today’s online status quo is unacceptable — and the laws enabling it need to change.
“That’s such a gross misreading of Section 230 it breaks my heart,” Santa Clara University professor Eric Goldman lamented in a Wired article that was broadly critical of the hearing. His bereavement was triggered by questioning about whether platforms deserve CDA 230’s protection when they stop being passive conduits and actively make editorial decisions about content on their platforms, as traditional publishers do.
Perhaps it is a misreading. But that’s a reason for the law to change. And one wishes that such critics reserved more heartbreak for people who suffer because of the status quo in the online environment.
For instance, CDA 230 was recently amended to carve sex-trafficking-related content out of its protections. If a narrower reading of the CDA’s liability shield is incorrect, one would hope critics would have supported such reforms. Instead, critics opposed the reform legislation, known as SESTA/FOSTA, which was enacted into law and signed by the president in April of this year. We did not hear critics complain that their “hearts were broken” by the horrors visited on trafficked women and girls. Indeed, the plight of trafficking victims is given mere lip service. And other online activist groups are now suing to overturn the law.
The same goes for countless victims of other internet ills. What about teenagers who are dying because of opioids laced with fentanyl purchased online? Or the crisis our democracy faces because of election meddling by Russian operatives? Or minorities who have been discriminated against for housing and job opportunities? Or women who face a barrage of sexist trolls making threats of violence?
Activists who fetishize internet laws enacted in the days of the AOL.com boom offer nothing in the way of solutions to these and other problems — instead, they cling stone-heartedly to laws that are clearly failing us as a country and a people, and fight tooth and nail against any attempt to change them.
The internet has provided conveniences in commerce, communication and speech. But recent events make clear, contrary to what internet evangelists have asserted, that those benefits shouldn’t be used as a misguided justification for evading the values, laws and norms we expect offline.
There’s every reason to believe online incentives can be realigned without chilling speech and expression, at least in ways that conform to long-accepted notions of acceptable speech and expression (as opposed to libertarian fever swamp definitions of permissible speech). Indeed, some scholars have published work demonstrating that in its current form, CDA 230 is itself chilling speech and expression — particularly among minorities and other at-risk communities. What’s more, economists have also demonstrated that realigning incentives through narrowing safe harbors could materially improve the online environment for all.
For the internet to continue to build on its great promise and potential as it permeates our everyday lives, we can no longer afford to pretend that it has not grown dysfunctional, making ever greater allowance for harms and excesses. Nor can we pretend that our laws do not need updating to meet these challenges. By steadfastly insisting on a static framework of internet regulation and refusing to engage in reasonable dialogue about addressing the internet’s growing problems, activists are undermining the very future they claim to cherish. More critically, they are turning a blind eye to vulnerable populations who are hurt by inaction. That’s heartbreaking, and also heartless.
Stephanie Moore is an attorney and previously served as chief counsel to the House Judiciary Committee Subcommittee on Courts, Intellectual Property, and the Internet, and as senior adviser to the Register of Copyrights, U.S. Copyright Office.