Nineteen years ago, the challenges of the new millennium announced themselves in the most unspeakable terms. The shock of a terrorist attack on U.S. soil potent enough to bring down New York’s iconic World Trade Center towers, leave parts of the Pentagon in rubble, and crash a jetliner into a Pennsylvania field continues to ripple through our health, our security culture, and our now-ingrained, if flawed, approach to disasters.
The terrorists who perpetrated the Sept. 11, 2001, attacks took the lives of 2,977 people that day, with many more later succumbing to the enduring effects of toxic exposures at ground zero. That same month, a bioterrorist introduced anthrax spores into the U.S. mail system, an act that took five more lives, further paralyzed the nation, and gave outsized attention to the threat of bioterrorism.
We were not ready for either of these events. In a hasty response, the American government embarked on its largest reorganization since World War II, became embroiled in two major wars, and redefined the use of covert action. With a defense and security doctrine focused heavily on adversaries bent on kinetic impacts within the homeland, homeland readiness became weighted toward mass casualty events and preparing for what seemed like the inevitable use of weapons of mass destruction on our people. The newly minted “public health preparedness” community focused on preparing for an anthrax or smallpox attack and, to a lesser extent, a host of other potentially weaponized biological agents.
Federal biodefense programs pumped billions into developing pipelines and national stockpiles of pharmaceutical countermeasures in anticipation of a bioterrorist attack. To support this, new paradigms of state and local response were built on existing public health and emergency management systems.
All of a sudden, preparedness became a national effort flush with resources, mandates and expectations that normally take years, even generations, to establish. It was messy, and perhaps too focused on adversarial threats rather than natural hazards, but that was our most recent experience, and the threats seemed palpable, imminent and existential.
Around 2005, the emergence of virulent avian flu strains in Asia and the readiness failures laid bare by Hurricane Katrina forced yet another shift in paradigms. Whole-of-community responses would be required. Federal leadership and governance would need to evolve to coordinate complicated public health preparedness.
Laws like the Post-Katrina Emergency Management Reform Act and the Pandemic and All-Hazards Preparedness Act were signature legislative efforts to tackle the most recent failures. These laws further clarified agency responsibilities and expectations and created new entities such as the Assistant Secretary for Preparedness and Response at the Department of Health and Human Services.
But none of these laws sufficiently addressed the drivers of natural hazards and pandemics — climate change, injudicious land use, and structural inequalities embedded in our society. Meanwhile, the spending of the early post-9/11 years started to wane, and agencies and planners began to settle into a more balanced status quo — with mixed success in the inevitable disasters that have followed.
We fell into patterns of funding disaster preparedness without understanding whether it was having any impact. We fine-tuned legislation instead of creating strategic resources, such as a public health emergency fund, or requiring that climate change projections be integrated into preparedness planning. And we increasingly relied on easily undone executive orders and one-off emergency supplemental funding bills for disasters.
Today, COVID-19 is re-exposing old vulnerabilities and revealing entirely new ones. Some of what was put in place since 9/11 did help with the pandemic response. Nearly two decades of public health planning has helped engage the private sector and provide blueprints for communities to respond to pandemics. But mostly we were caught flat-footed.
We opted not to invest in high-risk, high-reward initiatives like on-demand vaccine and treatment technologies. We ignored warning signs that diseases like this were becoming more likely to emerge in the first place. Like 9/11, Amerithrax, and Katrina, COVID-19 is yet another tragic inflection point in the history of disaster management.
The failure to prevent the attacks of 9/11 has been called a failure of imagination. Perhaps that is because we typically react to the last disaster without imagining the next one. It is less a signature 9/11 failure than a recurring theme in our approach to disasters. Now the future is here: we need to learn from past mistakes, heed the warnings of experts, and act on future threats and our growing vulnerabilities to them.
Perhaps our imperative on this anniversary of 9/11, as we endure another national catastrophe, is to ensure that the nearly 200,000 Americans who collectively lost their lives to 9/11 and COVID-19, and the many more affected by these and other disasters, are remembered in the actions we take to prevent future tragedies.
Jeff Schlegelmilch is director of the National Center for Disaster Preparedness at Columbia University’s Earth Institute, and the author of the book “Rethinking Readiness: A Brief Guide to Twenty-First-Century Megadisasters.”
Ellen P. Carlin is an assistant research professor at the Georgetown University Center for Global Health Science and Security and director of Georgetown’s Global Infectious Disease graduate program.