
The risk of new military technologies must be properly assessed



Failure to think through the risks of new military technologies makes it more likely that these systems could be used in reckless ways and, if used, could lead to unplanned escalation. It is important to take this problem seriously, because we are in the early stage of a long-term arms race with advanced technologies such as artificial intelligence (AI), hypersonic missiles, cyber weapons, drones and the like.

In this regard, the Biden administration’s recent review of drone and cyber attack policies is a welcome development. So far, these particular technologies have been used against terrorists, insurgents and in other low-intensity conflict situations. Here, the risk of an “explosion” of escalatory violence is low, since the enemy doesn’t possess the weapons to retaliate in kind. At most, mishaps can lead to unnecessary civilian casualties or killing the wrong individuals. Indeed, current reviews focus on exactly this criterion — namely, collateral damage.

This article argues for an added criterion: the likely impact on unwanted counter-escalation. The reason is that drones, cyber weapons, AI, hypersonic missiles, anti-satellite weapons and other advanced technologies are central to U.S. long-term competition with China and Russia. Some of these technologies have spread to North Korea and Iran. These two nations, for example, already are major threats in cyber war, and both operate armed drones.

Here we see the new escalation problem facing the United States. When these technologies are used against terrorists and insurgents, or to disrupt a weaker power’s capabilities, such as Iran’s uranium enrichment, the chances of a large eruption of violence in response are low. Terrorists and insurgents lack the weapons to strike back at the United States in any meaningful way.

But used against China, Russia or North Korea, the risk is altogether different. Even in a limited war with conventional weapons, the new technologies could become highly destabilizing. Moreover, these are nuclear-weapon states. In peacetime, China, Russia and North Korea do a good job of controlling their nuclear forces. There are no reports of accidents or unintended missile launches by any of them. They are cautious and disciplined when it comes to these weapons. But this is in peacetime, when caution and discipline are easy because there are no stresses on their leadership or military command systems.

Crises and limited wars aren’t like that. In an intense crisis or limited war, all bets are off — or, at least, all bets need to be recalculated. And this is the point: Any policy review needs to assess the escalation potential of these technologies’ use. Basing reviews on how they are used against terrorists and insurgents overlooks the critical difference that heavily armed nations bring to the problem. Such nations can strike back at the United States or its allies, and this ability calls for a different type of review, one that goes beyond collateral damage and focuses on the likelihood of counter-escalation.

To control escalation, the United States must take account of the fog of war not only in our own forces, but in the enemy forces as well. Against a nuclear-weapon state, the United States isn’t simply taking out targets; it is playing a larger game of mutual risk-taking. Any review that ignores this point misses the essence of the decision facing U.S. leaders.

Today’s use of new technologies such as drones and cyber strikes against terrorists and insurgents has in many ways become normalized: It has turned into a business-as-usual application of U.S. reconnaissance and firepower to destroy discrete, isolated targets. The risks of an escalation disaster are low. Yes, there are tight controls, but these have to do with collateral damage, not counter-escalation by the other side. This normalization of advanced technologies can mislead us into believing that these attacks are “safe” — that not too much can go wrong. There is experience to back up this expectation.

But against a serious military power such as Russia, China or North Korea, this experience doesn’t apply. We’re in a completely different situation, with a much greater risk of escalation to a larger war that neither side wants.

Let’s take an example to see the risk more clearly. A U.S. attack on communications could destroy the enemy’s ability to control its own forces. This could make enemy leaders believe that an all-out attack was under way, one designed to eliminate their government or their national existence. If they wanted to terminate a limited war, they might not be able to do so, because their communications to their own forces had been wiped out. And if they believed that our attack was “total,” they would have little reason not to fire their own devastating counterstrike, including nuclear weapons.

Any official review of advanced technologies should consider this risk. It should be highlighted in military plans and briefings to civilian leaders. One way to do this is to require the military to clearly describe any strikes against enemy communications, together with an assessment of how these could affect escalation. This review should cover cyber attacks as well as air and missile strikes. 

There will be objections to this proposal. Some will say that it amounts to micromanagement by civilians. Others might argue that it’s impossible to assess the consequences of attacks on enemy communications. However, American leaders need to know what they’re getting into when using advanced technologies in ways that could provoke massively destructive counterstrikes by a foe.

As to the difficulty of making assessments, exactly the same argument was made in the Vietnam War against requiring air strike plans to include estimates of civilian casualties. Today, it would be unthinkable for the Pentagon to present a plan that didn’t include collateral damage estimates.

The current reviews by the Biden administration should be extended to advanced weapons beyond drones and cyber attacks. Most importantly, these reviews should add escalation potential to the existing criterion of collateral damage. As new technologies spread, the world is becoming more dangerous. Our policies must keep up with this perilous development.

Paul Bracken is a professor of management and political science at Yale University.


