Opinion | Guest Column
Will we let autonomous weapons decide when to kill? | Column
A conference at USF probes the technology, the future and the ethics of weapons designed to kill without a human pushing a button.
 
Published Oct. 7, 2023

The future of armed conflict will increasingly feature autonomous and semi-autonomous weapons and platforms that pair humans and machines to varying degrees. Future warfare practitioners, thought leaders, policymakers and industry innovators convened last week at the University of South Florida for a conference, presented by USF’s nascent Global and National Security Institute, to consider the applications of autonomy to armed conflict and their implications.


The institute leadership’s timing could not have been more serendipitous. Just three weeks earlier, Deputy Secretary of Defense Kathleen Hicks announced Replicator, an initiative to develop and deploy autonomous weapons systems to counter China.

Most of us by now are familiar with remotely piloted Predator and Reaper drones eliminating terrorists by firing Hellfire missiles with extraordinary precision. Drone strike successes are many and consequential.

For example, just over one year ago, the United States killed Ayman al-Zawahiri with two Hellfire missiles believed to have been fired from a Reaper into the Kabul, Afghanistan, house where al-Zawahiri was then living. Al-Zawahiri had been a deputy to Osama bin Laden, helped plan the 9/11 attacks and assumed leadership of al-Qaida after bin Laden was killed in May 2011. He had been on the FBI’s Most Wanted Terrorists list for years.

Two years earlier, Hellfire missiles fired from a drone killed Qasem Soleimani. Soleimani had commanded the Quds Force, the arm of Iran’s Revolutionary Guard responsible for international operations, including operations that resulted in the deaths of American service members. Soleimani also was aligned with Hezbollah, an international terrorist organization based in Lebanon.

In both instances, the missile-firing drones were piloted remotely, and there were zero reported noncombatant deaths. U.S. government personnel who pilot drones such as the Reaper do so from nearly anywhere, keeping themselves safe, and strike with such precision that the risk of collateral damage to innocent noncombatants is mitigated, sometimes eliminated entirely. Thanks to extraordinary intelligence and engineering, two individuals responsible for countless deaths were taken off the battlefield with no American casualties and no innocent civilian casualties.

Global force projection that is precise and reduces risk of noncombatant casualties creates national security options that leverage intelligence in every sense of that word. Engineering, aerospace, computer science, manufacturing and electronics professionals design, test and build systems and platforms. Government professionals develop sophisticated targeting information and manage and execute time-sensitive operations to achieve desired effects.

While Reaper drones have autonomous features, truly autonomous weapons are materially different. The Department of Defense defines an autonomous weapons system as one that, “once activated, can select and engage targets without further intervention by a human operator.” A Reaper is piloted by a human to its target and a missile is fired only upon a human’s execution of a remotely activated firing mechanism. In other words, a person must still push a button. A truly autonomous weapons platform, on the other hand, is programmed to engage a target upon certain conditions, which are detected by sophisticated sensors. The key characteristic is the absence of a human to close what is referred to as the kill chain — to launch lethal force with no human in the final decision-making loop.


The role of autonomy in warfare is complex, as the Global and National Security Institute conference demonstrated. Americans have become accustomed to, and support, the types of drone strikes that killed al-Zawahiri and Soleimani. But one of the most important issues discussed at the conference, and one with broader societal relevance, is how “normative conditions,” which is to say public beliefs and attitudes, shape autonomous warfare options beyond the use of Reaper drones.

Lt. Col. Paul Lushenko, one of the Army’s foremost experts on autonomy in warfare and a conference speaker, observes that normative conditions are not merely about assessing top-line public support and approval but also about a more nuanced sense of legitimacy. What are the ethical guideposts for use of autonomous or semi-autonomous weapons? What degree of potential noncombatant casualty risk is acceptable? What types of constraints are in place? For civilian decision-makers who ultimately are accountable to the public for authorizing use of force, these normative conditions are analogous to the operating environment conditions military and law enforcement personnel encounter in the field: In both cases, the conditions set what is permissible and what is prohibited along a continuum.

Credit to USF’s Global and National Security Institute for convening the interdisciplinary expertise necessary to carefully consider the role of autonomy in modern conflict. As much as the conference was about force readiness, it was also about civil society readiness. The sooner and more deeply all of us consider these issues, the better able civilian and military leadership will be to leverage emerging technology to win wars and preserve peace.

Christopher Hunter served as a federal prosecutor with the U.S. Department of Justice and the U.S. Attorney’s Office for the Southern District of Florida and as an agent with the FBI.