Amitai Etzioni is a professor of international relations at George Washington University and author of Foreign Policy: Thinking Outside the Box (2016). He was a senior advisor at the Carter White House. Oren Etzioni is CEO of the Allen Institute for Artificial Intelligence and a professor of computer science at the University of Washington. He was the founder/cofounder of Farecast (sold to Microsoft) and Decide (sold to eBay).
- The authors “take autonomy to mean a machine has the ability to make decisions based on information gathered by the machine and to act on the basis of its own deliberations, beyond the instructions and parameters of its producers.”
- It seems “inevitable” that autonomous systems, including bombers and fighter planes with no human pilot, will become more prevalent. “This genie has left the bottle and we see no way to put it back again.”
- For “dull, dirty, or dangerous missions,” robots will be better than humans. A long-duration sortie is an example of a “dull” mission; exposure to radiological materials is an example of a “dirty” one; and explosive ordnance disposal is an example of a “dangerous” one.
- Military robots could also yield economic savings. Currently each soldier in Afghanistan costs $850,000 per year. By contrast, “the TALON robot—a small rover that can be outfitted with weapons—costs $230,000.”
- Aerial weapons systems will be less subject to fatigue and emotional stresses, and they will be able to endure physical strains, e.g., high-G maneuvers, better than human pilots.
- Autonomous military robots may well perform more ethically than human soldiers. They will not necessarily be programmed with a “survival instinct” and so be less likely to “shoot first, ask questions later.” They will be less subject to emotions of rage, fear, cowardice, and desires for retaliation or revenge.
- By replacing human combatants, “autonomous weapons could reduce the possibility of suffering and death.”
- In contrast, humans are better at distinguishing “who is a civilian and who is a combatant,” so autonomous weapons systems could be less able to follow an important rule of armed conflict, the “Principle of Distinction.”
- A major rule of international humanitarian law is that when civilians are killed, someone must be able to be held responsible. Yet assigning such responsibility would be difficult or impossible with fully autonomous weapons systems.
- Reaching international agreements to limit even some kinds of autonomous weapons may prove an impossible challenge: drawing a bright “red line” is difficult enough, and achieving agreement and reasonable enforcement is harder still.
Summary by Stephen Hicks, 2020