22 September 2018

Balancing Effectiveness and Ethics in Future Autonomous Weapons

Doug Livermore

Competing visions of future warfare invariably include some version of robotic fighting machines operating either alongside, or in place of, humans. Each of the world's major powers is pursuing the development of such automated killers, and each looks to grant its robotic minions varying degrees of autonomy. The decisions made concerning the future employment of such systems are driving today's policymaking and research and development efforts. Intent on keeping humans in the decision-making process for applying lethal force, the U.S. has focused its efforts on "autonomous wingmen" and the concept of "human–machine teaming," an approach likely to yield operational systems within the next two years. By comparison, the Russian Federation intends eventually to delegate decisions on the use of lethal force to advanced artificial intelligence, granting its robots nearly complete autonomy. Given the disparity in these nations' approaches to robotic warfare, it is likely that some future battlefield will witness the true test of these competing visions.


The United States and the Russian Federation are pursuing vastly different approaches to the policies governing the development of robotic fighting vehicles and their eventual operational use. The American government has made it a matter of official policy that humans will always make the final decisions regarding the use of lethal force, even by otherwise autonomous weapon systems.[i] As such, American research and development efforts regarding autonomous weapon systems require robust command-and-control communication links between the weapons and their human controllers. These communication links present significant vulnerabilities for future American autonomous weapon systems.[ii] In contrast, the Russian Federation is enthusiastically pursuing fully autonomous systems that can identify and engage targets completely independent of human control.[iii] While requiring less sophisticated command-and-control systems, the Russian approach creates significant moral and legal vulnerabilities, given the impossibility of holding autonomous systems responsible for lethal acts that violate international law.[iv] In both cases, national-level policy decisions directly influence the development of each country's autonomous weapon systems. Since both nations' robotic weapon systems are still in development, it remains to be seen which approach will prove more advantageous on the future battlefield.

The American approach to robotic weapon systems relies heavily on human control over the application of lethal force. As such, U.S. development efforts focus on the concept of "autonomous wingmen" that respond directly to, and support, manned American platforms and troops. The U.S. Air Force Research Laboratory, under the “Have Raider” program, has already test-flown a heavily modified and largely autonomous F-16 Fighting Falcon prototype aircraft that can take off, maintain formation with a manned aircraft, and employ weapons at the direction of a human pilot.[v] This developmental system allows the U.S. to maximize its airborne firepower despite longstanding limits on the number of pilots available for manned platforms. The U.S. Army, under its own “Wingman” program, is also preparing to undertake initial operational testing of a heavily modified, autonomous High Mobility Multipurpose Wheeled Vehicle that can engage targets with machine guns at the direction of American troops.[vi] The Army “Wingman” system could provide heavy supporting firepower in a package that is more expendable than living American troops. However, both of these developmental systems will require significant command-and-control communication capabilities that are vulnerable to hacking, disruption, and even co-optation. At the least, an adversary might exploit these vulnerabilities to render American autonomous systems inoperable; in a worst-case scenario, the systems might be turned against their American controllers.[vii] In short, U.S. policy and developmental efforts for robotic fighting systems include significant legal and ethical safeguards that might limit their battlefield utility.

As a matter of official policy, the Russian Federation has stated its intent to field robotic fighting vehicles with full autonomy to exercise lethal force. This approach maximizes battlefield flexibility for Russian forces and may well minimize the negative impact of Russia's growing military manpower shortage, though it exposes Russia to significant ethical and legal vulnerabilities.[viii] Russia's new line of Armata universal armored fighting vehicles was designed specifically to be upgraded for autonomous use at a later date, creating the potential for a future in which battalions of autonomous armored vehicles range across the battlefield sowing death and destruction without Russian casualties.[ix] What remains to be answered is who would be held accountable for any violations of the laws of war committed by such fully autonomous systems. Perhaps more worrying are Russia's publicly stated plans to develop an autonomous undersea nuclear weapon, the “Status-6” Oceanic Multipurpose System, with the proposed mission of delivering a 100-megaton nuclear device to American ports.[x] In March 2018, the Russian government officially renamed the “Status-6” as “Poseidon,”[xi] and Russian President Vladimir Putin has suggested that his country would use the weapon only as a "last-ditch" measure to defend the Russian Federation. Admittedly, many experts believe that the “Poseidon” may be purely aspirational or simply a deception measure designed to vex Western military strategists. If built as described, however, this unmanned, autonomous weapon system would travel at high speed and extreme depth to bypass all current American defenses, and the detonation of its proposed nuclear payload could create an irradiated tidal wave intended to render U.S. ports and coastal regions uninhabitable for decades. The “Poseidon” would conduct this attack without outside intervention following a human decision to launch.[xii] In terms of flexibility and utility, the proposed Russian systems would largely escape the command-and-control vulnerabilities that will inherently plague American fighting robots. However, this approach raises thorny ethical and legal questions regarding the autonomous application of lethal effects against targets. Given recent controversial Russian military activity in Ukraine and Syria, it is possible that such questions do not much trouble the Russian government and will not prove a deterrent in the future.

Fighting robots will absolutely play a critical role on future battlefields. Every major world power is pursuing some variation on the theme of autonomous fighting vehicles, though their differing policies and resultant developmental strategies suggest that the reality will be some hybrid of their respective visions. American insistence on human control of lethal effects ensures continued reliance on "human–machine teaming," which requires robust communication infrastructure that is vulnerable to manipulation. By comparison, Russia's evident comfort with autonomous lethal decision-making will lead to more agile systems that nevertheless create significant moral and legal hazards with which the Russian Federation will need to contend. While the vision of future combat remains very much in flux, the inevitability of fighting robots on future battlefields demands policy and developmental consideration today.

End Notes

[i] Department of Defense Directive 3000.09, “Autonomy in Weapon Systems,” U.S. Department of Defense, May 8, 2017, accessed September 19, 2018, http://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf.

[ii] Nikolas Shashok, “Analysis of Vulnerabilities in Modern Unmanned Aircraft Systems,” December 13, 2017, accessed September 19, 2018, https://pdfs.semanticscholar.org/6a62/83afd110991828e30cac142d43c444fce5ed.pdf.

[iii] Kelsey Atherton, “Russia prepares for a future of making autonomous weapons,” C4ISRNET, June 11, 2018, accessed September 19, 2018, https://www.c4isrnet.com/electronic-warfare/2018/06/11/russia-prepares-for-a-future-of-making-autonomous-weapons/.

[iv] Hillary Lamb, “Russia rejects potential UN ‘killer robots’ ban, official statement says,” Engineering and Technology, December 1, 2017, accessed September 19, 2018, https://eandt.theiet.org/content/articles/2017/12/russia-rejects-potential-un-killer-robots-ban-official-statement-says/.

[v] “U.S. Air Force, Lockheed Martin Demonstrate Manned/Unmanned Teaming,” Lockheed Martin, April 10, 2017, accessed September 19, 2018, https://news.lockheedmartin.com/2017-04-10-U-S-Air-Force-Lockheed-Martin-Demonstrate-Manned-Unmanned-Teaming.

[vi] Kyle Mizokami, “The U.S. Army’s Robotic Humvee, the Wingman, Is Learning to Shoot,” Popular Mechanics, February 7, 2018, accessed September 19, 2018, https://www.popularmechanics.com/military/weapons/a16639480/the-us-armys-robotic-humvee-the-wingman-is-learning-to-shoot/.

[vii] Interview with Paul Scharre, “Killer Robots and Autonomous Weapons,” Council on Foreign Relations, June 1, 2018, accessed September 19, 2018, https://www.cfr.org/podcasts/killer-robots-and-autonomous-weapons-paul-scharre.

[viii] Patrick Tucker, “Russia to the United Nations: Don’t Try to Stop Us From Building Killer Robots,” Defense One, November 21, 2017, accessed September 19, 2018, https://www.defenseone.com/technology/2017/11/russia-united-nations-dont-try-stop-us-building-killer-robots/142734/.

[ix] Paul Scharre, “Meet the New Robot Army,” Wall Street Journal, April 11, 2018, accessed September 19, 2018, https://www.wsj.com/articles/meet-the-new-robot-army-1523455200.

[x] Alex Lockie, “Why Putin's new 'doomsday' device is so much more deadly and horrific than a regular nuke,” Business Insider, March 15, 2018, accessed September 19, 2018, https://www.businessinsider.com/putin-doomsday-status-6-nuclear-weapon-2018-3.

[xi] Nicholas Fiorenza, “New Russian weapons named,” Jane’s 360, March 23, 2018, accessed September 19, 2018, https://www.janes.com/article/78819/new-russian-weapons-named.

[xii] Barbara Starr and Zachary Cohen, “US says Russia 'developing' undersea nuclear-armed torpedo,” Cable News Network, February 3, 2018, accessed September 19, 2018, https://www.cnn.com/2018/02/02/politics/pentagon-nuclear-posture-review-russian-drone/index.html.
