15 April 2014

Unman Your Battle Stations!



By Commander Michael J. Dobbs, U.S. Navy (Retired)
2013 Unmanned Maritime Systems Essay Contest Winner

Unmanned maritime systems will decisively alter the future of warfare, and the naval profession must adapt to the military and cultural changes under way.

A specter is haunting the descendants of John Paul Jones, Dick O’Kane, and the “fighting” Sullivans. It is animated by Moore’s Law and abetted by practitioners of the American way of war who, as one U.S. Navy recruiting video boasted, now work feverishly “to unman the front lines” in order to reduce the human, monetary, and political cost of war. 1 The meteoric rise of unmanned systems over the last two decades has been breathtaking and presages additional expansion in the number, variety, and autonomy of intelligent fighting machines. Accompanying this increase are military and cultural implications as well as challenges that require immediate attention.

The Rise of the Machines

Unmanned systems have expanded exponentially. From 2001 to 2008, unmanned aircraft systems executed 500,000 flight hours, and unmanned ground vehicles conducted 30,000 missions. 2 Although unmanned maritime systems (UMS) got off to a relatively slow start, the Navy is now rapidly developing fleets of unmanned surface and undersea vehicles, in addition to carrier-based unmanned aircraft (See Table 1). One ambitious project is the Large Diameter Unmanned Undersea Vehicle (LDUUV), which will supplement nuclear submarines in covert mining; intelligence, surveillance, and reconnaissance (ISR); and antisubmarine warfare (ASW). 3

There is clear evidence that unmanned systems will continue to replace humans at the tactical level of warfare (See Figure 1). Systems such as the Phalanx close-in weapon system, the Mk 48 ADCAP torpedo, and the Aegis weapon system already operate at the highest level of the tactical observe-orient-decide-act (OODA) loop, including the use of lethal force. Developed by Colonel John Boyd to describe the decision cycles used by individuals and organizations, the OODA loop is one way to look at the sequence, or hierarchy, of tasks leading up to the ultimate task of acting. 5
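
To make the concept concrete, the sketch below reduces a tactical OODA cycle to a few lines of Python. It is purely illustrative: the class, the confidence threshold, and the notion of a “track” are hypothetical stand-ins, not drawn from Phalanx, the Mk 48, Aegis, or any other fielded system.

from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    classification: str   # e.g., "hostile", "unknown", "friendly"
    confidence: float     # 0.0-1.0 confidence assigned during the orient step

class OodaAgent:
    def __init__(self, engage_threshold: float = 0.95):
        self.engage_threshold = engage_threshold  # hypothetical doctrine setting

    def observe(self, sensor_feed):
        # Ingest raw contacts (placeholder: the feed is already a list of Tracks).
        return list(sensor_feed)

    def orient(self, tracks):
        # Fuse and classify contacts; this toy version passes them through unchanged.
        return tracks

    def decide(self, tracks):
        # Nominate only high-confidence hostile tracks for engagement.
        return [t for t in tracks
                if t.classification == "hostile" and t.confidence >= self.engage_threshold]

    def act(self, engagements):
        # Stand-in for weapon employment.
        for t in engagements:
            print(f"Engaging track {t.track_id}")

agent = OodaAgent()
agent.act(agent.decide(agent.orient(agent.observe([Track(7, "hostile", 0.98)]))))

Even in this toy form, the crux is visible: the decide and act steps at the top of the loop are already running without a human inside them.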

The Navy is moving smartly toward increasing the role of unmanned systems in aviation, and trends in technology and real-world operations suggest the slow extinction of most naval aviators. Although the challenge of replacing a nuclear-powered attack submarine or guided-missile destroyer crew is a few orders of magnitude more difficult than replacing aviators in the cockpit, analytical and decision support systems are displacing some crew members in data-driven areas such as sensor employment, target detection, classification, and the generation of accurate fire-control solutions.

Thinking machines already support warfare at the operational level. Automated planning tools have dramatically increased the pace of strike operations while coordinating the employment of assets from cruise missiles to light and heavy aircraft. At the theater or operational level, a variety of automated tools are also being used to manage ASW (e.g., Undersea Warfare Decision Support System [USW-DSS]), including “recommending” where ASW-capable platforms will be placed and how they will conduct their searches. 6

Interestingly, there is evidence of an emerging phenomenon in which combat professionals come to trust and rely on the ultimate wisdom of automated decision tools more than might be expected. Even when men have the ability to veto or override an automated weapon system’s decision, they tend to be loath to do so. Such was the case in 1988 when the USS Vincennes (CG-49), a Ticonderoga-class cruiser, mistakenly shot down a civilian Iranian airliner despite the crew’s significant concerns that the automated air-defense system had inaccurately classified a radar contact. 7

Should mankind draw bright lines regarding what thinking machines should not do? It is certainly tempting to exclude them from the strategic level to form a firebreak against the “Skynet” scenario from the movie The Terminator, in which the fictitious artificial-intelligence system assumes the role of “National Command Authority” and dedicates itself to the eradication of the human race. Respecting such prohibitions will not be easy as machines are created with memories and reasoning power superior to those of the National Security Council and the Joint Chiefs of Staff. Human decision-making can be flawed, and a variety of studies have described how cognitive weaknesses such as cultural bias, a tendency toward “satisficing,” and groupthink limit or bind human rationality. 8 With that in mind, what President or Secretary of Defense wouldn’t want to listen to the “advice” of a highly intelligent machine capable of running hundreds of simulations of a variety of response options against a model that factors in thousands of key quantitative and qualitative variables and has been “validated” against scores of historical scenarios?


A Post-Heroic Age?

It has been almost 170 years since Commodore Thomas ap Catesby Jones of the U.S. Pacific Squadron took the initiative to capture Monterey, California, based on (false) reports of war between Mexico and the United States. This move to thwart a probable attempt by Britain to seize California was bold but should not be considered an extreme outlier. 10 U.S. naval officers have historically wielded tremendous autonomy in foreign policy, from diplomacy to the initiation and conduct of hostilities. One cultural change from the ascendancy of unmanned systems, however, will be the continued erosion of the naval commander’s initiative and autonomy.

That tradition at sea has been fading over the last few decades, and naval commanders can already empathize with an Army battalion commander in Iraq who reported “he had twelve stars’ worth of generals . . . tell him where to position his units during a battle.” 11 This decline will accelerate as unmanned ISR systems and “decision aids” proliferate. The reason is simple: With more unmanned sensors operating beyond the reach of platform sensors, admirals thousands of miles away will have situational awareness as good as or better than commanders at the “pointy end of the spear,” and more intelligent machines will increasingly tell on-scene commanders the “right answer.”

The proliferation of unmanned systems will also dilute the naval profession’s culture of accountability. When an unmanned system malfunctions and perhaps even “violates” the laws of armed conflict, who is to be held accountable? The commander would rightfully be held accountable if his command had not completed preventive maintenance and that oversight contributed to the malfunction. Accountability, though, becomes problematic in the age of thinking machines, when software engineers and acquisition program managers are responsible for developing and “training” the unmanned system. 12

One significant driver of cultural change will be the eclipse of naval aviators, at least as we know them from the film Top Gun or the heroic Torpedo Squadron 8 at the Battle of Midway. The realities of physics (airframes can withstand more g-force than humans) and advances in computer-flown aircraft, which will eventually sequence through the OODA loop better than pilots, will lead to the gradual disappearance of manned aircraft from carrier flight decks. A Navy led more by nuclear-trained submariners and surface warfare officers will arguably be more risk-averse than one strongly influenced by people who self-select into aviation for their willingness to embrace risk. 13

Finally, unmanned systems will further displace sailors from violence, both physically and psychologically. Of course, a variety of older technologies from 16-inch naval guns to cruise missiles have already moved the naval profession toward an age of post-heroic warfare: We are light-years away from wooden sailing ships, Marine sharpshooters in the riggings, and cannon broadsides exchanged within a few hundred yards of enemy men-of-war. Consequently, “By removing warriors completely from risk and fear, unmanned systems create the first complete break in the ancient connection that defines warriors and their soldierly values.” 14

In this post-heroic age, the ideal sailor’s traits will evolve. Whether for selection to boot camp or the U.S. Naval Academy, there will be a shift away from physical fitness and courage toward an even greater emphasis on intelligence. As military professionals retrench into the operational and strategic niches of warfare, the ability to process vast amounts of information and make complex decisions in ambiguous situations will be more important than how fast an ROTC candidate can run a mile or his or her hand-eye coordination.

A Stitch in Time

The rise of UMS is happening quickly, and the pace will accelerate. As it advances unmanned technologies, the U.S. Navy will have cause for significant regret if it does not also tackle ancillary issues that may prove more challenging than improving artificial intelligence and power sources.

Our best warriors should guide machines on their journey toward autonomy, and human engagement will vary throughout this process. The spiral development of unmanned systems will require that top tactical and operational practitioners be assigned as long-term “trainer-developers” who remain “on-the-loop” to observe systems navigating the OODA loop and to evaluate the quality of the decisions being made. These mentors would then help refine algorithms and identify gaps in sensor inputs. Just as Colonel Boyd’s first-hand knowledge and experience in combat aviation were used extensively to develop the F-16 and other military aircraft, the artificial intelligence of future autonomous unmanned systems should reflect the mental processes and even “instincts” of today’s superior naval warriors. Of course, humans will also be required “in” and “on-the-loop” for quite some time to ensure unmanned systems operate safely and in accordance with the laws of war. In other words, man “in-the-loop” refers to a human controlling much if not all of the action of an unmanned system, while “on-the-loop” refers to much looser monitoring in which humans retain veto power over the actions of unmanned systems. Man “off-the-loop” refers to fully autonomous unmanned systems that act on their own agency, which could include the use of lethal force.
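
As a rough sketch of those three modes, and assuming nothing beyond the definitions above, the gate below shows how the same proposed action is handled differently depending on the level of human oversight; the enum and function names are invented purely for illustration.

from enum import Enum, auto

class OversightMode(Enum):
    IN_THE_LOOP = auto()   # a human directs each action
    ON_THE_LOOP = auto()   # the system acts unless a human vetoes in time
    OFF_THE_LOOP = auto()  # the system acts on its own agency

def action_authorized(mode, human_approved=False, human_vetoed=False):
    if mode is OversightMode.IN_THE_LOOP:
        return human_approved        # nothing happens without explicit approval
    if mode is OversightMode.ON_THE_LOOP:
        return not human_vetoed      # proceeds unless the supervisor intervenes
    return True                      # OFF_THE_LOOP: fully autonomous

# An on-the-loop system proceeds because no veto was received.
print(action_authorized(OversightMode.ON_THE_LOOP))  # True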

Acceptance of unmanned systems will be predicated on their safety and reliability. As observed during the development of computer-driven automobiles and manufacturing robots, machines may need to prove themselves better than humans. 15 Meeting information-exchange requirements (IERs) will be crucial for building that trust. Humans will stay “on-the-loop” for decades, but even for autonomous systems the IERs for transmitting system status, battle damage, and full-motion video, and then for pushing intelligence, software updates, and tasking, will be tremendous. Meeting these IERs will further strain over-tasked communications capabilities.

Investments in unmanned systems must be synchronized with future command, control, and communication (C3) architectures lest bandwidth shortages compromise their safe and effective operation. C3 architectures must also be hardened against anti-access/area-denial capabilities designed to degrade or deny U.S. access to space, cyberspace, and the radio-frequency spectrum. Serious consideration should be given to reviving laser-communication development to increase connectivity for all undersea vehicles. Beneath the waves, the processes currently used to deconflict submarines and other undersea vehicles remain largely manual. Automated systems will be necessary not only to form a “blue” common operating picture, but also to rapidly modify water-space assignments to support operational flexibility while still making sure a nuclear-powered attack submarine and an LDUUV don’t go “bump in the night.”
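
A minimal sketch of what an automated water-space deconfliction check might test follows, assuming a toy model in which an assignment is just a named operating box, a depth band, and a time window; the fields and example values are hypothetical and omit nearly everything a real water-space management process would consider.

from dataclasses import dataclass

@dataclass
class WaterSpaceAssignment:
    vehicle: str
    area: str            # named operating box
    min_depth_ft: float
    max_depth_ft: float
    start_hour: float    # hours into the operation
    end_hour: float

def conflicts(a, b):
    # Two assignments conflict if they share an area and their depth bands and time windows overlap.
    same_area = a.area == b.area
    depth_overlap = a.min_depth_ft <= b.max_depth_ft and b.min_depth_ft <= a.max_depth_ft
    time_overlap = a.start_hour <= b.end_hour and b.start_hour <= a.end_hour
    return same_area and depth_overlap and time_overlap

ssn = WaterSpaceAssignment("SSN", "Box Alpha", 150, 600, 0, 12)
lduuv = WaterSpaceAssignment("LDUUV", "Box Alpha", 100, 200, 6, 18)
print(conflicts(ssn, lduuv))  # True: same box, overlapping depths and times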

Finally, strong leadership is required to implement the dramatic cultural change associated with the expanded roles and autonomy of UMS. Setting quotas will not be enough. Navy leadership must articulate a compelling vision of why certain naval “unions” will be disrupted and how UMS will be integrated at a methodical and manageable pace. Military change of this magnitude implies that the Chief of Naval Operations will have numerous turf battles to referee over the coming decades.

A maxim often attributed to Charles Darwin holds that “It is not the strongest of the species that survives, nor the most intelligent. It is the one most adaptable to change.” The naval profession and culture will have to adapt to the dogged march of unmanned systems toward increasingly expansive and autonomous roles. As sailors are displaced at the tactical level, they will be free to dedicate their professional lives to adding value at the operational and strategic levels of warfare, where machines are probably several decades away from challenging man’s intelligence. While investing in unmanned technologies, the Navy must also have the foresight to ensure that supporting infrastructures and the naval culture are ready for the rise of the machines.

1. See “The Future of Unmanned Naval Technologies: A Discussion with Admiral Gary Roughead,” Brookings Institution 21st Century Defense Initiative, 2 November 2009, www.brookings.edu/~/media/events/2009/11/02%20naval%20technologies/20091...

2. Department of Defense, Unmanned Systems Integrated Roadmap: FY 2009–2034, 20 April 2009, p. 8.

3. Captain Dwaine Ashton, “Unmanned Maritime Systems—Status,” presentation to the 2013 Sea-Air-Space Symposium, 8 April 2013, www.navsea.navy.mil/Media/SAS2013/5.%20Umanned%20Maritime%20Systems.pdf, slide 3.

4. Department of Defense, Unmanned Systems Integrated Roadmap: FY 2009–2034, 20 April 2009, p. 13.

5. Robert Coram, Boyd: The Fighter Pilot Who Changed the Art of War (Boston: Little, Brown and Company, 2002), 334–339.

6. P. W. Singer, Wired for War: The Robotics Revolution and Conflict in the 21st Century (New York: The Penguin Press, 2009), 124–125.

7. Singer, 124–125.

8. See Graham T. Allison and Philip Zelikow, Essence of Decision: Explaining the Cuban Missile Crisis, 2nd ed. (New York: Longman, 1999), and Christopher Clark, The Sleepwalkers: How Europe Went to War in 1914 (Harper, 2013).

9. PLUS (Persistent Littoral Undersea Surveillance), CFn (Composeable FORCEnet), BAMS (Broad Area Maritime Surveillance), USW-DSS (Undersea Warfare Decision Support System), X-47B (Unmanned Combat Air System Demonstration) and GCCS-M (Global Command and Control Systems-Maritime).

10. David Foster Long, Gold Braid and Foreign Relations: Diplomatic Activities of U.S. Naval Officers, 1798–1883 (Naval Institute Press, 1988), 99–101.

11. Singer, 349.

12. Nidhi Subbaraman, “Activists, UN Put ‘Killer Robots’ in the Crosshairs,” NBC News Future Tech, 29 April 2013, www.nbcnews.com/technology/activists-un-put-killer-robots-crosshairs-6C9...

13. See John Lehman, “Is Naval Aviation Culture Dead?” U.S. Naval Institute Proceedings, September 2011.

14. Singer, 332.

15. Patrick Lin, “The Ethics of Autonomous Cars,” The Atlantic, 8 October 2013, www.theatlantic.com/technology/archive/2013/10/the-ethics-of-autonomous-...

Commander Dobbs is a 1984 graduate of the U.S. Naval Academy and served on submarines for 22 years. He retired in 2006 after commanding the USS Pennsylvania (SSBN-735). He is a frequent contributor to Proceedings and lectures at the University of California, San Diego and the University of San Diego.
