24 February 2014

The Future of War (no. 4): We need to protect our personnel from the moral fallout of drone and robotic warfare

FEBRUARY 6, 2014

By Lt. Col. Douglas Pryer, U.S. Army 
Best Defense guest columnist 

Until last year, the Diagnostic and Statistical Manual of Mental Disorders required "actual or threatened death or serious injury, or a threat to the physical integrity of self or others" for a diagnosis of PTSD. How is it that drone operators can suffer PTSD without experiencing physically traumatic events? The answer lies in the concept of "moral injury." 

Dr. Jonathan Shay popularized the term "moral injury" in his book Achilles in Vietnam: Combat Trauma and the Undoing of Character. To Shay, it is the moral component -- the perceived violation of "what's right" -- that causes the most harmful and enduring psychological effects of PTSD-inducing events. Dr. Edward Tick, a psychotherapist who has counseled hundreds of combat veterans, holds a similar view. Tick contends that PTSD is best characterized not as an anxiety disorder but as an identity disorder, one stemming from violations of what you believe you yourself (or others you identify with) should or could have done.

Other mental health practitioners describe moral injury as something distinct from PTSD, which they see as caused by physical reactions to physical stressors. But moral injury, as Dr. Brett Litz and other leading experts in the field recently defined it, results from "perpetrating, failing to prevent, bearing witness to, or learning about acts that transgress deeply held moral beliefs and expectations." Moral injury may follow a physically traumatic event, but it can also follow events that are not physically traumatic at all.

Litz and his colleagues acknowledge that, while PTSD and moral injury share symptoms like "intrusions, avoidance, numbing," other symptoms are unique to moral injury. These include "shame, guilt, demoralization, self-handicapping behaviors (e.g., self-sabotaging relationships), and self-harm (e.g., parasuicidal behaviors)." They also advocate different treatments for moral injury. While PTSD sufferers may be helped by such physical remedies as drugs and eye movement desensitization and reprocessing (EMDR) therapy, those who suffer from moral injury require counseling-based therapies.

There may be no stronger case for the existence of moral injury than that presented by drone operators who, far removed from any physical threat to themselves, suffer symptoms associated with PTSD. Indeed, if moral injury is distinct from PTSD rather than a component of it (as Litz and his colleagues claim), it is reasonable to conclude that such drone operators are being misdiagnosed with PTSD: They actually suffer from moral injury.

The growing case for the existence of moral injury reinforces the idea that what the military now calls the "human domain" of armed conflict is the most crucial aspect of war -- even, paradoxically, of war waged via remote-controlled machines. Lt. Col. (ret.) Pete Fromm, Lt. Col. Kevin Cutright, and I argued in an essay that war is a moral contest started, shaped, and ultimately settled by matters residing within the human heart. A group can be defeated in every measurable way, its immediate capacity to wage war destroyed; but if its members feel that it is right for them to continue to fight, they will find the means to do so. There is often little difference, we argued, between believing that it is right to fight and possessing the will to fight.

I applied this idea to remote-controlled warfare in another essay, "The Rise of the Machines: Why Increasingly 'Perfect' Weapons Help Perpetuate Our Wars and Endanger Our Nation." Here, I argued that our nation needs to pay much closer attention to the moral effects of our use of remote-controlled weapons. The Law of Armed Conflict always lags behind the development of technology, I wrote, and we should take little comfort in the fact that international treaty does not yet clearly prohibit our use of armed robots for transnational strikes in places like Pakistan, Yemen, and Somalia. 

There is, for example, a profound perceptual problem with one nation's warriors remotely killing enemy warriors at no physical risk to themselves. You can argue that this is a stupid perception with no historical basis in the Just War Tradition, but the reason this idea is absent from that tradition is clear: The technology is new. Imagine robots attacking America while the American military is unable to strike back at the humans controlling them, and it becomes easier to appreciate why many foreigners (even many living in allied nations) consider transnational drone strikes to be dishonorable, cowardly, or, worse, inhuman acts.

I concluded in this essay that armed robots should be used only in support of human warriors on the ground, except in those cases when a strong argument can be made to the world that a terrorist represents such a threat to the United States that we have the right to kill him wherever he may be. The alternative, I argued, is to ultimately create more enemies than we eliminate with these weapons -- and to help set the conditions for forever war.

But our use of remote-controlled weapons must also account for the long-term psychological effects of drone operators' perceptions of right and wrong. International and local judgments that a war or tactic is illegitimate or unjust often derive from moral intuitions common to all people -- intuitions that U.S. servicemembers can find by looking within themselves. As Dr. Shay wrote in his conclusion to Odysseus in America: Combat Trauma and the Trials of Homecoming, a book about the inner struggle of ancient and modern warriors to recover from war: "Simply, ethics and justice are preventive psychiatry." In the case of drone operators, care must be taken to ensure that they are convinced it is politically legitimate and morally just to kill their human targets, and that they do not intentionally or negligently kill non-combatants.

In closing, the idea that history is cyclical is an ancient one. Hindus have long believed that this is the case. More recently, the 1969 Zager and Evans hit song, "In the Year 2525," described civilization as advancing technologically only to arrive back at its starting point. This idea is certainly proving true with regard to the psychological impact of war on those who wage it. Soon, just as cavemen did long ago, America's remote-control warriors will be able to look people in the eyes when they kill them.

Unless we turn America's servicemembers into psychopaths devoid of conscience (a cure far worse than the ailment it would inoculate against), we can be sure of one thing: The human cost to our side of this type of warfare will never be as low as technocrats dream it will be.

Lt. Col. Douglas Pryer is a U.S. Army intelligence officer who has won a number of military writing awards and held command and staff positions in the United States, the United Kingdom, Germany, Kosovo, Iraq, and Afghanistan. The views expressed in this article are those of the author and do not reflect the official policy or position of the Department of the Army, the Department of Defense, or the U.S. government.