21 October 2018

To Build Infantry for the Future, Look First to the Past

Edward G. Miller

“I remember thinking that it would be simpler, and more effective, to shoot [replacements] in the area where they detrucked, than to have to try to bring them back from where they would be killed and bury them.”

—Ernest Hemingway

Papa Hemingway made his observation about infantry replacements in a novel about WWII, but he was on point. Presumptions about the nature of future close combat may be correct—support from drones, or swarms of drones; battlefield 3D printing of spare parts; mechanical augmentation to improve strength and endurance; ruggedized mixed/virtual reality eyewear with real-time intelligence feeds; and so on. On the other hand, there is no reason to think that infantrymen will sustain relatively fewer casualties in the future than in the past. As George Marshall said, “We expect too much of machines.”

That’s why, earlier this year, the secretary of defense established the Close Combat Lethality Task Force (CCLTF). Charged with maximizing the “lethality, survivability, resiliency, and readiness” of the basic tactical ground combat unit—the infantry squad—the CCLTF will face both new and old challenges as it develops ways for the Army and Marine Corps to gain more power from the national investment in the armed forces. But the need for lethal ground combat forces, and the recognition of that need, is not new. There are striking parallels in particular between the present and the situation the Army faced in World War II. Media reports on the CCLTF have noted that infantrymen need a specialized skill set; the military drew the same conclusion in World War II. Incentives for the infantry were a priority then, too: hence the creation of the Bronze Star Medal and the Combat Infantryman Badge. Combat pay? Same thing. Attracting the best and most mentally and physically fit soldiers? Another concern the Army had decades ago. Replacements and casualty projections? Improvements in training? It was all there.

DoD has taken the first step in redeveloping its core capability. Yet technology, talent management, and hard training cannot overcome bureaucracy and culture wars. DoD must not underestimate the structural hindrances to organizing and deploying a lethal and tough close combat force. Contrary to what we may think today, the Army of World War II did not have first call on every national resource. To get an idea of the potential for organizational churn, we need only look at what happened nearly eight decades ago.

During World War II, Army branches, including the Air Forces, competed for skilled manpower despite combat experience showing the danger of manning infantry units with soldiers who were below the Army average in physical fitness. Less fit soldiers, who were usually draftees, ended up carrying a rifle. Some influential leaders also believed technological advances would reduce battlefield losses. Replacement projections based on this assumption nearly caused the Army to run out of infantry replacements in the European Theater in late 1944. Volunteers (including the author’s father) gravitated to the Army Air Forces and the technical services. There was a national debate for some time over whether to draft men as young as eighteen, or whether to put so-called “pre-Pearl Harbor fathers” into the infantry.

The Army had no standardized scientific test to identify leadership potential. Nor did it have a tool with which to identify people of high intellectual capacity and hold on to them long enough to assign them to a unit that really needed them. It was easier to sort people by age, physical condition, and their score on the Army General Classification Test (AGCT)—a measure of learning ability, not intelligence. While about 7 percent of the national pool of prospects could score highly enough to be designated AGCT Category I, such men made up only about 4 percent of the Army in 1943. The Army Air Forces received preferential consideration for these higher-scoring (AGCT Category I and II) troops. Leaders of the technical service branches (medical corps, signal, quartermaster, ordnance, etc.) fought hard for their share of quality enlistees and officers.

The Army traditionally emphasized the technical and vocational benefits of service. This created unreasonable expectations of a match between personal preference and the needs of the Army. Doctrine writers acknowledged that the infantry was the “basic arm.” Yet the infantry, which should have had its share of the most fit and self-reliant troops, received a significant proportion of men with lower AGCT scores. Urgency and habit led the Army to put more importance on linking test scores to civilian skills, and combat infantryman was most assuredly not a civilian occupation. Just three months before the Army entered the ground war in North Africa in November 1942, 89.4 percent of soldiers in the finance corps were in the top two of the five AGCT classification categories. Just 27.4 percent of infantrymen were in these groups. A mid-war sample of twelve thousand soldiers with combat-related jobs revealed weight, intelligence, and education below the Army average. They were also over a half-inch shorter than the average height for all soldiers. Even the infantrymen were dissatisfied—only 11 percent of them in a 1943 survey expressed a preference for their branch.

It was evidently a surprise to some that combat soldiers required a level of specialized instruction equal to that of any technician. They had to know how to use a dozen weapons; recognize friendly and enemy aircraft, vehicles, and equipment at a moment’s notice; use captured enemy equipment; render first aid; use a radio; and read a map and navigate cross-country under extreme conditions of physical and mental stress. The point remains valid today.

The Army Specialized Training Program (ASTP) enabled college students to defer their service while they trained for medical, engineering, or other “STEM” fields. This was another drain on the pool of potential high-AGCT soldiers available to the broader force, because the program promised graduates the opportunity to attend Officer Candidate School, though few actually went. Despite high losses in combat units, the program continued into 1944, when the Army faced disbanding ten divisions and twenty-nine separate battalions and shipping their soldiers overseas as replacements. The main beneficiaries of ASTP were the more than thirty divisions that received an average of 1,500 former ASTP students each. As for Officer Candidate School in general, those who completed the program did not necessarily return to a combat arms unit. This is an example of a well-intentioned program working against wartime reality.

Allocating even fit soldiers was problematic. There were two broad categories of fitness for service—limited and general—though the Army abolished the limited-service category on the grounds that the term carried a stigma. The unintended result was that nearly all recruits were qualified (on paper) for general service. Tens of thousands of soldiers in relatively poor physical condition became instantly fit for worldwide assignment. Since it was competing with the other services and the civilian sector, the War Department eventually prohibited the discharge of anyone who could do useful work. From mid-1944, medical personnel used a more comprehensive “profile” system to evaluate general stamina, fitness, and emotional stability (this was the basis of today’s “PULHES” system), but the damage to the ground combat units had been done. An office manager might be as likely as a lumberjack to end up in the infantry—and that office manager might not be a particularly good physical specimen.

Until it could digest the lessons of combat, the Army had only theory on which to base its training. It eventually incorporated lessons from overseas theaters and developed programs that included topics like exposure to overhead fire, the movement of tanks over foxholes, and live-fire squad maneuver. Unfortunately, many improvements came after the bulk of the units had deployed. There were significant problems and poor management, but it was impossible to expect simultaneous creation, production, and deployment of a highly proficient force where none had existed before.

Planners presumed that a “technology war” would lead to higher casualties in armor and cavalry organizations, not the low-tech infantry. Today’s focus on improving close combat units has brought attention to historical attrition, but the figures bear repeating. Infantrymen accounted for 142,962 of the 191,701 deaths among Army ground battle casualties during World War II, and for 661,059 of 820,877 Army ground battle casualties of all types. Then as now, the nation had a limited supply of fit prospects with which to fill the combat units. As casualties grew beyond projections, planners faced the prospect of forcing less fit soldiers into combat units.

The authorized strength of an infantry division was about 14,200 soldiers. In the Pacific, the 96th Infantry Division had nearly 17,000 battle casualties, and the 7th sustained 15,179, with well over 100 percent turnover in the rifle units. In the Palau Islands fighting, infantry units sustained 83.9 percent of all casualties. A third of squad leaders and 31 percent of platoon sergeants were casualties. In the 81st Infantry Division, 43 percent of second lieutenants and 35.5 percent of first lieutenants were battle casualties.

Five infantry divisions in Europe sustained 176 percent battle casualties in the eleven months after D-Day. Ground casualties in the European Theater of Operations (ETO) between September 1944 and January 1945 numbered about three thousand soldiers per day. The shortage of combat replacements in the ETO eventually led to the assignment of most replacements, regardless of physical condition, to ground forces units, creating the cycle Ernest Hemingway observed as a correspondent. Experience in Italy indicated that rifle companies would have a nearly complete turnover of both officers and enlisted men every two to three months. Infantry divisions lost as casualties the equivalent of a full complement of second lieutenants every eighty-eight days. To create a sense of esprit in the infantry, Army Ground Forces leadership in mid-war proposed a “Ground Medal,” a “Fighter” rank, a specialty badge, and “Fighting Pay.” After the usual bureaucratic wrangling, Gen. George C. Marshall approved the Bronze Star Medal and the Combat Infantryman Badge; combat pay followed.

Nearly 90 percent of GIs responding to one WWII survey had seen a close friend killed or wounded in action, and 83 percent had seen someone “crack up” in battle, either because of something that happened to a soldier or after a GI killed someone. High junior officer losses forced unprepared lieutenants into company command. Some did superbly, while others held on long enough to get men killed through ill-advised tactics. It was also common for nineteen- or twenty-year-olds to begin a battle as privates and find themselves leading platoons within a few days or even hours.

Battle had many variables: terrain, weather, adequacy of supply, competence of leadership, intensity of combat, morale, casualties, enemy resistance, fatigue, or even the gap between estimated enemy capabilities and actual enemy performance. Men had to adjust quickly to killing, uncertainty, confusion, threats to life and limb, discomfort, deprivation of sexual and social stimuli, the sight and sound of wounded or dead men, and instincts for personal safety. Rarely did a soldier face any of these singly. Over a quarter of 1,766 enlisted men surveyed in Italy reported feeling that it was only a matter of time before they, too, would be hit. “All the men have hope of getting back, but most of the hope is that you’ll get hit some place that won’t kill you,” said one combat soldier. Another said, “The longer we survived, the more nervous about our own future we became. By the time I was wounded on 8 January 1945, I was ready for a shoulder wound and an escape from the weather and combat.”

Infantryman Lester Atwell said, “In the end you crack up or you get killed; another guy comes up and takes your place and it starts all over again. The only thing I have to hope for . . . is to pick up a wound that’s not too serious. I know that’s the only way I’ll ever get out of this alive.”

Initiatives to improve Army and Marine infantry must also include dealing with what we now call PTSD and what was known in WWII as combat exhaustion (CE)—medical personnel evidently used the term “exhaustion” because it carried less stigma than “breakdown.” Modern approaches to treating PTSD will not eliminate the potentially significant reduction in combat power when such casualties are concentrated in a rather small infantry force, especially one conducting semi-isolated and distributed operations. CE was not a “reportable” condition in all theaters, but it certainly caused most of the ETO’s nearly 102,000 “neuro-psychiatric” casualties during 1944–45. During the hard fighting that autumn and winter, such casualties accounted for between 9 and 25 percent of monthly hospital admissions. New soldiers entering battle sometimes exhibited symptoms like vomiting, shaking, trembling, cold sweat, or loss of bowel or bladder control. Experienced soldiers might show irritability, loss of interest, decreased efficiency, and carelessness. The noticeable drop in the incidence of CE during the advance toward Germany in the summer of 1944 was likely attributable to the expectation of imminent victory. The savage combat of fall and winter, however, hit “older” soldiers especially hard, because many believed they had already done their part. They knew that under the “ARFORGEN” system of the time, the only way out of combat was through wounds, victory, or death—and many men believed death would come first.

The CCLTF and any successor programs have a significant amount of work to do in a short time. Technology and doctrine are just two elements of a set of interdependent factors composing a system whose sole purpose is to deploy and sustain a single weapon—the most effective infantryman possible. A large obstacle will be service culture seeking to preserve existing processes, prerogatives, money, and personnel. The rush to show return on investment can also work against such initiatives. Leaders must be willing to overcome cultural and programmatic resistance, and to make people understand the professional consequences of standing in the way. But there is a roadmap—a potentially difficult one, as history indicates.

The first step is to demand quality personnel. A War Department staff officer said in 1942: “The accomplishment of the will of the commander depends, in final analysis, upon the ability of subordinates to make the proper decisions in unpredictable situations on the battlefield. These decisions require sound judgment and initiative—qualities which must be carefully developed and fostered in the training of every individual.”
