For nearly four thousand years, the horse was as integral to warfighting as weapons and armor. Then the Second Industrial Revolution led to developments in motorized vehicles and aircraft. These new technologies’ experimental use during World War I hinted at a new style of war. What those hints portended was the subject of intense debate in the years that followed. Some recognized that motorized vehicles would dramatically change the way wars were fought. But many others held on to the deep-seated, millennia-old preference for beasts of burden on the battlefield. Only World War II would eliminate any remaining such preference. And with the horses’ disappearance from their old roles, the military services also divested all associated equine training, breeding, and care in exchange for drivers, welders, and mechanics.
We are again in a period of disruptive technological progress—the Fourth Industrial Revolution—and this current era of persistent low-intensity conflict is arguably an interwar period not unlike that of the 1920s and 1930s. Experts are again intensely debating how new technology—in the form of automation, robotics, and artificial intelligence—will change the character of war. Killer robots, loss of control, and abdication of human responsibility in killing another human are all profound issues for us to confront. However, while national security experts often focus on the sexy applications of these technologies (and public attention is most easily earned by apocalyptic characterizations), too little attention is paid to the more obvious and less complicated issue: those same technologies are also reshaping the workforce. In historical terms, this is akin to debating the merits of using vehicles as mobile firing positions on the battlefield while overlooking the way motorized transport would, for example, dramatically transform logistics.
Even if militaries shun killer robots, nonlethal automated and autonomous systems will still play significant roles in warfare—and will do so very soon. In 1980, then Commandant of the Marine Corps Gen. Robert H. Barrow said, “Amateurs talk about tactics, but professionals study logistics.” And people are the resource with the longest lead time. These new technologies will have impacts on the military labor force as profound as their impacts on direct combat, and we need to adapt to this new reality. Automation will require reducing certain specialties, re-skilling many service members, and creating entirely new job families. Moreover, these trends will have second-order impacts on who we recruit and how we train while increasing dependencies on data and communications.
Automation is already displacing workers in the global workforce, and the research literature expects that displacement to intensify over the next decade. Creating autonomous weapons is both far more difficult and much more contentious than either automating low-skilled jobs or creating narrow artificial intelligence for specific, complex tasks. Unsurprisingly, trends within the civilian workforce show that the tasks that are easiest to automate are highly repetitive, are often manual, and require a low degree of judgment. Conversely, the skills that are most difficult to automate involve applying expertise, interacting with human stakeholders in complex situations, and creativity. One important caveat is that low-skilled jobs performed in unpredictable environments—gardening, plumbing, childcare—will generally see less automation, but many of these roles already pay low wages. This matters because it reflects the second-order fear that automation will hollow out the middle class and lead to less socioeconomic mobility, which has implications for the military.
As automation displaces humans in some areas, workers will need to shift into new fields where difficult-to-automate human skills are of critical importance. Experts predict a growing need for particular categories of workers: caregivers, coaches, subject-matter experts, technicians (e.g., software, data, cloud), and educators. Viewed broadly, these industries fall into two categories: improving human performance and improving technological performance and integration. The increased specialization between humans and computers will lead to efficiency gains—the so-called Third Offset—but other impacts such as the need to re-skill workers are less obvious and receive less consideration in the national security community.
It is perhaps predictable that military leaders are focusing on killer robots and missing the big picture. The last eighteen years of war, conducted on a rotational model that constantly cycles units in and out of combat zones, seem to have only reinforced an emphasis on the near horizon and undercut interest in the critical thinking, imagination, and modernization needed to conceptualize, as an organization, the full range of new technologies’ applications. Moreover, there are dynamics related to the nature of bureaucracies at play. Digital automation requires digital information, and the military has struggled to adopt the modern best practices that would enable much of this transformation—from software systems that do not connect with each other to the continued use of paper and other manual processes. It is no surprise that there are significant barriers to overcome before an institution that struggles to have a user-friendly system for processing official travel can field a lethal autonomous robot.
Regardless of why scholars and senior leaders have so far missed the low-hanging fruit, it is time to begin addressing it. Recruiting and training people for the military occupations of the future will take years and the services can begin reaping the benefits of automation if they start investing now.
Embracing Automation, Re-Skilling Soldiers, and Creating the Jobs of the Future
Again, across the economy, automation and AI will have their earliest impact on roles that fall into three categories: those that are predictable and primarily physical; those related to data collection; and those that involve data processing. Many of the specific fields—food preparation and serving, transportation, construction, and office and administrative support roles, for instance—have direct counterparts in the military. The services should immediately begin phasing out or automating as many of these roles as possible. It’s not sexy, but long before we can create autonomous Terminator-like warbots, we will need autonomous systems that can perform nonlethal manual tasks and use natural language processing to manage personnel paperwork. Increasing automation in these roles will both change how the remaining human workers perform their duties and liberate soldiers who can be re-skilled into new areas.
So where would soldiers displaced by automation be used? In jobs that will see increased demand. The Army is currently undertaking an effort to improve lethality, especially in its close-combat forces. But it should avoid the temptation to treat this as a technological or hardware challenge, when it is fundamentally about people. Improving the way the Army conducts physical fitness training and the way it measures fitness—inarguably central to soldier effectiveness (and lethality)—is a case in point. Much of the debate around the Army’s new physical fitness test focuses on the cost of the equipment and the difficulty of the events. Improving lethality requires shifting soldiers out of roles that are better suited for machines and into health, conditioning, human performance, and counselor or therapist roles.
Special operations units have dedicated athletic trainers and physical therapists, but far more of both will be needed to support direct combat units, along with a wider array of specialists including coaches, counselors, and educators. The services, to varying degrees, already have versions of these roles, and they need to re-invest the savings from automation into them to gain further performance improvements.
Lastly, the services need to accept that these trends are not going away and should embrace the new technology rather than keep it at arm’s length—and begin thinking about the workforce they need to succeed in the future operating environment. They shouldn’t worry about purple-haired techies; they should start creating software soldiers and other tech warriors. Here are two steps that they can take now.
- Tactical unmanned vehicle operators: Before we deploy lethal autonomous swarm drones, the services will need to embrace drones. The Army has added small unmanned aerial vehicles to the equipment for many units, but lags in updating the tactical doctrine for their use. To date, only the Marine Corps has announced plans to place drone operator specialists in every infantry rifle squad. It is unclear if this will simply be an infantryman with additional training (and possibly a skill code) or an entirely new military occupational specialty and there are precedents for both cases. Looking further to the future, the careers developed for these specialists can also help to shape careers and training of swarm drone operators and other unique roles for autonomous ground and undersea vehicles.
- More technical experts: The Air Force launched a pilot program in 2017 to create a software development team within the service. Named Kessel Run, the program has been a runaway success and more than validated the concept. The other services should immediately begin similar pilots, but expand them to also include other skills in data science and cloud computing architecture. The combination of skills will create teams that can rapidly build, test, and deploy scalable IT systems that will empower warfighters and accelerate other innovation projects. Having these technologists within the force may lead to applications that are even more imaginative than what is currently conceived. The services can also reap the lessons learned from talent management within the cyber force to avoid early attrition and further improve talent management for technical experts.
Broader Impacts of Automation
While ethicists and policymakers are debating the impact of autonomy on warfighting, it appears as though relatively little thought has been given to the impact of automation on warfighters. What happens when automation leads to greater specialization between humans and computers?
Experts expect automation to displace more men than women in the civilian workforce, as men disproportionately fill the roles most ready for automation. By contrast, women disproportionately fill civilian roles that are difficult to automate and will see increased demand—jobs that demand critical thinking and emotional intelligence. Similarly, in the military, technological shifts could create greater opportunities for women to leverage the skills that are most valuable in the types of jobs that will increase in demand.
Additionally, and while it’s likely wishful thinking, the confluence of greater demands for technical skills and increased emphasis on human performance may someday shift basic training from its historical function of weeding out weak recruits to instead focus on developing their unique talents (which will improve satisfaction and retention and drive down recruiting churn—a virtuous cycle).
Depending on how the services classify these new technical and human performance specialties, automation may widen the rift between enlisted and officer ranks and may decrease the role of military service as a mechanism for social mobility. Some experts argue that automation is leaving fewer middle-skill roles in the civilian workforce, and it is critical that the military work to avoid this fate. Failure to do so will raise important concerns about “who serves” and have a consequent impact on civil-military relations.
Lastly, military recruitment may get a bump thanks to increased AI, but that’s not necessarily a good thing. The military is struggling both to recruit tech talent and to meet its recruitment goals for a workforce that has not yet been disrupted by automation. While AI will make recruiters more effective, automation will also displace civilian workers in historically strong recruiting areas—the heartland and rural areas. The services will have incentives to accept these recruits rather than make the changes to recruiting that are needed to attract the workforce of tomorrow. Service leaders will need to clearly define (and closely measure) the attributes sought in recruits when balancing efforts to find and attract the best talent with adapting the workforce for tomorrow.
When it comes to preparing for AI-empowered warfare, senior defense leaders and intellectuals seem to be missing the forest for the trees. Current paradigms of future war tend to either reflect incremental innovation (artillery that shoots farther) or the sort of science fiction that is always thirty to forty years away. The middle ground is a defense enterprise that sees the changing nature of warfare rooted in information technology hardware and software, and it is long past time to start doing the hard work of truly modernizing the services. To prepare for this future of war, all of the services will need to transform their workforces in ways that, put simply, they are not even discussing.
Image credit: William B. King, US Army
Truly enjoyed the read! Military intelligence seems like a good place to start on merging AI with human performance in order to create an overall more effective force as it already seems to require a balanced EQ/ IQ soldier. What are your thoughts?
The trap with most automation plans is the tacit assumption that automation will save resources—time, manpower, or both. With intelligence analysis, most software is focused on identifying a pattern or trend. There’s a certain level of goodness to that, so long as those templates are treated as a starting point for analysis, not an end point for decision. There may be a resource savings, but just as easily there may not.
DoD might be better served by foregoing automation of existing *processes*, and looking at existing commercial solutions for *functions*. As the article points out, things like travel reservations, leave management, and supply chain management are not unique. Even some of the harder problems have solutions outside the DoD.
IED, mine disposal, NBC, water purification, Disaster Relief, Wildfire fighting, Air Traffic Control, and other hazardous duties such as Search and Rescue of wounded and downed pilots could benefit from AI automation and drones if done and employed properly.
The U.S. Army needs to reexamine history and see what, when, where, how, who, and why did it encounter failure in these specific operations and if AI and drones could have helped the situation. What private and public disgraces were there? The failed Operations in Benghazi, Somalia, Niger, Afghanistan, and Congo (hunt for Warlord Kony that didn’t capture him) exemplify the need for more autonomous support for soldiers in the field when there was none close by. These small groups of soldiers were left alone to face overwhelming odds, but what failed and why? Was it communications, lack of heavy firepower, lack of air support, lack of overhead cover, lack of armor, lack of mobility, lack of speed, lack of ammo, lack of Force Multipliers, lack of IED detection, lack of cyber-countermeasures, lack of timely intel and maps, or what that affected the mission to tilt to the opposing force? Once identified, then focus on designing AI and drones to combat such issues.
Not all AI revolves around drones for recon and mapping.
One helpful process for AI that the U.S. DoD should definitely utilize is the tracking of exported and given arms, equipment, and money. If arms were sold, given, or air dropped to allied forces and were thus captured or drifted to the enemy, then AI tracking software and devices should be developed to track the locations and even shut down and sabotage the equipment when necessary. No NATO or U.S. forces want to fight against their own equipment and secrets. The Japanese F-35 that crashed in the Pacific is one good example of having AI render the secrets useless to the enemy, or a drone to sabotage the equipment.
Ensuring safeguards against various threats is relatively easier in stationary conditions, since the area can be cordoned off to prevent any mishap. But securing movement along a road, even one effectively cordoned off, can prove quite a daunting task. In the latter case, the priority is not only to provide superior safety but also to ensure utmost secrecy, or the plot could be revealed and a valuable life exposed to threats.