Current and projected autonomous technologies that leverage artificial intelligence (AI) will have profound effects on the role humans play in warfare. These sophisticated machines that seek to overmatch human-level capacities and capabilities will change the anthropological dynamics of the battlefield. From reducing the number of administrative and logistics personnel performing back-office activities to augmenting infantry squads with autonomous systems, AI technologies are poised to emerge as a force multiplier.
Autonomous technologies will usher in a new character of warfare that places military operators further from direct battlefield experiences. Some senior military leaders have opined that robots could replace up to 25 percent of combat soldiers by 2030. While the U.S. military cannot ignore the benefits of autonomous machines and their military applications, their adoption raises questions about how the services can protect the rich heritage that has produced generations of great military leaders and inspired tomorrow’s fighting forces.
What We Gain
Warfighters on tomorrow’s battlefields are predicted to have autonomous systems that will provide unprecedented capabilities. From technologies that will autonomously find and destroy their targets to autonomous logistics convoys, such advances in warfighting systems can provide an advantage in force protection.1 Military equipment outfitted with AI and connected to larger networks can provide granular measurements and tracking for feedback and optimization. While these technologies provide extraordinary capabilities, they also can fill shortfalls the services face in human capital.
Recent estimates predict the potential pool of eligible military recruits may become inadequate because of rising incidence of obesity, criminal records, and insufficient education. The services could replace various military occupational specialties with autonomous military systems enabled by AI. Hard-to-fill positions in fields such as cybersecurity, as well as more routine vehicle operations, already are seeing a dramatic takeover by autonomous technologies.2 Adoption of these machines could bridge the gap left by inadequate numbers of volunteers while augmenting the workforce with an “extended intelligence.”
Advancements in autonomous systems occur almost daily and are pushing the boundaries of what we thought possible. This push was evident in former Deputy Secretary of Defense Bob Work’s Third Offset strategy, which sought “next-generation technologies and concepts to assure U.S. military superiority.” While force protection and superior technology are important, autonomous machines could lead the services down a path that gradually eviscerates the warrior ethos if leaders do not guard against a bias for automation and further stand-off from the realities of war.
What We Risk
The desire to prosecute targets at longer distances in warfare is nothing new—as longbows and catapults would testify. What makes tomorrow’s autonomous and remotely piloted technologies different is that many of these instruments will be equipped with AI that uses algorithms and data to provide everything from automated target acquisition to target-value and casualty estimates.3 By offloading these dimensions of warfare from human warriors, militaries that adopt these technologies run the risk of “morally deskilling” their forces.
The idea of moral deskilling is rooted in military virtue. The most important features of military virtue—such as duty, honor, courage, integrity, and loyalty—are indivisible from notions of exemplary military service. The notion of autonomous weapons acting with “moral knowledge, compassion, or accountability” or demonstrating honor, duty, or courage is an unmerited anthropomorphism of the technology. Although some argue artificial intelligence could one day exhibit moral reasoning, offloading such decisions to autonomous weapon systems overlooks the impact on the humans who would no longer perform those functions.4 Even so, many technologies already are distancing humans from the realities of warfare by blurring the lines between civilian and military technologies.
From commercial off-the-shelf drones and tablet-based technologies such as KillSwitch to Xbox controllers being used on some Navy vessels, it is increasingly easy to use the same technology to play a video game or to lay waste to a human adversary.5 Technology is not neutral in its applications to how we fight wars. By further distancing ourselves through technology or making decisions more akin to playing a video game, we risk offloading important cognitive and moral decisions to algorithms or creating convenient substitutions that remove the unique human dimensions military professionals have embodied for millennia. Conditions that provide shared adversity, displays of honor, acts of courage, and comradeship forged in the crucible of battle will become rarer, further reducing the altruistic acts that contribute to our warfighting legacy, or, worse, lowering the ethical constraints in warfare.
Adopting autonomous technology to equip our future forces in battle must take into consideration our warrior culture. Embracing technologies that promise to ease some battlefield hardships or diminish opportunities for communal danger must account for unintended long-term side effects. Autonomous systems likely would decrease shared adversity by replacing personnel on the battlefield. Reflecting on his experiences in World War II and on the general psychology of men in battle, Dr. Jesse Glenn Gray notes that “an hour or two of combat can do more to weld a unit together than can months of intensive training.”6 The introduction of autonomous technologies on the battlefield needs to account for how they may affect the human connections cultivated through difficult times.
Autonomous technology will create effects similar to those of unmanned weapons and intelligence platforms, which have enabled high-level commanders to interject themselves in tactical actions. Such a construct not only will deprive the tactical commander of the ability to exercise his or her own judgment, it has the potential to create leaders bereft of the personal experiences that shape decision making. As one former Predator UAV squadron commander stated, “What happens when that lieutenant, who learns thinking the guys in the back are smarter, becomes a colonel or a general. He’ll be making the decisions, but not have any experience.”7
Admittedly, the use of autonomous systems on the battlefield is nothing new. In fact, all humans are individual autonomous systems that operate on “wetware.” The difference, however, is that commanders spend a great deal of time understanding and shaping the behaviors of human autonomous systems through rigorous training, evaluation, and feedback loops. This builds a trust and understanding between individuals and groups that autonomous technologies do not. In environments saturated with technology, it will become increasingly difficult for tomorrow’s leaders to develop the “gut sense” that has characterized great leaders of the past. As a result, autonomous systems could further reduce the “reps and sets” military leaders need to hone their decision-making abilities while also fostering their warrior ethos.
U.S. Marine Corps Lance Corporal David N. Rodriguez experiments with the Nibbler drone's controlling app. U.S. Navy (Taylor N. Cooper)
The Services Can Respond
To overcome these shortfalls, the U.S. military will require a multipronged approach to enhancing human capabilities and guarding against the deskilling effects technologies will have on its warfighters. Understanding that autonomous technologies should not distance troops from danger or negate direct experiences of warfare, but rather help troops thrive in the midst of it, is a step in the right direction. Pairing technology with human capabilities is critically important and must account for the intangibles of warfare these technologies may diminish or take away from human decision makers.
Using these technologies to free more people to perform traditional soldiering is the first place to start. The private sector is using robots and AI to eliminate back-office and repetitive occupations. This not only is a lower-risk way to employ nascent AI technologies, it also allows the U.S. military to transition many of these eliminated positions to the combat arms and special operations forces that are already overextended.
Educating the force through more rigorous and broadened professional military education emphasizing a balance of history, ethics, and technology will become critically important as the services experiment with and employ autonomous technologies. In addition, this will require a multidisciplinary approach to evaluating these technologies and may require traditional military educational institutions to develop relationships with civilian institutions on the cutting edge of these technologies to provide the expertise the services may lack.
Most important, the services will require leaders who understand how forces adapt to these new technologies. The proliferation of technologies on the battlefield may make a future battalion commander more akin to a chief technology officer than a traditional foot soldier. The roles these leaders play will require them to prepare their young women and men for an uncertain future environment and to understand the employment of autonomous technologies and their limitations. Guarding against a bias for automation will become challenging as these technologies improve and future generations increasingly expect robots and automation to shoulder the burden of dangerous occupations. Such a future would demand leaders shape the military accessions pipeline to ensure the next generation of warfighters is prepared for the rigors of military service.
Preserve the Warrior Culture
Intrepidity, privation, and comradeship in the face of danger adorn the well-worn mantle of battle leadership passed from one military generation to the next. These qualities, which have animated warriors and inspired courage in warfare, may be lost if autonomous systems become the preferred method for combat engagements. Making warfare similar to a video game or relegating decisions to algorithms will separate military leaders from the realities of war. By making warfare less personal, we risk enfeebling our warrior culture.
Arguably, automating war will challenge the historical elements of bravery, courage, honor, and intrepidity that have defined warfare for millennia. The U.S. military stands to lose some of the most significant intangibles that have inspired future generations and created the bonds of association that unite all military professionals. As James Kerr recorded in his book, Legacy, “True leaders are stewards of the future. They take responsibility for adding to the legacy.”8 Hence, service leadership must understand that replacing humans on the battlefield with autonomous technologies could erode the experiences that shape warfighting prowess and military legacies.
Leaders will struggle with how to construct the experiences that forge the bonds of loyalty and teamwork that are hallmarks of military service while finding the appropriate use for autonomous technologies. Klaus Schwab, founder and executive chairman of the World Economic Forum, rightly stated, “Neither technology nor the disruption that comes with it is an exogenous force over which humans have no control.” With this outlook in mind, military leaders need to understand the human factors of autonomous technologies and will need to shape the evolution of these technologies appropriately to preserve our warrior ethos.
1. Chris Matyszczyk, “Scary ‘Slaughterbots’ Video Shows Danger of Autonomous Killer Drones,” CNET, November 19, 2017, https://www.cnet.com/news/scary-slaughterbots-video-shows-danger-of-ai-powered-drone-weapons; Jeff McMahon, “Behind Tesla’s Headlines, the Military Drives Autonomous Vehicles,” Forbes, October 21, 2016, https://www.forbes.com/sites/jeffmcmahon/2016/10/21/behind-teslas-headlines-the-military-drives-autonomous-vehicles/#47863a86247e.
2. Jack Corrigan, “Three-Star General Wants Artificial Intelligence in Every New Weapon System,” Nextgov, November 3, 2017, http://www.nextgov.com/cio-briefing/2017/11/three-star-general-wants-artificial-intelligence-every-new-weapon-system/142225/; Pedro Hernandez, “Analytics Firm Antuit Launches AI Threat Intelligence Division,” eSecurity Planet, December 21, 2017, https://www.esecurityplanet.com/network-security/analytics-firm-antuit-launches-ai-threat-intelligence-division.html; McMahon.
3. Shannon Vallor, “Moral Deskilling and Upskilling in a New Machine Age: Reflections on the Ambiguous Future of Character,” Philosophy & Technology 28, no. 1 (2015): 107–124, http://www.ccdcoe.org/publications/2013proceedings/d2r1s10_vallor.pdf, 108.
4. Ronald Arkin, Governing Lethal Behavior in Autonomous Robots (CRC Press, 2009); Vallor, 109.
5. Gabriel Rosu, “Android Tablet Delivers Air-Strike in Just Four Minutes,” eTeknix, Accessed December 17, 2017, https://www.eteknix.com/android-tablet-delivers-air-strike-in-just-four-minutes/. Travis M. Andrews, “The Navy’s adding a new piece of equipment to nuclear submarines: Xbox controllers,” The Washington Post, September 25, 2017, https://www.washingtonpost.com/news/morning-mix/wp/2017/09/25/the-navys-adding-a-new-piece-of-a-equipment-to-nuclear-submarines-xbox-controllers/?utm_term=.0dd951d3b54a.
6. Jesse Glenn Gray, The Warriors: Reflections on Men in Battle (University of Nebraska Press, 1958), 44.
7. Ibid.
8. James Kerr, Legacy, Little, Brown Book Group, Kindle Edition, 171.
Major Humr is a ground supply officer in the U.S. Marine Corps. He holds a master’s degree in information technology from the Naval Postgraduate School and a master’s degree in military studies from Marine Corps Command and Staff College. He currently serves as the graduate education manager for Manpower Management Officer Assignments-3.