Oceans of ink have been spilled describing two of the great technological trends of this generation and their impact on warfare. The first is the relentless and rapid improvement in information technology (IT), across fields as diverse as big data analytics, artificial intelligence, and augmented reality. One of its key applications in warfare is to enable inputs from distributed, networked sensors to be integrated and analyzed rapidly, generating timely, actionable information in forms that humans and machines can readily interpret.
The second trend is related but distinct: the increasing capabilities of unmanned systems to perform valuable missions. These capabilities are growing not only because of advanced IT enabling more autonomous operations, but also because of improvements in materials science, energy storage, design, and other areas. A third trend, much less remarked on, is the improvement in sensors, which are becoming smaller, cheaper, and more perceptive, with lower power demands and greater durability in various environments.
However, even seemingly ubiquitous sensors, hosted by increasingly capable unmanned systems and interpreted by ever-smarter algorithms, will not result in omniscience. Two decades ago, in his essay “Crack in the Foundation,” Lieutenant General H. R. McMaster shattered the belief that future U.S. forces would have total knowledge of the battlespace and all that pertained to it.1 Using insightful logic and compelling examples, McMaster made a clear case that Clausewitz’s “fog of war” could not be eliminated simply by using advanced technologies for intelligence, surveillance, and reconnaissance (ISR). Even sophisticated, perceptive networks can be degraded by severing their communication links or targeting key nodes. Perhaps more important, they can be deceived: Despite copious ISR assets in the skies over Kosovo, Serbian deception meant that NATO forces destroyed only about 7 percent of the armored vehicles they believed they had eliminated.2
While deception is as old as warfare—the history of multiple ancient civilizations is replete with tales of cunning stratagems—its use by the U.S. military has been relatively scant since the fall of the Soviet Union. Given their overwhelming preponderance of power, U.S. forces sought primarily to clear the fog of war, not to exacerbate it for the enemy. This must change, both because potential U.S. adversaries are more capable, and because technological advances are making it easier and cheaper for even lower-end actors to acquire sufficient knowledge of the battlespace to effectively target U.S. forces.
Embracing Disruption and Deception
To that end, the most disruptive innovation that the U.S. military needs is not technological, but cultural: It must embrace deception as a way of overcoming the technologies that can provide adversaries with accurate, timely targeting information. Deception can and must use emerging technologies in a range of contexts, as well as more long-standing approaches.
In some cases, emerging technologies can reduce platforms’ signatures. However, in an increasingly networked world, it will be difficult and costly to reduce the signatures of ships, aircraft, tanks, and other major platforms below adversary detection thresholds. The electronic, thermal, acoustic, and other emissions of these platforms will reveal traces of their presence to discerning networks that can integrate disparate wisps of information into clues about their locations and movements. Moreover, even when emissions have been minimized, adversaries can use radar, active sonar, and other forms of energy to detect U.S. military platforms. The resulting signatures can be minimized (often at high cost), but never eliminated.
Fortunately, much of the impact of detecting and tracking platforms can be negated if an adversary then misclassifies or misidentifies them. Here, emerging unmanned technologies can play a highly disruptive role. It is easier to flood an environment with decoys that resemble the real platforms, evincing similar signatures, than to erase or completely mask the actual platform signatures. Such an approach also can be less expensive than investing solely in stealth to achieve near-invisibility; past a certain point, marginal signature reductions become more costly. If the signatures of the real platforms can be reduced and perhaps masked to appear different than expected, while the signatures of copious decoys are enhanced, an adversary may struggle to know which ones to target or in what direction to focus its defenses.
This type of deception will never be perfect, but it needs only to generate enough confusion to cause delays, inaccurate targeting, and bad operational decisions in a time-critical, high-pressure situation. Creating flickers of ambiguity that endure for seconds to hours can undermine an adversary’s response. The homing algorithms on missiles can be distracted by an array of false targets, some of which can emit greater signatures than real ones. Similarly, the massing of decoy tanks in one location can cause a commander to direct forces to counter them, leaving open a flank that real tanks can exploit. Phantom submarine signatures can lead an adversary to hunt for submarines in the wrong places and allow the real subs to wreak havoc elsewhere.
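The arithmetic of target dilution is straightforward. The Python sketch below assumes a 12-missile salvo against three real ships screened by n credible decoys that the seekers cannot distinguish, so each missile in effect draws its target at random from all contacts; every number is illustrative rather than drawn from any weapon’s actual performance.

```python
def expected_real_engagements(m, k, n):
    """Expected number of missiles that draw a real ship as their target,
    assuming k real ships and n indistinguishable decoys."""
    return m * k / (k + n)

for n in (0, 5, 20):
    hits = expected_real_engagements(m=12, k=3, n=n)
    print(f"{n:2d} decoys -> {hits:4.1f} of 12 missiles aimed at real ships")
```

In this toy model, even a handful of credible decoys cuts the expected weight of fire on the real ships by more than half, and 20 decoys absorb nearly all of it.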
U.S. forces also can induce hesitation by injecting confusion and uncertainty, riddling adversary decision-making processes with doubt. Adversaries receiving contradictory information, or information that starkly conflicts with prior assessments, may not act until it is too late to respond effectively. None of this is conceptually new: Ancient Greek, Roman, and Chinese writings describe numerous approaches to deceiving and confusing adversaries. Only the technological context has changed.
Technology-Based Deception
There are many ways in which emerging technologies can play a role in deception. Improvements in unmanned vehicles across multiple domains, and in their ability to operate autonomously, will enable the deployment of numerous decoys that can behave and produce signatures like those of real platforms. In other cases, a swarm of decoys may need to operate cooperatively to simulate a larger platform. For example, a collection of small unmanned surface and aerial vehicles could generate some of the electronic, acoustic, thermal, and even radar signatures associated with a ship, while physically distributing themselves to create the impression of overall size. They could even tow large sheets of canvas to create substantial visual signatures.
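A rough sizing calculation suggests what such cooperative emission requires. Because incoherent acoustic sources combine as 10·log10(N) decibels, a swarm whose vehicles each radiate 20 decibels less than the ship they imitate needs on the order of 100 emitters. The sketch below uses illustrative source levels, not measured signatures.

```python
import math

# Incoherent acoustic sources sum as 10 * log10(N) dB, so matching a
# source 20 dB louder takes roughly 100 cooperating emitters.
ship_level_db = 170.0    # assumed broadband source level of the real ship
drone_level_db = 150.0   # assumed per-vehicle emitter level

n = math.ceil(10 ** ((ship_level_db - drone_level_db) / 10))
print(f"emitters needed to match the ship's source level: {n}")
```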
Alternatively, shrouding false and real ships in limited smokescreens would make them harder to distinguish visually. Unmanned undersea vehicles operating behind false ships could generate bubbles to create a false wake, or unmanned surface vehicles could deliberately churn up the water to create a real one. While it is increasingly difficult to hide warships, the presence of numerous “phantom ships” in their vicinity could cause antiship missiles in a barrage to veer off, leaving fewer to be countered by missile defenses. Saturating the defenses of a carrier strike group is much harder when it is unclear which ships are real and when missiles aimed at the real group may focus on phantom targets. Lurking submarines, given their attenuated access to information from above the waterline, may launch missiles or torpedoes at the wrong targets and then face the wrath of the actual warships.
Advances in materials science and related fields can play a central role in deception. Novel coatings or thin layers of cladding can help mask or reduce the signatures of real systems, while also accentuating decoys’ signatures. Materials science also can contribute to the development of lighter, more energy-efficient unmanned systems. This will be critical to enabling them to persist in the environment for long periods and to generate certain types of signatures. Efficient energy storage and usage—taking advantage of advances in materials science—can enable decoys to emit convincing levels of acoustic and thermal energy. The steady advance of battery technology can reduce the volumes, masses, and costs associated with a given amount of energy storage for decoys without fossil-fuel engines. For decoys with limited or intermittent power requirements, compact, durable, and efficient systems for harvesting environmental energy also can make sense. Much of the needed innovation in this area will come from the private sector, which is pursuing it for economic reasons.
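A back-of-the-envelope energy budget, sketched below, shows why these advances matter. It assumes a 20-kilogram battery at roughly the energy density of current lithium-ion cells and notional emission and hotel loads; all figures are illustrative assumptions, not the parameters of any real decoy.

```python
# All figures are illustrative assumptions, not measured system parameters.
battery_mass_kg = 20.0
energy_density_wh_per_kg = 250.0   # rough figure for current lithium-ion cells
emission_power_w = 400.0           # assumed acoustic/thermal/RF signature load
hotel_power_w = 50.0               # assumed propulsion and avionics draw

battery_energy_wh = battery_mass_kg * energy_density_wh_per_kg
endurance_h = battery_energy_wh / (emission_power_w + hotel_power_w)
print(f"decoy endurance: {endurance_h:.1f} hours")   # 5,000 Wh / 450 W ~ 11.1 h
```

Doubling energy density, or halving the hotel load, directly doubles or stretches that endurance, which is why materials and power advances translate so directly into more persistent, more convincing decoys.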
Additive manufacturing (also known as 3D printing) also can contribute to deception by enabling decoys to be made when and where needed, sidestepping some logistical hurdles and storage limitations. For example, a ship with a 3D printer, select electronics, and raw materials can create and launch a collection of unmanned decoys on short notice. This obviates the need to store voluminous whole systems and spare parts, or to have them delivered in a logistics-constrained environment. The speed of 3D printers and the complexity of what they can produce will grow, facilitating the creation of metamaterials and novel designs.
Electronic warfare (EW) and cyberattacks can amplify the effects of deception carried out with physical systems. Old-fashioned EW jamming can attenuate the connections on which ISR networks depend, limiting the information available to decision-makers and thereby enhancing the impact of the deceptive signals that get through. EW jamming or spoofing also can cause sensors and vehicles to misperceive where they are, and therefore to misconstrue the locations of the systems they are observing.
Most important, EW and cyberattacks can cast doubt on the accuracy and utility of various systems, while also slowing down an entire ISR network. Parts of the network that operate only intermittently, or behave erratically, are less likely to be trusted. Credible concerns about EW and cyberattacks degrading information can gnaw at an adversary’s decision-makers as they struggle to interpret a real world full of panoramic illusions. This also can contribute to friction among people and commands as they argue over which systems have been tampered with, and to what degree.
Technologically based deception also can take advantage of deficiencies in perception and judgment by machines, individuals, and organizations. Even neural networks whose specific workings are unknown can be deceived through the use of subtle manipulations that cause machines to make mistakes humans never would.3 A modicum of false information, skillfully injected, can lead an adversary’s well-honed artificial intelligence system to misread a situation.
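The sketch below illustrates the principle on a deliberately simple stand-in: a linear “classifier” scoring a 64 × 64 image, attacked with a fast-gradient-sign-style perturbation of the kind Goodfellow and colleagues described (the white-box cousin of the black-box attacks cited above). The model and numbers are invented for illustration; fielded ISR classifiers are far more complex, but the underlying logic, many tiny coordinated nudges summing to a large change in the output, is the same.

```python
import numpy as np

# Toy stand-in for an ISR image classifier: logistic regression on a
# 64 x 64 "image." Weights are random for illustration; a fielded system
# would be a trained network, but the gradient-sign logic carries over.
rng = np.random.default_rng(1)
d = 64 * 64
w = rng.normal(size=d)
b = -0.5 * w.sum()            # centers the logit for a flat 0.5 image

def p_real(x):
    """Probability the toy classifier assigns to 'real platform'."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# An input the model confidently scores as a real platform.
x = 0.5 + 0.001 * w           # flat image plus a faint target-like pattern
print(f"clean input:     P(real) = {p_real(x):.4f}")

# Gradient-sign perturbation: shift every pixel by 0.4 percent of full
# scale in the direction that lowers the score. For this model the
# gradient of P with respect to x is P*(1 - P)*w, so its sign is sign(w).
eps = 0.004
x_adv = x - eps * np.sign(w)
print(f"perturbed input: P(real) = {p_real(x_adv):.4f}")
```

No single pixel changes enough for a human to notice, yet the score collapses from near certainty to near zero, which is exactly the kind of mistake a person examining the image would never make.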
Humans also have numerous interpretive biases, as Amos Tversky, Daniel Kahneman, and other social scientists have shown.4 For example, people tend to become “anchored” to initial perceptions, only belatedly adjusting their views on the basis of more pertinent or better-corroborated information. People tend to assume that small samples are representative of much larger sets: If the first platforms an adversary observes turn out to be decoys, it may dismiss subsequent, genuine observations (a “boy-who-cried-wolf” effect). Perhaps most dangerous, when analyses generate nonintuitive results, however well-corroborated, the results often are wholly dismissed in favor of more intuitive narratives. Exacerbating the problem, human beings typically are overconfident in their judgment and forecasting skills, as demonstrated by Philip Tetlock, David Dunning, Justin Kruger, and others.5 It always will be tempting for a commander confronted with conflicting, confusing data to fall back on what might be irrelevant experience or hunches and make bad tactical and operational choices.
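A one-line probability model captures the cried-wolf dynamic. Under Laplace’s rule of succession, after n consecutive contacts prove to be decoys, even a rational observer’s estimate that the next contact is real falls to 1/(n + 2); the bias lies in continuing to apply that discount after genuine contacts begin to appear. The sketch below is illustrative only.

```python
# After n consecutive contacts prove to be decoys, Laplace's rule of
# succession puts the chance that the next contact is real at 1 / (n + 2).
for n in (0, 3, 10):
    p_next_real = 1 / (n + 2)
    print(f"{n:2d} straight decoys -> P(next contact real) = {p_next_real:.2f}")
```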
At the organizational level, conflicting information from different sources can be even more confounding. Individuals and units can start to discount each other’s credibility, as well as that of the machines providing data. This can contribute to fissures within an adversary’s military in ways that degrade the trust on which military operations depend.
The world is replete with networked, distributed sensors and copious information-processing capabilities. In the coming decades, sensor densities will increase in environments from space to the seafloor, and the information technology to interpret their outputs will grow more capable. Using emerging technologies to inject contradiction and doubt into adversaries’ deliberations can have disproportionate effects, inducing fatal delays and cack-handed responses by frustrated commands. Combining those technologies to take advantage of human and organizational biases, as well as the vulnerabilities of adversary systems to deception, can give U.S. forces a competitive edge over disoriented adversaries.
1. LTG H. R. McMaster, USA (Ret.), “Crack in the Foundation: Defense Transformation and the Underlying Assumption of Dominant Knowledge in Future War,” Student Issue Paper, vol. S03-03, U.S. Army War College (November 2003).
2. McMaster, “Crack in the Foundation,” 46–47.
3. See N. Papernot, P. McDaniel, I. Goodfellow, S. Jha, Z. B. Celik, and A. Swami, “Practical Black-Box Attacks against Machine Learning,” in Proceedings of the 2017 ACM on Asia Conference on Computer and Communications Security (New York: ACM, 2017), 506–19.
4. Some of this work is described in D. Kahneman, P. Slovic, and A. Tversky, eds., Judgment under Uncertainty: Heuristics and Biases (New York: Cambridge University Press, 1982); and D. Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2013).
5. There are many sources on this, including P. E. Tetlock, Expert Political Judgment: How Good Is It? How Can We Know? (Princeton, NJ: Princeton University Press, 2017); J. Kruger and D. Dunning, “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments,” Journal of Personality and Social Psychology 77, no. 6 (December 1999): 1121–34; D. Kahneman and A. Tversky, “On the Psychology of Prediction,” Psychological Review 80, no. 4 (1973): 237–51; and S. Plous, The Psychology of Judgment and Decision Making (New York: McGraw-Hill, 1993).