The USS Vincennes (CG-49) was among the most technologically advanced and capable guided-missile cruisers in the world in 1988, and her captain, like much of the crew, was a seasoned warrior with years of experience. Yet, when she shot down a civilian airliner on 3 July 1988, killing 290 people, the human-machine system built to increase lethality while providing advanced situational awareness failed under the stress of combat—catastrophically. As the complexity of modern warfare increases and the quest for information superiority drives militaries toward greater technology-assisted decision-making, the Vincennes incident provides a unique perspective on the confluence of technology and human decision-making. The incident raises the question: Why did the pairing of such advanced technology with a seasoned crew fail so badly?
The military’s reliance on technology is increasing. Moore’s law appears to be holding, and advances in artificial intelligence, big data, and deep learning certainly will alter combat decision-making.1 Experience, instinct, and training alone will no longer be considered sufficient. Leaps in technology are coming, and either the United States will get this right or its adversaries will. The Vincennes incident offers guidance on how to integrate future decision-making tools with human decision makers. It offers lessons on technology adoption and use, how humans think about information, the trust they put in those information sources—especially under duress—and the interactions between humans and machines.
The Blood-Dimmed Sea
As Iraq and Iran waged war throughout the 1980s, both countries eventually began targeting U.S. and neutral ships, making the Persian Gulf as dangerous as a flooded Roman coliseum. The Iranians littered the water indiscriminately with sea mines that could remove the husks of ships and men alike.2 Iran and Iraq both possessed stealthy antiship missiles that flew for miles just above the surface. Small-boat “swarm” attacks were a constant threat in the tight waters of the Strait of Hormuz. Compounding these man-made dangers were sickening heat and blinding, sea-skimming sandstorms. Into this chaotic maelstrom the U.S. Navy deployed in the spring of 1987 to reopen shipping lanes and guard U.S.-flagged tankers.3
The Vincennes began the morning of 3 July by facing 13 Iranian small boats (Swedish-made “Boghammers”) attacking the ship with small-arms fire.4 During the fight, the Vincennes’s primary .50-caliber machine gun malfunctioned, prompting Captain William C. Rogers III to unmask the other, still-functional .50-caliber by ordering an abrupt full-rudder turn at 30 knots to spin the ship.5 As the Vincennes engaged the Iranian gunboats, the crew detected an unidentified aircraft leaving Iran’s dual military-civilian Bandar Abbas airfield on a vector directly toward the ship.6
The captain of Iran Air Flight 655, Mohsen Rezaian, was unaware of the skirmish below or that he and his 289 passengers and crew were headed toward the advanced antiaircraft cruiser. Consistent with the rules of engagement, the Vincennes made multiple attempts to warn the unidentified aircraft (the “bogey”). The ship’s targeting crew reported several times that the bogey had “veered from the flight path into an attack profile and is rapidly descending at increasing speed directly toward” the Vincennes while “squawking mode II” (transmitting transponder information only used by military aircraft).7
After the targeting officer emphatically but incorrectly reported that this aircraft was an Iranian F-14 Tomcat, 8 nautical miles distant at an altitude of 13,500 feet, Captain Rogers authorized the release of two antiair missiles. The Airbus A300 passenger jet may have gone into an aerodynamically limp freefall for more than a minute after impact.8 Reports from the scene of the wreckage noted that many of the bodies were found wearing lifejackets.9
What Went Wrong
The Vincennes’s crew made costly mistakes, especially interpreting sensor data—the human-machine interface. Post-incident analysis of information retrieved from the Vincennes’s Aegis targeting system showed that Flight 655 continuously was ascending while being tracked, but the crew informed the captain that the unidentified aircraft was descending into attack position. This may have been the most tactically important piece of misconstrued sensor information.10 (It is noteworthy that at the time of the incident, the Aegis’s command-and-control monitors were unable to display the altitudes of tracked aircraft.11) Marine General George Crist’s report on the incident affirms that “While many factors played in Captain Rogers’ final decision to engage, the last report by [name redacted] that the aircraft was rapidly descending directly toward the ship may have been pivotal.”12
Another crucial mistake came when the crew reported that Flight 655 was squawking transponder signals indicating a military aircraft.13 Data from the Aegis system examined later indicated that the plane had been transmitting in mode III, the type of codes used by civilian air traffic. This misinformation most probably influenced the crew to label the airliner a hostile F-14.
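Misreadings like these are precisely what software can cross-check: an altitude trend is a computation, not a judgment call. The sketch below is illustrative only (the sample format, function name, and thresholds are invented, not the Aegis interface); it derives ascent or descent directly from successive altitude reports rather than from an operator’s verbal summary.

```python
# Hypothetical track samples: (seconds since detection, altitude in feet).
# Values are illustrative only, not actual Aegis data.
def altitude_trend(samples):
    """Classify a track as 'ascending', 'descending', or 'level'
    from the first-to-last altitude change across the samples."""
    if len(samples) < 2:
        return "unknown"
    delta = samples[-1][1] - samples[0][1]
    if delta > 500:          # net climb of more than 500 feet
        return "ascending"
    if delta < -500:         # net descent of more than 500 feet
        return "descending"
    return "level"

# A steadily climbing departure profile, loosely like Flight 655's.
track = [(0, 7000), (60, 9000), (120, 11000), (180, 13500)]
print(altitude_trend(track))  # -> "ascending"
```

A check this simple, displayed alongside the raw data, would have contradicted the “rapidly descending” reports at a glance.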
In a perilous combat environment cluttered with civilian and military aircraft, the captain of the most sophisticated antiair technology in the U.S. military was unable to distinguish between hostile and benign aircraft accurately. Under immense stress, he possessed decision-making criteria that were limited to specific—erroneous—information about the aircraft’s flight path and altitude. His crew failed to perceive and communicate the correct data from the targeting system. (The tactical decision-making environment only would have been more complicated had Iran been conducting a deliberate attack with multiple aircraft.) The incident should echo for developers of future decision-making technology.
Below: When the USS Nitze (DDG-94) was fired on by Houthi rebels in October 2016, she failed to detect the incoming missiles. It has been reported that the ship’s Aegis system was not set correctly, raising new questions about the man-machine interface.
Lessons on Technology Adoption
The shootdown of Flight 655 showcases the chaos of combat environments, but it also reveals lessons for technology adoption and its use in stressful situations. These lessons accentuate the opportunities and dangers in human-machine decision-making.
In combat, technology should create time for the decision maker, not simply provide information.
Time is critical during a fight. Leaders will trade space, abandon resources, and conduct delaying actions to preserve time. Of course, the perception of time also matters and changes with circumstances.
For example, the entire flight time of Flight 655 was seven minutes, five seconds; Captain Rogers was aware of the flight’s existence for four minutes before ordering missiles launched. The demand of multiple decision-making responsibilities alters the perception of available time. During the engagement, Captain Rogers assumed tactical control of two additional ships, the USS Elmer Montgomery (FF-1082) and the USS Sides (FFG-14), while preparing to assume command of U.S. combat aircraft approaching from outside the Gulf, all the while engaging multiple surface-borne enemies.14
Modern technology compresses the time available in combat. Advances in weapon accuracy, speed, and range have increased lethality while decreasing the amount of time available to decision makers. Concurrently, the number and type of sensor-data sources, the range and fidelity of these sources, and communication systems all have sped up.
Technology must create time for the commander, mitigating compression while presenting the proper information to the warfighter, at the correct time, and in a useful and usable format. There is a limit to the amount of information humans can process, but the information collected should not be limited. Future systems must be designed to absorb massive quantities of data, yet distribute them in human-digestible amounts facilitated by advanced algorithms. Technology can extend available decision-making time for warfighters by delivering practical and understandable information, including accurate and useful correlations and predictions.
The method of delivery will determine whether these streams suppress or exaggerate the chaos of combat. Human-factors engineering is an entire discipline dedicated to understanding how people interact with machines and data outputs, and the Navy and the military must incorporate its principles into system design.
Machines should help overcome human biases.
Humans are fallible, with cognitive limitations that often manifest under stress. These constraints often come in the form of predetermined mindsets, or biases. Confirmation bias, in which the human mind seeks evidence to support preexisting beliefs, may have influenced the Vincennes crew.
Captain Rogers’ belief that an aerial threat was imminent may have been primed by intelligence confirming this bias. According to investigation results, Captain Rogers “admits his judgment was influenced by the [pre-]July 4th intelligence warning, recent F-14 deployment to Bandar Abbas, previous [reports] of the Iranian F-14 squawking Mode II . . . and the ongoing surface engagement.”15 It is possible that once hostilities began, the crew sought evidence to substantiate the idea that the incoming unidentified aircraft was in direct support of the attacking Boghammers.
Likewise, human cognition tends to skew the evaluation of risk through a framing bias that is steeped in emotion and inhibits objective decision-making. When possible outcomes are considered losses, decision makers are likely to assume more risk than when all possible outcomes are considered gains.16 On the Vincennes, a framing bias may have influenced the decision to attack the airliner. At the time of the incident, U.S. ships were determined not to “absorb the first strike” so as to avoid the fate of the USS Stark (FFG-31) almost exactly a year prior.17 The Stark had been attacked by two Exocet air-to-surface missiles fired by an Iraqi Mirage F1, killing 37 sailors. A desire to protect the ship from a recurrence may have skewed the crew’s emotional state and contributed to Captain Rogers’ preemptive behavior, altering his perception of the risks involved and hindering his ability to determine acceptable risk.
Because of biases such as these, the human mind is not capable of making mathematically optimal decisions in complex environments where multiple significant variables are present. Leaving combat decision-making entirely to machines is undesirable at best and potentially disastrous at worst, but machines can help evaluate accurately some outcomes in terms of statistical risk. As long as the meaning of the product of the machine evaluation is understood, decision makers can use the results to augment their intuition, education, and experience when making combat decisions.
Humans may trust machines, but they always will trust other humans more.
According to the Vincennes’s antiair-warfare officer, who unknowingly reported inaccurate information about Flight 655 to the captain, “data to me doesn’t mean anything, because I reacted to people that I thought that . . . I knew that I had operated with that were reliable . . . and when they reported at short range they had a decreasing altitude, increasing speed, I had no reason to doubt them.”18 Admiral William J. Crowe, Chairman of the Joint Chiefs at the time, affirmed decision makers’ reliance on the human factor: “That these officers relied on information from their combat team is not only reasonable—but is an absolute necessity in a pressure-packed environment.”19 It is the human link that connects machines to decision makers.
The dynamics of trusting machines are complex, and the degree to which people should do so demands an understanding of the risks. Those who work closest to machines must trust the technology but also must understand the machines’ intricate processes, algorithms, and interconnectivity—and their limitations. Complex data systems use ever-newer forms of data science with unique statistical inference techniques; this increasingly complex nature will demand greater technical expertise and training.20
But the adoption of advanced analytical technology must begin now. The sea services must identify data-science officers and enlisted service members early in their education and groom them specifically for these challenges. Also, advanced data technology must be used now in simulations, wargaming, and real-world applications. The sea services should continue to send individuals to train with industry leaders such as Google and Amazon to learn to apply state-of-the-art commercial technology to military problems, and the programs should be expanded. To garner the military advantages of the future, the armed forces must build trust in machines through education, practice, and a deep understanding of advanced technology.
The Human Link
The captain of the Vincennes found himself in an untenable situation. In a hostile environment cluttered with civilian air traffic, with a ship and crew unwilling to absorb a first strike, the most advanced missile-cruiser technology could not prevent a tragedy if the human factor failed.
The Navy and the rest of the armed forces depend on technology to maintain military dominance. But it is the crucial human link between combat decision maker and machines that enables this information superiority. The importance given to the development of future human-machine systems—that is, the extent to which the Navy absorbs the lessons of the Vincennes and Iran Air Flight 655—will determine the future of U.S. naval dominance.
1. Aaron Pressman, “Intel Keeps Insisting Moore’s Law Isn’t Dead,” Fortune, 28 March 2017, fortune.com/2017/03/28/intel-keeps-insisting-moores-law-isnt-dead; Peter Singer, Wired for War: The Robotics Revolution and Conflict in the 21st Century (New York: Penguin, 2009).
2. ADM William Crowe, USN, declassified letter to U.S. Secretary of Defense, “Formal Investigation into the Circumstances Surrounding the Downing of Iran Air Flight 655 on 3 July 1988.”
3. Will C. Rogers and Gene Gregston, Storm Center: The USS Vincennes and Iran Air Flight 655: A Personal Account of Tragedy and Terrorism (Annapolis, MD: Naval Institute Press, 1992).
4. Crowe, “Letter to SecDef.”
5. Gen. George D. Crist, USMC, declassified letter to the U.S. Secretary of Defense, “Formal Investigation into the Circumstances Surrounding the Downing of Iran Air Flight 655 on 3 July 1988 (U).”
6. RADM William M. Fogarty, USN, declassified report to the Commander in Chief, U.S. Central Command, “Formal Investigation into the Circumstances Surrounding the Downing of a Commercial Airliner by the USS Vincennes (CG 49) on 3 July 1988 (U).”
7. Crist, “Letter to SecDef,” 5.
8. Based on Aegis recording data reported in Fogarty Report, 57.
9. GEN Hugh Shelton, USA, Ronald Levinson, and Malcolm McConnell, Without Hesitation: The Odyssey of an American Warrior (New York: Macmillan, 2010), 194.
10. R. N. Roux and Jan H. van Vuuren, “Real-Time Threat Evaluation in a Ground Based Air Defence Environment,” ORiON 24, no. 1 (2008), 75–101.
11. Fogarty Report.
12. Crist, “Letter to SecDef,” 5.
13. Fogarty Report.
14. Crist, “Letter to SecDef,” 2–3.
15. Ibid., 5.
16. John Maule and Gaelle Villejoubert, “What Lies Beneath: Reframing Framing Effects,” Thinking & Reasoning 13, no. 1 (2007), 25–44.
17. Fogarty Report, 21.
18. Crist, “Letter to SecDef,” 3.
19. Crowe, “Letter to SecDef,” 6.
20. Amir Gandomi and Murtaza Haider, “Beyond the Hype: Big Data Concepts, Methods, and Analytics,” International Journal of Information Management 35, no. 2 (2015), 137–44.
Lieutenant Colonel Tingle is the Chief of Strategy, Policy, and Doctrine at the Joint Force Space Component Command. He holds a Ph.D. in public policy from George Mason University, an M.Eng. and an MBA from the University of Colorado, Colorado Springs, and a B.S. in systems engineering from West Point. He writes on research, development, and the application of technology within the Department of Defense.