The decision to attack Iraq was justified in part by reference to intelligence information—as if it were a coherent and consistent body of knowledge. But it is not. Intelligence tries to develop a useful picture of the enemy and his capabilities, and thereby estimate his intentions. Analysts use partial data from available sources, some of which is unevaluated and obtained at different times. They try to predict future force development and sometimes try to estimate the outcomes of conflict. All are hard jobs, even with modern technology.
In looking for things they believe exist, intelligence analysts sometimes miss things they are not looking for specifically. At the tactical level, for example, an observer searching for armored vehicles may miss antiaircraft mountings on them—an omission of vital importance to those who provide air support to ground troops. At the strategic level, naval analysts looking for evidence of submarine construction may miss early signs that a potential adversary has chosen to build cruise missiles instead.
No institution has the resources to do everything it should be doing all the time. Intelligence organizations are no exception. They have to choose what to cover, to what extent, and for how long. As directed by supervisors and policy, analysts focus on attack warnings and developing situations. Intense concentration on a particular target (or purpose) translates to fewer people and less equipment for covering targets that are assigned lower priorities. Events occurring elsewhere are likely to be missed. For example, according to the independent intelligence review panel headed by retired Navy Admiral David Jeremiah in 1998, one reason for the U.S. failure to anticipate the Indian nuclear tests of 11-13 May 1998 was administration-directed concentration on "rogue" nations (Iran, Iraq, Libya, and North Korea) and the diversion of analysts for that purpose. Further, intelligence analysts are not interchangeable parts. Experts in specialized areas, such as particular forces and weapon types, are of limited use at first when reassigned to unfamiliar tasks.
In war and peace, important information usually is mixed with irrelevant, sometimes false data—called "noise." Promptly separating the signal from the noise is difficult and depends on judgment. Noise problems are compounded by heavy volumes of information from all sources, much of it transmitted electronically. The number and variety of sources can overwhelm the analysts available. Forced to sort information rapidly, they can miss important elements. Admiral Jeremiah's panel pointed to the small number of analysts relative to the flood of incoming data as a major factor in missing India's nuclear tests.
Government officials often claim a certain act will send a strong signal to some state or group. Such statements express the expectation that information transmitted (or displayed) in the hope of reception by another party will be perceived as intended. But nonverbal signals are imprecise. Carrier deployments or arms acquisitions are not always understood by receiving parties as the senders intended, whether or not the message was threatening. Yet the concept of military deterrence rests on the idea that a nation's acts can discourage adversaries from unwanted behavior. Deployment of U.S. intermediate-range missiles in Europe might have deterred the U.S.S.R. from attacking Western Europe; it also is possible that the Soviets never intended to attack. Regrettably, one seldom hears from the targets of deterrence. There often is confusion at the receiving end about the meaning of announced or observed events, such as deployment of a new weapon. Parties that distrust each other will read observed force changes anxiously; those with high degrees of mutual confidence will be more relaxed.
Distortion of information intended for actual or potential adversaries is as old as war: concealing intentions, capabilities, resources, strengths, and weaknesses. Nations monitored by space-based sensors, such as North Korea and Iran, reportedly have underground facilities to house their missile systems. Once such a facility is discovered, concealment invites speculation about the activities inside—but deception also denies intelligence analysts the clues they need. Parties who know they are being observed will alter their observable practices to suggest other activities.
Although the information age has brought great benefits, verifying the truth about enemy capabilities and intentions remains extremely difficult. And finding and evaluating guerrilla forces and terrorist movements is even harder. "Buyer beware" remains the best rule for assessing intelligence information.
Mr. Hirschfeld is a senior analyst at the Center for Naval Analyses in Alexandria, Virginia.