The distinction between a truth, an interpreted truth, and a lie is often very thin, and rests in the perception—and perspective—of others.
Thinking about the nature of truth is not something that occupies the time of many military officers. This is the stuff of philosophers, theologians, and theoretical people—not those who make things happen in the real world. Yet such thought goes to the heart of a fundamental frustration of the officer corps: why they are not more credible to the press, to the Congress, and to the American people. As one observer commented:
The bureaucracy in the Pentagon frequently acts as a barrier between the public and the truth, not as a conduit; a huckster, not an informer. . . . [Unlike in the operating forces] there seems to be far less integrity and sacrifice among program managers who are supposed to be equipping the fighters and the policy makers who decide when lives will be put at risk. . . . The troops deserve better.1
Military officers argue that they have risked their lives for their country and have spent their careers in the nation's service. They subscribe to a moral code that is—they believe—higher than that of the civilian community in general. So why don't people believe them?
The fundamental problem is that not everyone agrees on what truth is; statements believed by one to be perfectly acceptable are seen by another as deceptive. Military officers might believe that they always tell the truth, but people on the receiving end of their statements often believe differently. Many civilian staff members and analysts in Washington believe that they have been misled, deceived, or even lied to by military officers seeking to justify and protect programs. Indeed, among staff members on Capitol Hill, the military official sent to defend a troubled program is jokingly referred to as “the designated liar.” How could this be?
There is a school of thought that the military, indeed all organs of government, are interested mainly in power and influence, with little higher moral claim. Lying is simply an
institutional norm to retain power.2 Conversely, there is another school of thought that the problem lies with the listener. The problem—according to this theory—is that the interpreters of statements (e.g., journalists, analysts, and staff members) are pursuing ambitions and agendas of their own. In seeking support for their own preconceived ideas, these outsiders will distort statements made by the military. The problem, therefore, lies not in untruthfulness by the military but in distortions by outsiders.
Where then is the problem? Part of the problem lies in the different roles of advocates (generally the military) and overseers (journalists and analysts) who, of course, approach and interpret facts differently. However, much of the problem lies in different concepts about the nature of truth itself.
Everyone knows that lying is wrong. Everyone knows that the military, like all institutions, has lied occasionally. Much of the literature on this subject was written by the political left during the period 1968 to 1973, when political scientists and foreign policy commentators were furious about the continuing war in Vietnam, and journalists were angry about their treatment by Presidents Johnson and Nixon. This period produced many polemics about lying in government—but little about how real people
could do jobs in the real world and balance the demands for personal and institutional success with a valid ethical standard.3 The most comprehensive treatment of truthfulness is Sissela Bok's Lying: Moral Choice in Public and Private Life,4 in which she looks at this issue in great depth and proposes an important starting point: a definition of lying as "an intentionally deceptive message in the form of a statement."
The military, however, has given little thought to the problem. When the military services think of morality and ethics, they think of the proper use of force, the law of warfare, and the moral basis of nuclear weapons. For instance, two recent collections of essays—The Parameters of Military Ethics and Ethics and National Defense—had nothing on this issue of truthfulness beyond a few brief nods toward personal integrity.5
What follows are four questions that cover specific areas of the subject of truth. The examples that explore these areas are all real and are intended to be just that: examples, vehicles for discussion.
How optimistic can an estimate be before it is, in effect, a lie?
Figure 1 shows successive five-year cost estimates (in constant dollars) and the actual costs for a particular system. The estimates are wildly inaccurate for future costs until the program has been in production for a number of years. The earliest estimates are the lowest, and estimates are successively raised but take a long time to become reasonably accurate. These estimates rose some 100% over about five years, and were outdated almost as soon as they were briefed. The trend was consistently in the same direction—up. Program managers briefing the earlier, and very inaccurate, estimates were no doubt sincere. They were can-do officers, optimists. Sincerity, optimism, and self-confidence are highly prized military attributes and rightly so. But is there a limit?
A program manager is under a lot of pressure to produce successful programs, but expensive programs—particularly programs that look expensive early on when alternatives exist—tend to get canceled. The program manager, therefore, has strong incentives to produce low estimates.6 A recent RAND study discussed high-visibility acquisition programs: “These pressures created demanding cost, schedule, and performance requirements and fostered a spirit of advocacy that . . . discounted assessments of risk from lower-level
management. . . . [As a result there is] pressure to promise the very edge of plausible outcomes."7
These pressures reinforce the natural inclination of human beings to accept data that fit with their preconceptions and to screen out data that do not. Taken a step further, the temptation is to confuse what we want the future to be with what it actually will be, or might reasonably be expected to be. The data suggest that this is what happened. The point, however, is not to judge one program manager or to fix the acquisition system but to consider how this program history might be perceived by others. Most people looking at this history of cost estimates would be suspicious. It appears to be a systematic attempt to produce unrealistic estimates in order to sustain the program. Was the program manager producing such low, and inaccurate, estimates that he was—in effect—lying?
What is the appropriate test? The program manager would argue that, even though the estimates were extremely inaccurate, he had evidence to support each estimate and he believed each estimate. He could provide explanations in retrospect about why each estimate proved inaccurate. In effect, he is arguing: "It's not a lie unless you can prove that I'm wrong when I make the statement and that I know I'm wrong"—which reflects Bok's definition of lying as an intent to mislead. If the program manager is sincere, then there is no intent to mislead. However, a critic would argue that this standard is too low, that the estimates were so rapidly out-of-date and so often revised upward that a reasonable person would have known that they were probably wrong. Using the low estimates, therefore, constituted a lie. As Bok notes, "truthfulness can be required even where the full 'truth' is out of reach."8 The existence of great uncertainty does not suspend a notion of truthfulness.
Are there situations where literal truth is not expected?
The shoppers settled on a pair of vases and three long-spouted Arab coffee pots.
“How much?” they asked.
“These treasures are 300 years old, I swear,” said the merchant.
“Nine pounds sterling and not a grush more,” said the shopper.
They heard wails of anguish. The merchant had a large family to feed. His kind heart was being taken advantage of. “Thirteen pounds,” he said.
“Twelve, and that’s final.”
The shop owner sobbed that he was being cheated. He was putty in the hands of these clever women. “Twelve and a half.”
“Deal.”9
Clearly, there are human activities where literal truth is not expected, for example bargaining and negotiations. No one shopping in a bazaar, for instance, would consider himself deceived. In business negotiations, these departures from the literal truth are sometimes called "strategic misrepresentations." Are there analogues in the national security business?
One example comes from the early days of the F-14 Program, when Grumman and McDonnell Douglas were
competing to design and produce a successor to the failed Navy version of the F-111. Grumman's version was favored by the Navy, but its cost was far above McDonnell Douglas's. To be more competitive, Grumman cut its cost estimate (but not its costs) substantially and got the contract. A senior Navy officer was rumored to have said that the final version would probably cost twice the estimate, but by then it would be too late to stop the program. Later, when large cost overruns appeared, Grumman claimed that the Navy had encouraged its buy-in and was fully aware of its cost manipulation. The Congress and Office of the Secretary of Defense, however, felt deceived.
One former Chief of Naval Operations addressed this process: “You know, the wordage that goes on to justify program[s] in Congress are always words of art, to get the damn thing through.” Yet, however common the practice, should outsiders, such as members of Congress, feel deceived? If the justification of major systems is really a ritual, can anything a senior official says be believed?
Here are elements of a ritual game. The service has a high-priority program. Its people, after all, are the ones who will actually have to use the weapons in war, and it has years of experience in the warfare area. While its knowledge and insight are imperfect, it knows better than anyone else what is needed, so it starts a program. It makes optimistic estimates, because it has faith in its ability to accomplish difficult tasks and because higher estimates might jeopardize the program. The estimates are, in any case, highly uncertain, and one guess is as good as another. As one cynical observer noted, "Press releases and most other institutional statements are innocent acts of faith, not to be denounced as lies just because they prove to be totally false."10 Eventually, however, technical problems force substantial revision in the early estimates. The Congress and the press then express shock and concern, but it is too late to cancel the system. This is a repeating dilemma that surprises no one.
From another perspective, however, this practice looks quite different, as though the service is being less than honest, indeed deceptive. Its estimates are systematically wrong in a way that benefits the service’s self interest. The practice looks parochial and self-serving.
How much "spin" can you put on a story before it becomes a lie?
The test flight, which was "prematurely terminated" by the crash on takeoff, [was reported] as being 80% successful—successful checkout, successful start-up, successful taxi, and successful ground run all before the disastrous liftoff.11
“Spin” is a Washington word that has entered the public vocabulary, defined as “the deliberate shading of news perception; attempted control of political reaction.”12 Certainly, there is nothing wrong with putting one’s own interpretation on a set of facts. Indeed, in a society that values persuasion and advocacy, this is beyond acceptable; it is commendable. But can “spin” go too far?
Consider the following example: In January 1963 at Ap Bac in the Mekong Delta, 2,500 troops of the 7th
ARVN Division, traveling in armored personnel carriers and supported by artillery and tactical aviation, planned to surprise and trap a force of 300 guerrillas. Instead, the ARVN themselves were surprised, and in the confused battle that followed they took many losses. They failed to close with and destroy the enemy, shelled friendlies by mistake, and lost five helicopters. After the guerrillas escaped, the ARVN occupied the guerrillas’ former positions. When asked about the battle, General Paul Harkins, U.S. Army, the senior U.S. adviser, said, “I consider it a victory. We took the objective.”
What Harkins said was not inaccurate. Some tenets of military theory hold that the winner of a battle is the side possessing the field at the end. The South Vietnamese clearly took their geographical objectives and held the field at the end of the battle—ergo, they won—so Harkins was on safe ground. Or was he?
Critics David Halberstam and John Paul Vann both witnessed the battle and have held Harkins up as a fool in postwar literature.13 To them, the battle was clearly a defeat, and Harkins was covering up. On the other hand, Halberstam and Vann both had strong political agendas, so their interpretation need not be binding. What would a reasonable person think? Could Ap Bac be considered a victory?
Most observers would probably see many attributes of defeat in the battle: failure of the South Vietnamese plan, heavier casualties despite greater firepower, and escape of the enemy. Harkins's analysis—his spin—may have been accurate, but was it true?
Here we see a tension between truth and accuracy, which relates to the veracity of individual elements. Harkins's statement—and others like it—contain elements that are subject to objective tests. Such tests could establish whether these elements were accurate—or at least accurate in the eyes of an objective person. Truth, however, is different, relating not only to accuracy of the separate elements but also to the import of the statement taken as a whole. Every statement leaves an impression that could be deceptive even if the statement's elements were accurate. Therefore, just because something is accurate does not mean that it is true.14
A classic example here is the 1984 testimony of CIA Director William Casey on mining the harbors of Nicaragua. He is reported to have told the House Intelligence Committee that, "No, the CIA was not mining the harbors of Nicaragua." Later it emerged that the CIA had mined the harbor entrances and roadsteads, though not the inside of the harbors themselves. In response, Director Casey felt that because his statement was accurate, he had told the truth. However, the import of the statement was just as clearly deceptive, and the congressmen on the committee felt they had been lied to.

A more current—though less egregious—example might also be useful. On 30 August 1993, U.S. Army Rangers in Somalia executed a predawn raid on a suspected enemy command center. The Rangers blew down walls, broke through doors, and captured a number of people. These people were handcuffed and treated—at best—roughly, perhaps abusively. Later, it was discovered that the captives were employees of the United Nations and of a French relief agency, and they were released. In response to media questioning, an Army spokesman said: "[Are we] embarrassed? Hell no. We did a precision operation, and we did it flawlessly."15

It is clear what the Army spokesman meant. The operation, as planned, had called for extremely complex actions, and the forces executed these actions as planned. Just as clearly, however, capturing U.N. employees had not been planned. In this case, the situation was so obvious that no one accused the spokesman of trying to lie by giving the story his desired spin; he just looked foolish. But the statement does illustrate another definition of "spin": "Spin" is a lie no one really believes, or is ever supposed to believe.16 If "spin" consists of "innocuous" lies, how deeply do military officers want to get involved?

Does secrecy entail lying?

"When I use a word," Humpty Dumpty said, "it means just what I choose it to mean—neither more nor less."

"The question is," said Alice, "whether you can make words mean so many different things."

"The question is," said Humpty Dumpty, "which is to be the master, that's all."17
For most people, there is a difference between secrecy and lying: Lying involves a deliberate act of untruthfulness; secrecy involves only a passive omission. However, it is also true that secrecy is close to lying in the sense that it entails a lack of candor, a compromise with full disclosure. There is also the same problem with differing perceptions. As Sissela Bok argues in Secrets,18 secrecy divides the world into insiders and outsiders. Insiders—those who know the secrets—are comfortable in their special knowledge and convinced that proper controls and accountability are being exercised. Those on the outside, however, are often suspicious and feel betrayed. How do these arguments apply to military officers operating in the real world?
One place is with spokespeople. Differing levels of secrecy and disclosure often mean that the person making a statement may say something that is to them absolutely truthful—but is in fact false. The classic example here is the experience of the U.S. Ambassador to the United Nations, Adlai Stevenson, who vigorously denied in 1961 that the United States was involved in the pre-Bay of Pigs air raids. To his knowledge, such a denial was absolutely true.19 Unfortunately, however, he was making a very public, official lie that was soon exposed. Consequently, he felt betrayed, and the United States looked both foolish and untrustworthy.
This is a common experience in both the intelligence and acquisition communities because of the large growth in compartmented "black" programs. This situation raises interesting and difficult questions. In a situation such as Stevenson's, who is lying? Is it Stevenson? But he told the truth as he knew it. Is it an official senior to Stevenson who knew the truth? But such an official never said anything personally. How can one have lied who did not say anything? The responsibility was diffuse.
However, to the listeners the situation was clear: The United States lied.
Clearly some secrecy is needed, but does that mean that every official statement must have an implicit footnote: "The whole truth is available only at a higher level of classification"?
Cover stories fall into the same category as uninformed speakers. To preclude having to divulge information that might damage national security, a substitute story—i.e., a lie—is prepared. Cover stories are now apparently officially sanctioned in DoD,20 and this raises an interesting question: Is it acceptable for a military officer to say something he knows is untrue if he has official sanction to do so? If so, can he ever really be trusted thereafter?
This issue of official falsehoods came up dramatically after the Cuban Missile Crisis, when the DoD spokesman, Arthur Sylvester, put out information he knew was untrue in order to let the President return to Washington without arousing suspicion. Later Sylvester defended his action: "I think the inherent right of the government to lie to save itself when faced with nuclear disaster is basic."21 The press attacked him vehemently and argued that the government would be extremely tempted to use the power to lie in areas involving stakes far short of nuclear war, but involving political expediency—which was easy for politicians to confuse with national security.
This is a dilemma for a military officer. Are there times when "officers are required to do things that they believe to be intrinsically immoral, which their personal morality forbids, to achieve the 'higher duty' of their official responsibility to 'accomplish the mission' or to 'spare their soldiers'?" Must they "sacrifice their selfish desire for 'moral purity'?"22
One way out of this dilemma is a policy to "neither confirm nor deny," the longstanding practice on inquiries about nuclear weapons—the diplomatic equivalent of "no comment." It provides no information, but it gives no cause for suspicion either. It ensures secrecy without having to lie.
Does this concern about truthfulness really matter?
Not if military officers see themselves as advocates for their services and programs. Not if they value bureaucratic winning above all. (In which case they are like lawyers, free to make any statement, select any facts, make any argument as long as they do not break a law. But military officers must then sink to a level of public trust equivalent to that of lawyers, who rank near the bottom of professions for credibility.) However, if military officers want
more; if they want to be perceived as following a high moral standard; if they want to be believed when they speak; if they want “special trust and confidence” to mean something; then truthfulness—more precisely, the perception of truthfulness—matters greatly. It is not enough to be morally certain of one’s actions; one also must convince others of this moral certainty.
S. L. A. Marshall addressed the issue of truth half a century ago:
[A military officer] has veracity if, having studied a question to the limit of his ability, he says and believes what he thinks to be true, even though it would be the path of least resistance to deceive others and himself.23
This sets a high and honorable standard, but it is not enough for an officer to satisfy himself of his truthfulness. That decision rests with his listeners. If they think he is lying, his own convictions may count for nothing.
And that is the point: Truth is a matter of perspective. We ignore the perspectives of others at our peril.
1Fred Hiatt, "The Guys in the Field Should be Running the Pentagon," Washington Post Weekly Edition, 15 September 1986, pp. 23-4. For a harsher view, see James G. Burton, The Pentagon Wars (Annapolis: Naval Institute Press, 1993).
2For instance, Richard F. Kaufman, The War Profiteers (New York: The Bobbs-Merrill Company, 1970), pp. xvi-xvii.
3For examples of polemics by outraged political scientists, see Hannah Arendt, Crises of the Republic (New York: Harcourt Brace Jovanovich, 1972); or Peter McClosky, Truth and Untruth (New York: Harcourt Brace Jovanovich, 1972). For examples of outraged journalists, see Joseph Berger, Nothing But the Truth (New York: John Day Co., 1971); Anything But the Truth: The Credibility Gap; and David Wise, The Politics of Lying, 1973.
4Sissela Bok, Lying: Moral Choice in Public and Private Life (New York: Pantheon Books, 1978).
5Lloyd J. Matthews and Dale E. Brown, The Parameters of Military Ethics (McLean, Virginia: Pergamon-Brassey's, 1989); James C. Gaston and Janis Bren Hietala, Ethics and National Defense: The Timeless Issues (Washington, DC: National Defense University Press, 1993).
6For a recent discussion of these cultural problems, see General Accounting Office, Weapons Acquisition: A Rare Opportunity for Lasting Change (GAO/NSIAD-93-15, December 1992).
7Carl H. Builder, Jr., The Icarus Syndrome (RAND, 1993).
8Bok, p. 13.
9Leon Uris, Exodus (Garden City, New York: Doubleday, 1958).
10Robert Kharasch, The Institutional Imperative, p. 83.
11Norman Augustine, Augustine's Laws and Major System Development Programs (New York: American Institute of Aeronautics and Astronautics, 1982), p. 125.
12William Safire, Safire's New Political Dictionary (New York: Random House, 1993), p. 740.
13John Paul Vann, as reported in Neil Sheehan, A Bright and Shining Lie (New York: Random House, 1988); David Halberstam, The Best and the Brightest (New York: Random House, 1969). Also see Gen. David Palmer, USA (Ret.), Summons of the Trumpet (San Rafael, California: Presidio Press, 1978), pp. 27-38.
14See, for instance, Art Spikol, "The Truth, the Whole Truth ... Is Omitting an Important Fact the Same as Lying?" Writer's Digest, August 1989, p. 14.
15As reported in The Washington Post, 31 August, p. 1. To be fair to the Army spokesman, his full statement may not have been so one-sided.
16Michael Kinsley, "Spin Sickness," The Washington Post, 10 November 1993, p. A27.
17Lewis Carroll, Through the Looking Glass (New York: C. N. Potter, 1960).
18Sissela Bok, Secrets (New York: Vintage Books, 1983), p. 110.
19This is the conventional wisdom. Some observers believe that Stevenson had some knowledge about the invasion plans. See David Wise, The Politics of Lying (New York: Random House, 1973), p. 37.
20Scientific American, October 1992, p. 20.
21Saturday Evening Post, 18 November 1967, pp. 10, 15.
22Capt. Thomas J. Begines, "Special Trust and Confidence"; James C. Gaston and Janis Bren Hietala, editors, Ethics and National Defense (Washington, DC: National Defense University Press), p. 11.
23S. L. A. Marshall, The Armed Forces Officer (U.S. Government Printing Office, originally published 1950).
Colonel Cancian is assigned as Operations Officer, 4th Civil Affairs Group. He works in the Office of the Secretary of Defense.