During the past five years, naval officers have found themselves caught up in “studies.” An officer reporting to OPNAV for duty will discover that instead of being slipped neatly into a slot in the military bureaucracy, with historical antecedents of outstanding occupants who had gone on to bigger and better things, he is, instead, handed a horrendous question signed by the Vice Chief of Naval Operations, and told to go off with some scientists and get on with the analysis.
He then discovers that his work has significant impact on Navy matters which involve numerous high-ranking officers and officials in the Pentagon, that the scientist members of his team are a new and strange breed with whom he has to learn how to talk, and that a large number of his Navy contemporaries are engaged in the same form of activity.
Officers of higher rank suddenly find that many of the problems of which they are officially “mother” are being looked at by a study group, and that what this study group does and says will have a lot to do with decisions in their particular bailiwicks.
Finally, officers in the Fleet find themselves directing an increasing share of their work toward producing data from fleet operations and exercises for use in studies and implementing major decisions from DOD that seem to be the result of such studies.
What has brought this about? Is there a procedural revolution occurring in the “decision-making process”? What is going on and why?
Study and analysis are not new to military decision-making. The staff study is as old as warfare itself. No responsible commander ever made a decision without deciding through some analytical process what the outcome would be. So, if the question is asked, “why all the emphasis on studies?” the answer cannot be that we are, for the first time, taking a rational look at our problems, although some superficial observers would have it that way. The answer is somewhat more sophisticated than that, and appears to have at least three elements.
First, the problems faced by the decision maker are much more complex.
Second, the national resources continuously allocated to defense are enormous—the demand for them is even greater.
Third, there are at hand new tools to assist us, or stated in other terms, we have been directed to solve our problems by using new methods.
The first point hardly needs elaboration. Our technology has provided us with a bewildering array of devices for using and controlling force—and it has set loose a great host of political issues which are potential sources of armed conflict.
With respect to the second point, the defense budget speaks for itself. We are constantly reminded that this is a “sector” of our national economy—58 billion dollars is not an intangible; it is 8.5 per cent of our total output of goods and services.
The proper management of the budget is essential to our economic health, to our survival in a cutthroat competitive world in which economic forces are now recognized as primary determinants of national power.
The third point deserves more consideration in the context of this essay. The new tools and methods are those of economic analysis.
As explained by its advocates, the rationale behind this new approach is as follows: we have, in the Free World, a marvelous and successful mechanism for deciding how to allocate resources in the private sector—what to design, what to build, what to spend our money for—business. You design and build what will sell. If you don’t, you are out of business. According to the press, even the Soviets are admitting this function of a free market, and their government is creating one, using it to help them make decisions about production, because they have no private sector.
We have no such mechanism for judgment about defense expenditures. In a sense, the democratic electoral system provides a guide for public spending, but only in a general sense. If taxes are too high, the administration is voted out of office. If the national defense is so obviously inadequate that the man in the street can perceive it, the same thing happens. This does not help the government in specific terms. For defense alone, it must resolve the complexities of allocating nearly 10 per cent of the total national output of goods and services by some other means.
The joining of the academic mind to practical problems of government in this country, certainly one of the main events of our lifetime, brought economists into defense, and the enormity and complexity of the resource allocation problem led them to reason among themselves. Their classic problem had been that of maximizing the output in goods and services that could be obtained from the limited resources of a given economic system. The only attention given to government expenditure had related to its legality and general propriety. In retrospect, it seems surprising that efficiency criteria in the expenditure of public funds seem to have been neglected.
In looking at the history of the application of economic theory to defense expenditure, one can be more specific about mileposts. The opportunity for economists to apply their analytical tools to defense arose almost accidentally. The U. S. Air Force was engaged, with its sister services, in the constant struggle for a larger slice of the defense allocation. It also wanted to execute its established missions more effectively. This led it to support the Rand Corporation.
The economists at Rand discovered a new application for their tools of analysis. They discovered that alternative military choices could be structured by using the efficiency criteria of economics. By combining economic principles with the scientific and engineering processes involved in questions of technical feasibility, and with so-called operations analysis, which the U. S. Navy had pioneered, a methodology was created to which a new descriptive term, systems analysis, was applied. This method embraced a wide spectrum of academic disciplines—economics, the physical sciences, mathematics, and others. It is interesting to dissect the term—systems, because research typically centered on weapons systems and supporting systems for weapons—analysis, because to be objective and thorough in the pursuit of accuracy is to be analytical. There continues to be considerable misunderstanding of the systems analysis approach, some of which may result from its name tag, which overlooks the dominant role of economic principles.
The people who employed this methodology were called systems analysts, and concerned themselves with making choices in an environment characterized by limited resources and unlimited demand. The method resolved a problem along the following typical pattern:
• Objective (what issues is the analysis to answer?)
• Alternative choices (into what options can the resolution of the problem be structured?)
• Costs (what are the costs in resources of each option?)
• Criteria (what are the valid criteria other than cost to be applied?)
A glance at this ideal structure will show that each of the four elements covers a host of issues. The present state of analysis finds the contentious aspects of this approach in its substructure—how to cost a decision properly; how to arrive at valid criteria; how to insure that the range of alternatives is of proper breadth—rather than in the application of the method in principle. For five years now, the U. S. government has used some degree of systems analysis in allocating a major portion of the defense budget. The techniques have spilled over into other areas of government, and are spreading to the larger corporations, universities, and state and local governments. Systems analysis finds a home in all those areas where the market forces which dominate a private competitive economy are lacking. We should harbor no illusion that we have a choice about applying systems analysis to our problems in the Navy. Systems analysis will disappear if and when the problems disappear. Its proper application, then, is our immediate concern.
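The four-element pattern above can be sketched, in modern terms, as a toy cost-effectiveness comparison. Every name and figure in this sketch is hypothetical, invented purely to show the shape of the method, not drawn from any actual Navy study.

```python
# A toy illustration of the four-element pattern: an objective,
# a set of alternative choices, their costs in resources, and a
# criterion beyond cost alone. All names and figures are hypothetical.

def best_alternative(alternatives, budget):
    """Objective: pick the alternative with the greatest measured
    effectiveness whose resource cost fits within the budget."""
    affordable = [a for a in alternatives if a["cost"] <= budget]
    if not affordable:
        return None  # no alternative satisfies the cost constraint
    return max(affordable, key=lambda a: a["effectiveness"])

# Alternative choices, each with a cost in resources and an
# (inevitably approximate) measure of effectiveness.
options = [
    {"name": "system A", "cost": 40, "effectiveness": 0.70},
    {"name": "system B", "cost": 55, "effectiveness": 0.85},
    {"name": "system C", "cost": 90, "effectiveness": 0.90},
]

choice = best_alternative(options, budget=60)
print(choice["name"])  # prints "system B": best effectiveness within budget
```

Note that the contentious substructure the essay describes lives entirely in the inputs: the sketch takes the costs, the effectiveness measure, and the range of alternatives as given, which is precisely where real studies find their hardest arguments.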
Viewed in practical terms, by those of us who labor in the great military bureaucracy for the things we believe in as naval officers, we analyze for the Navy. A program, to be funded, must be supported by analysis. We in the Navy have not exactly embraced this fact of life with cries of joy and enthusiasm. We are conservative by nature and by professional tenet. The terribly destructive ways of the sea and air have made us by nature cherish tried and tested means of executing our responsibilities. The world of systems analysis is a new environment. We have recognized its advantages. We are learning to navigate in it, and when you get right down to it, we are doing this because it makes sense. It makes very good sense that our Navy programs are the result of objectivity and clarity, i.e., analysis.
How is the Navy geared to handle the problem? At present, our organization has the general appearance of transition. The reason for this is that the programming, analytical approach is taken care of in a framework that parallels but does not penetrate, except informally, the already existing structure in OPNAV. We have one foot in the new and one foot in the old. A second reason for the transitory nature of the organization is that we have had to go outside the Navy for much of the work to support Navy programs. Many view this as unnatural. It was necessary to do this four years ago because the existing organization did not appear to have the “in-house” capability to handle the tremendous demand for analysis that had overwhelmed it with new concepts.
Two relatively new groups exist under the direction of the Director, Navy Program Planning. The Director of Naval Warfare Analyses coordinates the studies that are directed by the Chief of Naval Operations, programming study requirements and available analytical resources, representing the CNO at the Center for Naval Analyses, and insuring the application of professional judgment to analysis. The Director, Systems Analysis Group, relates existing analysis to Navy programs, provides a strong in-house capability for analysis too urgent or sensitive to be negotiated, and is a strong element in the defense analytical community.* The main analytical resources for the Navy are still to be found in the laboratories and systems commands, but these are not geared to conceptual analysis of the type required for force level issues.
The Navy faces real organizational issues resulting from this new way of looking at things. There are other problems, however, associated with it.
Systems analysis is so new as to be quite imperfect. Its high priest and arbiter, Dr. Alain Enthoven, has said so—in fact, he said it is about where medicine was in the 19th century. To many, the analogy rings true: the application of systems analysis to their problems has about the same taste and ultimate effect as castor oil.
Since systems analysis is mainly an intellectual process, criticism is a natural and essential sequel. To many people, the fact that such criticism is constant and telling deflates the significance of this work. It is a rare occasion, indeed, when two systems analysts can achieve absolute agreement.
Responsible officials to whom the results are applied complain that the results are invalid because of their imperfection. The systems analysts reply that their trade is more an art than a science. The decision-maker closes this argument by saying, “At least I have information that defines the problems and helps me make decisions—before, I had nothing.”
The systems analyst himself is just another man. After a period of fruitful work, he may become so entrenched in his organization and so saturated with institutional ideas that he has lost his independent viewpoint. The people qualified to do the work are few. They are basically committed to other disciplines, and for many men, systems analysis is but a short stage in their careers.
The problem of suggesting a course of action to someone already committed to another one tends to introduce all the difficulties and contradictions associated with value concepts, human behavior, and the communication of ideas. A purely technical or scientific expertise simply does not span these difficulties.
When study and analysis become organized, as is almost inevitable when rising demand meets limited analytical resources, Parkinson’s law takes over. The demand for analytical work is enormous. This means that the Navy must cherish its analytical resources and allocate work with care.
In a 1966 report designed to accompany the bill authorizing funds for the Defense Department, the House Armed Services Committee remarked that “. . . the almost obsessional dedication to cost effectiveness raises the specter of a decision-maker who . . . knows the price of everything and the value of nothing.” This remark touches on a very serious problem in the application of systems analysis to national security problems—the determination of valid measures of effectiveness and the associated issue of sensible criteria for choice.
Not only are national military objectives apt to be multiple, ill-defined, and complicated, but the measures of their attainment are likely to be inadequate approximations at best. For indicating the attainment of such vaguely defined objectives as deterrence or victory, it is even hard to find measures that point in the right direction. Consider deterrence, for instance. It exists only in the mind, and in the enemy’s mind at that. We cannot, therefore, use some “scale of deterrence” to measure the effectiveness of alternatives we hope will lead to deterrence, for there is no such scale. Instead, we must use approximations, such as the potential mortalities that might be sufficient if war were to come. Consequently, even if a comparison of two systems indicated that one could inflict 50 per cent more casualties on the enemy than the other, one still could not conclude that this meant the system supplied 50 per cent more deterrence. In fact, since in some circumstances it may be important not to look too threatening, one could even argue that the system capable of inflicting the greatest number of casualties provides the least deterrence.
One sensitive problem is that of control of studies. We arrived at the study approach so that large problems could be systematically examined free of bureaucratic constraints. It is conceivable, therefore, that a study might be adversely affected by over-control. If, by over-control, we mean forcing the answer or suppressing an unfavorable one, the risk seems minimal in the glare of attention given to studies by OSD (SA) and PSAC and our friendly sister services. On the other hand, in an area where it is so easy to go astray, firm guidance toward answering the stated problem is essential. Current effort tends toward getting study advisory committees more involved, rather than less, because we recognize the problem of the analyst who is neither equipped for nor oriented to judgment making, and because a study to be fruitful must get the officials with responsibility in its area on board.
Another, and obvious limitation, is that any analysis of current problems is necessarily incomplete. Time and money place sharp limits on how far any inquiry can be carried. Almost all studies must stop far, far short of completion, either for lack of funds, of time, or of justification for spending further funds or time on them.
One of the scarcest of our resources is study leadership. The combination of manager, conceptual thinker, synthesis producer, and practical expert required to handle the framework and content of a major warfare study is so rare that it is seldom found except in high management. Since an intellectual process such as a study always reflects to a high degree the personalities concerned, this, too, creates problems.
More important, however, is the fact that even with no limitations of time and money, analysis can never treat all the considerations that may be relevant. For example, how will some unilateral U. S. action affect NATO solidarity? Will Congress accept economies that disrupt cherished institutions such as the National Guard or radically change the pattern of domestic spending?
Considerations of this type can play as important a role in the selection of, for example, alternative military policies as any idealized cost-effectiveness calculations. But ways to measure these considerations even approximately do not exist today, and they can be handled only by identifying them and applying judgment.
The role of judgment is frequently misunderstood. In a problem as complex, for example, as antisubmarine warfare, judgment factors rather than applied mathematics will frequently predominate. At present, ASW study is characterized by broad areas of uncertainty and, therefore, by many factually unsupportable judgments, either implicit or explicit. For example, the ultimate employment of specific naval forces in time of war is unknown. To a large, but unknown degree, enemy responses will be influenced by the actions or forces we are prepared to direct against him. These uncertainties will always be with us and can only be handled by applying judgments to intelligence estimates. Improvements in methodology and the quality of operational data may temper and reduce our dependency upon judgments, but they will not eliminate it.
The problem is to conduct studies that reflect our best judgments, the criteria of the scientific community, and the criteria of OSD as to usefulness.
We have become familiar with the approach to problem solving that initially narrows a question in order to be able to solve it. In theory, systems analysis provides a means by which we can be more selective in narrowing the question. The many variables of a problem can be examined across a broad scope of plausible limits. Such variables as torpedo kill probability can be readily parameterized by such a process. The broad aspects of warfare and national strategy, however, do not lend themselves so easily to this method and, in practice, can be limited only by judgments which are tempered, and to some extent, qualified by analysis.
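The torpedo example can be made concrete with a small sketch. Assuming, purely for illustration, independent shots and an uncertain per-torpedo kill probability, one can sweep that parameter across plausible limits and observe how a simple salvo-level measure responds; the function name and every figure here are hypothetical, not drawn from any actual study.

```python
# Hypothetical sketch of parameterizing a single uncertain variable:
# sweep an assumed per-torpedo kill probability across plausible
# limits and watch a simple salvo-level measure respond.
# Independent shots are assumed; the figures are illustrative only.

def salvo_kill_probability(p_single, salvo_size):
    """Probability that at least one of `salvo_size` independent
    torpedoes kills, given a per-torpedo kill probability p_single."""
    return 1.0 - (1.0 - p_single) ** salvo_size

# Sweep the uncertain parameter across a plausible range.
for p in (0.2, 0.4, 0.6, 0.8):
    print(f"p={p:.1f}  salvo of 3 -> {salvo_kill_probability(p, 3):.3f}")
    # p=0.2 prints 0.488; p=0.8 prints 0.992
```

The product of such a sweep is the shape of the curve, not any single figure: a narrow variable like kill probability yields to this treatment readily, while the broad questions of strategy the essay goes on to discuss do not.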
A related problem is the capability to state strategy and tactics in a form useful to force level judgments. We are exploring these complexities. It is a slow, iterative learning process in which we are often confronted with the option of accepting unverifiable answers or none at all. Specifically, in time of peace, our national interests are served by many lines of communications which provide the transport of those commodities necessary to our economy. Our national strategy neither establishes which of these lines will assume primary significance in time of war, nor does it indicate the limits of our dependency on these lines. Because of such uncertainties, individual studies attempt to pursue those objectives which can be supported by our best judgments to enable us to perceive the limits of our present capabilities and provide a reasonable insight into future requirements.
The message is clear. Studies and analyses are an integral part of the decision-making process; thus, of crucial importance to the professional naval officer. Viewed in their proper light, studies and analyses are only a part of the decision-making process. Judgment and intuition are not replaced, but they are supplemented and strengthened. The process is not computerized decision-making—it is tough, contentious brain work, deserving of the best talent available and the closest attention of responsible officers.
*These two OpNav divisions were recently merged and became the Systems Analysis Division, Op-96, under the direction of Rear Admiral Elmo A. Zumwalt, U. S. Navy.