What is systems analysis? The only way to arrive at logical decisions? A useful method to prepare the arguments needed to support a chosen position? A tool employed by the Office of the Secretary of Defense to delay, defer or disapprove necessary programs? Devotees of systems analysis, like most experts, tend to confuse the layman with technical jargon and massive statistics. As in most contests, the proponents and opponents often suspect each other's motives. These attitudes are symptoms that fundamental problems exist, and the existence of these problems indicates the need for a closer understanding of the purposes and levels of systems analysis in naval matters.
Since 1960, systems analysis has occupied a position of increasing importance in the attainment of Navy objectives. It is used to identify these objectives, to determine the force levels required to achieve them, to ascertain the capabilities which these forces must have and the best way to provide these capabilities. While analysis has always been involved in making these decisions, the depth, extent, and discipline of systems analysis as now practiced represent a significant change, the effects of which are being felt at all levels of Navy management and command.
The importance of this change is evident in the increasing number and size of organizational elements assigned to some phase of systems analysis, and in the recent establishment of a flag billet, in the Office of the Chief of Naval Operations, to serve as a focal point for the CNO’s systems analysis needs.
A simple definition of systems analysis might describe it as merely a formalized approach to problem-solving. There is truth in this statement, but the approach is often far from simple. Particularly when applied to military problems, the combination of judgment factors and complex mathematical treatments of data can result in a highly involved process. This process is further complicated by the participation of military officers who do not adequately appreciate the capabilities and limitations of the analytic methods used; and of analysts, both military and civilian, who do not adequately understand the importance and effects of experienced military judgment in the analytical process.
One of the fundamental problems encountered with systems analysis is the application of the right method to answer the wrong question. When the results of the analysis are unacceptable, it is often the analytical process which is discredited rather than the formulation of the problem to which the process was applied. It is in this formulation, or question-framing phase, that the close and effective collaboration of military judgment and analytical methodology is vital to the successful outcome of the analysis. For example, a study which does not permit the enemy to use his optimum strategy or tactics cannot produce reliable answers as to what our strategy or tactics should be. This mistake has been made, and will happen again, unless the above collaboration is consistently achieved.
Another aspect of systems analysis which tends to be confusing is the use of that term in dealing with problems at different levels of military interest. Systems analysis describes the process used to identify the best choice between various strategies; the best combination of forces to achieve the desired strategic objective; the best combination of weapon systems with which to equip these forces; and the best selection of hardware components with which to build the weapon systems. Because the differences and the dependencies between the various levels are not clearly understood, we find top-level managers spending their valuable time on studies which really should affect decision-making at lower levels. This misuse of talent extends to the technical experts, who are called in to validate the basic data for high-level studies, rather than devoting their time to ensuring that the data used in the lower level, more technical studies are truly correct, thereby providing the proper basis for inputs to the higher level studies. Another example of this confusion of purpose is found in the case of lower-level analytical studies expanding to involve questions of higher-level strategy and tactics.
The analytical studies which support the various levels of naval problems can be differentiated in terms of the types of questions which need to be answered before valid weapon system requirements can be stated. These are:

• What tasks need to be performed, and with what effectiveness?

• What performance is required of systems in order to accomplish these tasks with the required effectiveness?

• What should be the composition of the system(s) to provide this performance?

The answers to the first question should be provided by Force Level Analysis. These studies start with national objectives and estimates of enemy threats to the objectives, analyze the capabilities of different types and quantities of forces to meet those objectives, and, in so doing, define the tasks which the optimum combination of forces must perform. Such studies are characterized by the play of large forces covering wide and varied geographical areas and involved in the attainment of significant strategic objectives. The units of the forces should be described in such terms as Kill Probability (PK) and Availability (A). An example of a resulting task definition might be the identification of a particular barrier operation and the desired attrition of enemy forces which encounter it.

The second level of analytical studies involves Encounter Analysis, which starts with the descriptions of tasks provided by the Force Level Analysis, and develops the descriptions of performance which systems must provide in order to accomplish these tasks. These studies usually involve the play of small groups of forces in specific tactical situations and environments, opposed by realistic and capable enemy forces. The units of the forces should be described in terms which represent their systems performance parameters, such as Detection, Localization, Attack and Kill probabilities, plus Availability and Utilization. Availability describes the effects of system breakdowns and outages for repair, usually by the formula:

A = MTBF / (MTTR + MTBF)

where

MTBF = mean time between failures
MTTR = mean time to repair

Utilization describes the degree to which the system's expected performance parameters can be achieved in practice, and depends upon the effectiveness of the command and control of the weapon system. Even though a system is capable of providing a certain level of performance, breakdowns or delays in internal communications, for example, may degrade this capability. This level of analysis examines the capabilities of different systems, and combinations thereof, to perform the tasks (e.g., air, surface, submarine).
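The availability formula is simple arithmetic, and a short calculation can illustrate it. The MTBF and MTTR figures below are hypothetical, chosen only to show how the ratio behaves:

```python
# Availability A = MTBF / (MTTR + MTBF), as given in the formula above.
# The failure and repair figures here are hypothetical, for illustration only.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Fraction of time a system is up, given its mean time between
    failures (MTBF) and mean time to repair (MTTR)."""
    return mtbf_hours / (mttr_hours + mtbf_hours)

# A system that runs 300 hours between failures and takes 20 hours
# to repair is available 300/320 of the time.
a = availability(300.0, 20.0)
print(round(a, 4))  # 0.9375
```

Note that availability depends only on the ratio of repair time to time between failures: halving MTTR improves availability exactly as much as doubling MTBF.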
Having derived from this analysis the description of the desired system performance, the third level of systems analysis, Unit Effectiveness Analysis, can now investigate all the possible combinations of subsystems in order to arrive at a final definition of the composition of the required system. This analysis involves one complete ship or aircraft, treated as a system, operating in one or more representative tactical situations, and defined in terms of fairly detailed technical descriptions of the performance capabilities of its candidate subsystems, i.e., sensors, weapons, command/control, and countermeasures. The results of this analysis can provide the answers to the questions as to which subsystems are most cost-effective, and, most importantly, which combination of subsystems will form the best weapon system for accomplishment of the tasks assigned.
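The subsystem-selection step described here can be sketched as a simple enumeration under a cost ceiling. Every name and number below is invented for illustration; a real Unit Effectiveness Analysis would derive such figures from the tactical situations discussed above, and treating sensor and weapon effectiveness as independent factors is an assumed model, not a method prescribed in the text:

```python
from itertools import product

# Hypothetical candidate subsystems as name -> (cost, effectiveness).
# All names and figures are invented for illustration only.
sensors = {"sensor_a": (4.0, 0.70), "sensor_b": (6.0, 0.85)}
weapons = {"weapon_x": (3.0, 0.60), "weapon_y": (5.0, 0.75)}

budget = 10.0  # total cost ceiling for the combination

best = None
for (s_name, (s_cost, s_eff)), (w_name, (w_cost, w_eff)) in product(
        sensors.items(), weapons.items()):
    cost = s_cost + w_cost
    if cost > budget:
        continue  # this combination exceeds the budget
    # Assumed model: detection and attack are independent steps, so
    # combined effectiveness is the product of the two scores.
    eff = s_eff * w_eff
    if best is None or eff > best[2]:
        best = (s_name, w_name, eff, cost)

print(best)  # the most effective affordable combination
```

Even this toy case shows the point of the analysis: the most capable sensor paired with the most capable weapon is unaffordable, and the best affordable combination is not obvious by inspection.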
Of these three, we find many examples of the first and the last types. We do not find many Encounter Analyses. Rather, we observe that the other types of analysis tend to expand beyond the scope related to their defined problem by dealing with questions which are properly handled in an Encounter Analysis. This can result in inadequate treatment of both types of questions. More often, however, the comparative analysis of the Encounter type is simply omitted, and the Unit Effectiveness Analysis is based upon assumptions which are not supported by rigorous analysis.
U. S. Naval Institute Proceedings, October 1968

From the above discussion, we see that, to arrive at the decision which selects a particular weapon system for development, the analysis must start with a clearly stated description of the tasks to be performed and the system performance required to accomplish these tasks. This may sound straightforward but, in practice, it becomes quite difficult. Military men believe strongly in having considerable flexibility in the performance of ships and aircraft in order to be able to cope with unforeseen contingencies. They are loath to quantify these contingencies, particularly with weighting factors which tend to assign levels of importance, since they sincerely doubt anyone's ability to see clearly into the future. This belief tends to lead them to the conclusion that the next generation of the weapon system or vehicle under consideration must be better in every respect than the latest model now in service or in production. This conclusion, in turn, is usually strongly supported by the material developers, who are eager to exploit the most recent advances in technology by initiating new systems development programs.
Unfortunately, this “best of all possible worlds” yearning is unrealistic, since the resources of talented men, adequate facilities, and available finances are never sufficient to meet all the requirements so generated. The managers responsible for the allocation of these resources turn to the analyst for help in making the hard choices. This usually results in the definition of a system capability less than that which had been stated as a requirement by the military, and the analyst catches the blame.
Another element of unreality often arises from an inadequate appreciation, by both producer and operator, of the true effects of the varying environments on the performance of the proposed weapon system. This is particularly true of systems which inhabit the three-dimensional oceans, whose characteristics are not yet fully known, and whose effects are not well understood. Yet, we have pressed on valiantly to produce systems which exploit the seeds and germs of concepts whose critical limitations cannot yet be described.
The only sensible way out of this unsatisfactory situation is for those responsible for the generation of military requirements to recognize that they cannot have everything, that a decision has to be made, and that, of all the parties involved, they are the best qualified to provide qualitative judgments on future military capabilities. They must then be willing to inject these judgments in usable form into the analytical process; be knowledgeable enough to understand its strengths and weaknesses; and be courageous enough to accept the results as the best available basis for decision-making. This turns out to be a very large order, indeed.

A graduate of the U. S. Naval Academy with the class of 1942, Captain Bishop's service in submarines includes duty in the USS O-7 (SS-68), USS Piranha (SS-389), and Segundo (SS-398) (1943 to 1947) and command of the USS Barb (SS-220), Baya (AGSS-318), and Bashaw (SSK-241) (1952 to 1955). He served in the Office of Naval Research's Undersea Warfare Branch from 1955 to 1958 and in the Submarine R&D Branch, OpNav, from 1960 to 1963. He commanded the USS Witek (EDD-848) (1958 to 1960) and the USS Arneb (AKA-56) from 1963 to 1964. He was assigned to the ASW Systems Projects Office from 1964 until 1967, when he assumed command of Amphibious Squadron Seven.
Let us assume that all the players in this vital game understand it, and each other, completely. Success is not yet assured. The validity and consistency of the data used in the various levels of analysis remain a problem. Examination of recent weapon systems analyses reveals that some use performance data derived from equipment specifications; some take their data from OpTEvFor evaluation reports; and all too few use data which are truly representative of the performance of the equipment in the Fleet. We find also that the numbers used to describe Availability are derived from a similar variety of sources; and the complexities involved in assigning numerical values to Utilization often constrain the analysts to calling it Unity (1), with qualifying clauses.
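Why calling Utilization unity matters can be shown with a short sensitivity check. The nominal kill probability and factor values below are hypothetical, and combining the factors as a simple product is an assumption made for illustration, not a model prescribed in the text:

```python
# Hypothetical nominal single-shot kill probability, degraded by the
# Availability and Utilization factors discussed above.  Treating the
# effective figure as the product nominal * A * U is an assumed model,
# used here only to show the cost of optimistically setting U = 1.
nominal_pk = 0.80
avail = 0.90

for utilization in (1.0, 0.9, 0.8):
    effective_pk = nominal_pk * avail * utilization
    print(utilization, round(effective_pk, 3))
```

Under these assumed figures, each tenth knocked off Utilization removes roughly seven points of effective kill probability, so an analysis that assumes Unity overstates the system by the same margin.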
Some recent and notable improvements in this untidy situation have been made, particularly in certain aspects of ASW weapon systems analysis. Carefully planned and executed exercises have produced valuable and consistent performance data. Stringent contractual requirements for equipment testing have produced high-confidence-level reliability data. Application of comparable top management understanding and concern to the acquisition of fleet maintenance data should strengthen the credibility of Availability data, which involves not only the performance of the enlisted maintenance technician under operational conditions, but also the efficiency of the logistic support system. Widespread application of these successful measures is needed to provide the proper data base for systems analyses.
Valuable and necessary as these efforts to obtain credible performance data most certainly are, we must not delude ourselves that all uncertainties can be removed from the analysis. We must always keep in mind that the purpose of the analytical effort is to reduce the uncertainties for the decision-maker. When the uncertainties are removed from a problem, the decision has, in truth, been made. In the complex world of military systems problems, the probability of eliminating all uncertainty converges on zero.
In our efforts to improve the "systems approach" to military decision-making, we would be inconsistent if we did not treat the total analytical process itself as a system. That is, each of the three levels of analysis described above, plus the supporting analyses conducted at the subsystem level, should be understood as elements of, and operated together as, a system. This requires a clear definition of the functions of each level of analysis and a clear description of its inputs and outputs, thus describing the interfaces between levels. When this is accomplished, we should be able to operate the "Weapon System Analysis System" in both directions. That is, we may generate the need for new equipment capabilities by identification of changes in strategic missions and tasks, and conversely, generate changes in strategy and tactics through the identification of advances in weapon systems technology. The shipboard air defense systems are examples of the former; the sea-based deterrent strategy is an example of the latter. The success of these examples should encourage us to make the "system" universally applicable. Only when the three levels of systems analysis are understood and operated as a "system" can we be assured that the necessary consistency exists in the data used, and in the inputs and outputs of the individual levels.
Those involved in the application of systems analysis to military problem-solving would do well to predicate the prosecution of the analysis upon careful determination of the answers to the following questions:
• Does the definition of the problem clearly identify it with one and only one level of analysis? (If not, the problem definition is too complex.)
• Does the expected credibility of the results justify use of expensive methods of computation? (This requires hard evaluation of the uncertainties in data and tactical options.)
• Are the needs for insertion of operational judgment factors clearly identified? (If not, the hazard of false assumptions is greater.)
• Is it clear what the nature of the output of the analysis should be in order to facilitate decision-making? (If the sensitive factors in the problem are not covered adequately, significant uncertainties can be overlooked.)
• Do we really know how the output of the analysis will be used? (Post facto realization of the actual decision affected by the analysis is sometimes a shock to the analyst.)
Modern methods of systems analysis are here to stay in the world of military decisionmaking. The value of their scientific contribution to this ancient military art is a function of the professionalism we employ in the application of systems analysis to the great variety of military problems we continually encounter. This professionalism requires that the application of analytical methods to all levels of military problems be considered as a system; that the methods be matched to the problems being considered; that consistency be maintained throughout all levels of analysis; that the capabilities and limitations of both military judgment and mathematical method be recognized and understood; and that both military operators and civilian analysts accept and appreciate each other’s role in accomplishing the desired objective— to obtain for the responsible decision-maker the best possible evaluation of the alternative solutions to the problem.
★