
Quantitative Analysis and National Security

Published online by Cambridge University Press:  18 July 2011

James R. Schlesinger
Affiliation:
University of Virginia

Extract

The long-time neglect of the military establishment by the economics profession at large is now in the process of being rectified. It hardly seems necessary to justify this awakening interest. Since resource allocation is fundamental in military problems, the economist, whose primary interest is the logic of the allocative process, is particularly well-qualified to make a contribution. To be sure, this contribution is purely formal. Economics helps to select and frame the questions that should be asked, but the data necessary for the decision-maker to make a choice must be supplied from elsewhere. Though economics may be lacking in substantive content, in that it tells how to choose rather than what to choose, nevertheless such clarification of the logic of choice is not useless to the decision-maker.

Type: Research Article

Copyright © Trustees of Princeton University 1963

References

1 Charles J. Hitch and Roland N. McKean, for example, in The Economics of Defense in the Nuclear Age (Cambridge, Mass., 1960), before devoting a major section of the book to the problem of achieving what they call “efficiency” in military decisions, treat the question of how much should be diverted to defense in one page of excellent though formal principles (p. 48). Though the authors make every attempt to avoid substantive judgments, the difficulty of avoiding discussion of how much to divert to defense is illustrated by a statement that refers to the postponing of “vital actions such as the dispersing and hardening of our deterrent force” (p. 54).

2 Cf. ibid., 105; though some qualifying remarks take note of interdependence, this is the procedure the authors follow. Arthur Smithies provides a brief discussion of the recent history of this procedural method, and a partial defense of it, in The Budgetary Process in the United States (New York 1955), 121–30, 237–40. In light of the political nature of the decisions that must be made, he seems to regard the practice as inevitable—and he expresses a certain sympathy not only for establishing ceilings, but even for the widely criticized Congressional meat-axe.

3 How commonplace is the usage of “operations research” and “systems analysis” as interchangeable terms may be gathered from any one of several standard sources; see, e.g., Operational Research in Practice, Report of a NATO Conference (New York 1958). Charles Hitch uses the terms interchangeably in “Economics and Military Operations Research,” Review of Economics and Statistics, XL (August 1958), 200, though mentioning that broader, longer-range operations research studies are frequently called “systems analyses.” Roland McKean may be credited with developing the association between systems analysis and “higher-level criteria” (see Efficiency in Government Through Systems Analysis [New York 1958], 9), though he also does not distinguish between the two on the basis of the type of problem with which each can cope.

In the present article, the position taken is that the term “operations research” should be confined to efficiency problems and “systems analysis” to problems of optimal choice, thereby avoiding the confusions that have existed. Unfortunately, the conventional use of the term “operations research” both to describe a technique that some experts feel should be limited to lower-level operational problems, and also to encompass all quantitative analysis, is confusing. I have attempted to differentiate by using “operations research” when the limited meaning is intended, and by using “operational research” as an inclusive and general term incorporating systems analysis.

4 Robert Dorfman, “Operations Research,” American Economic Review, L (September 1960), 577.

5 The conventional approach is to make estimates (guesses?) of the probabilities, the ultimate decision being based on probability densities. One must remember, however, that these are not large-number cases, but usually unique events. In addition, the significance of the Knightian distinction between risk, which can be calculated in advance, and uncertainty, which cannot be so calculated and for which the decision-maker (entrepreneur) must take upon himself the burden of deciding on the incalculable, has not been fully explored.

6 Herman Kahn comments: “About six or seven years ago there was a ‘technological breakthrough’ at The RAND Corporation in the art of doing Systems Analysis.” (On Thermonuclear War [Princeton 1960], 119.) Despite a certain flamboyance in the remark, the institutional pride is justified. Much of the pioneering work and much of the very best work in systems analysis has been done at RAND. Many of the criticisms that follow, regarding tendencies in systems analysis, do not apply to RAND work. RAND analysts have been particularly cognizant of the multiplicity of objectives. Kahn and Irwin Mann have criticized the emphasis on mathematical models. They have lampooned the diseases they refer to as “modelism … analysitis … and technocratism.” Albert Wohlstetter has dealt sharply with the use of “manipulatory techniques … cookbook fashion.” The fact that these men have felt it necessary to emphasize these points is, however, not without significance; it is suggestive of the tendencies and tenor of much of the work in systems analysis. Nevertheless, RAND studies have rarely been guilty of ignoring either the multiplicity of or conflicts among objectives. On the contrary, such studies have frequently served to counteract very simple (if non-mathematical) models in the minds of policy-makers. Not infrequently, it is the policy-makers themselves who come to believe that there is a single objective to be maximized, when this is not the case. In such circumstances, systems analyses may provide a corrective.

7 O. M. Solandt, “Concluding Remarks,” in “A Decade of Military Operations Research in Perspective—A Symposium,” Operations Research, VIII (November-December 1960), 857. Reference should also be made to Dorfman, 613–22, and to G. D. Camp as cited in Dorfman.

8 One is reminded of the story of General Marshall, who dozed off one day during a meeting of the Policy Planning Staff. When he awoke, his aides were still groping with “on the one hand … on the other hand, etc.” “Are you still wasting your time on the preliminaries?” asked the general. “Make a decision.” These problems do not yield to analysis, simple or complex; a decision must be made for which the statesman bears the burden of responsibility: “The buck stops here.”

9 The very name “Weapons System Evaluation Group” (the operational analysis organization for the Department of Defense) implies recognition of the qualitative or evaluative nature of conclusions regarding higher-level problems, but this seems to forestall perception of the more vital ends-efficiency dichotomy. Consider, for example, the following description: “We are likely to be deeply involved in the doctrine of the total system—and in the choice of the appropriate mixture of weapons in the total system. This is a field in which quantitative parameters are particularly difficult to find. Consequently more of our work is done in qualitative terms than is the case for other operations-analysis groups.” (George E. Pugh, “Operations Research for the Secretary of Defense and the Joint Chiefs of Staff,” in “A Decade of Military Operations Research,” 842.) In light of the pressure for quantification, it is interesting to note that the author later looks longingly toward the day when more of the work can be done in quantitative terms.

10 See, inter alia, Hitch, “Economics and Military Operations Research,” 200.

11 In this regard, take note of the words of the retiring president of ORSA, Martin L. Ernst: “… the plea for improved measures of effectiveness was sounded early in the Society's career. Unfortunately, there has been little response to this plea, and there are few indications that any progress whatsoever has been made in this most fundamental area of operations research. Basically, we still make claims to being one of the quantitative sciences, with employment of qualitative measures being reserved as a last resort. If we engage increasingly on major problems, we will in due course be faced with a choice of alternatives. Following one path, we may give up the emphasis on quantitative analysis and simply employ nonquantitative logic as the basis of our effort. To follow the other path, wherein we maintain our quantitative capabilities, we will be completely dependent on improving our abilities to measure fundamental characteristics of the objects of our studies. Quantitative science simply cannot exist without yardsticks, and herein perhaps lies our greatest weakness.” (“Operations Research and the Large Strategic Problems,” Operations Research, IX [July–August 1961], 443–44; italics added.)

12 “Operational research is badly needed in the United States State Department. The State Department is faced on the one hand by a population, industry, Congress, Senate, and Administration primarily engrossed with domestic affairs, and on the other by a set of world societies, many in the process of national revolution and all in a most vigorous sociological interaction. All aspects of this sociological interaction—political, economic, military, and cultural value system—are in a state of violent and revolutionary change. … The State Department is as yet unprepared for the modern technical aspects of diplomacy, either by education or by prior studies by operational research. This makes it very difficult for its traditionally trained executives to understand and always bear in mind the great twentieth-century interdependence between technology and diplomacy when political policy is being established or political decisions are being made….

“I believe that it is in the State Department and in politics that the greatest possible advances in operational research can be made in the future, and that here there can be a tremendous use of symbolic logic and computers to provide for all of the interrelations in a way that is presently beyond the comprehension of any single human being or of any group of diplomats of reasonable size.” (Ellis A. Johnson, “The Long-Range Future of Operational Research,” Operations Research, VIII [January-February 1960], 78; italics added. By contrast, one might consider the words of Solandt, quoted above.)

13 Wherever sufficient data are lacking, the attempt to give quantitative significance to a model is useless, at best. This need for sufficient data to ascertain the parameters explains why OR is most suitable for repetitive, technical operations. To take a simple example, the problem of diabetic control in principle should be susceptible to the methods of OR. In principle, diabetes is a relatively simple inventory problem. But OR offers no assistance to the practitioner, because in practice sufficient quantitative information is unobtainable without causing endless discomfort and loss of time to the patient. The result is that medical practitioners continue to operate, as they must, by “feel” and intuition.

14 In an excellent analysis of the national security problem with the suggestive title, “The National Security Dilemma: Challenge to Management Scientists” (Management Science, VIII [April 1961], 195–209), Marshall Wood argues that management scientists, through an inclusive systems approach, can help solve our security problem. The orderly analysis that follows, however, would have come into being even if management science had never existed.

15 R. G. D. Allen, Mathematical Analysis for Economists (4th edn., London 1949), 2. Is there any reason to believe that Lord Keynes' judgment is no longer appropriate? He wrote: “Too large a proportion of recent ‘mathematical’ economics are merely concoctions, as imprecise as the initial assumptions they rest on, which allow the author to lose sight of the complexities and interdependencies of the real world in a maze of pretentious and unhelpful symbols.” (The General Theory of Employment, Interest and Money [New York 1936], 298.)

16 Charles Hitch refers to these as “pretty empty principles” in “The Uses of Economics,” Research for Public Policy (Washington 1961), 99. This is perfectly true, in that they are devoid of quantitative guidance in expenditure decisions, but it is their “emptiness” (or formality) that makes them useful in evaluating goals.

17 Consider, for example, the implications of the following evaluation: “So far the integrity and competence of the major military operational-research groups has remained high. At times their work has been very unpopular in Washington, even within their own services, during this period of crucial decision and controversy—because integrity and honesty do not always provide the best ammunition to win Washington battles. As a result, the Services have in some cases sought with success more amenable operational-research organizations for certain studies, and in two cases have taken organizational and personnel actions to ensure better ‘cooperation’” (Johnson, 16). It is noteworthy that the Army recently severed its long-time association with ORO because of a long-standing dispute over control of research, and established the Research Analysis Corporation, whose findings would presumably be less unsatisfactory to the Army. Dr. Johnson, ORO's director, was quoted as saying that “the Army has wanted to run the research—and that we couldn't tolerate.” (See Missiles and Rockets, June 5, 1961, 11.)