
On Inductive Logic

Published online by Cambridge University Press:  14 March 2022

Rudolf Carnap
Affiliation: University of Chicago, Chicago, Ill.

Extract

Among the various meanings in which the word ‘probability’ is used in everyday language, in the discussion of scientists, and in the theories of probability, there are especially two which must be clearly distinguished. We shall use for them the terms ‘probability1’ and ‘probability2’. Probability1 is a logical concept, a certain logical relation between two sentences (or, alternatively, between two propositions); it is the same as the concept of degree of confirmation. I shall write briefly “c” for “degree of confirmation,” and “c(h, e)” for “the degree of confirmation of the hypothesis h on the evidence e”; the evidence is usually a report on the results of our observations. On the other hand, probability2 is an empirical concept; it is the relative frequency in the long run of one property with respect to another. The controversy between the so-called logical conception of probability, as represented e.g. by Keynes, Jeffreys, and others, and the frequency conception, maintained e.g. by v. Mises and Reichenbach, seems to me futile. These two theories deal with two different probability concepts which are both of great importance for science. Therefore, the theories are not incompatible, but rather supplement each other.
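In symbols (a rough paraphrase for orientation, not a formula of the paper; the limit is the usual frequency-theory reading of "relative frequency in the long run"):

\[
\mathrm{probability}_1(h, e) \;=\; c(h, e), \qquad \text{a logical relation fixed by the two sentences } h \text{ and } e \text{ alone;}
\]
\[
\mathrm{probability}_2(M', M) \;=\; \lim_{n \to \infty} \frac{\text{number of } M' \text{ among the first } n \text{ instances of } M}{n}.
\]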

Copyright © The Philosophy of Science Association 1945

Notes

1 J. M. Keynes, A Treatise on Probability, 1921.

2 H. Jeffreys, Theory of Probability, 1939.

3 R. v. Mises, Probability, Statistics, and Truth (orig. 1928), 1939.

4 H. Reichenbach, Wahrscheinlichkeitslehre, 1935.

5 The distinction briefly indicated here is discussed in more detail in my paper “The Two Concepts of Probability,” which will appear in Philos. and Phenom. Research, 1945.

6 In an article by C. G. Hempel and Paul Oppenheim in the present issue of this journal, a new concept of degree of confirmation is proposed, which was developed by the two authors and Olaf Helmer in research independent of my own.

7 For more detailed explanations of some of these concepts see my Introduction to Semantics, 1942.

8 See F. Waismann, “Logische Analyse des Wahrscheinlichkeitsbegriffs,” Erkenntnis, vol. 1, 1930, pp. 228–248.

9 See Waismann, op. cit., p. 242.

10 St. Mazurkiewicz, “Zur Axiomatik der Wahrscheinlichkeitsrechnung,” C. R. Soc. Science Varsovie, Cl. III, vol. 25, 1932, pp. 1–4.

11 Janina Hosiasson-Lindenbaum, “On Confirmation,” Journal Symbolic Logic, vol. 5, 1940, pp. 133–148.

12 G. H. von Wright, The Logical Problem of Induction (Acta Phil. Fennica, 1941, Fasc. III). See also C. D. Broad, Mind, vol. 53, 1944.

13 It seems that Wittgenstein meant this function cw in his definition of probability, which he indicates briefly without examining its consequences. In his Tractatus Logico-Philosophicus, he says: “A proposition is the expression of agreement and disagreement with the truth-possibilities of the elementary [i.e. atomic] propositions” (*4.4); “The world is completely described by the specification of all elementary propositions plus the specification, which of them are true and which false” (*4.26). The truth-possibilities specified in this way correspond to our state-descriptions. Those truth-possibilities which verify a given proposition (in our terminology, those state-descriptions in which a given sentence holds) are called the truth-grounds of that proposition (*5.101). “If Tr is the number of the truth-grounds of the proposition ‘r’, Trs the number of those truth-grounds of the proposition ‘s’ which are at the same time truth-grounds of ‘r’, then we call the ratio Trs : Tr the measure of the probability which the proposition ‘r’ gives to the proposition ‘s’” (*5.15). It seems that the concept of probability thus defined coincides with the function cw.
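As a small illustration of the ratio Trs : Tr described here, the sketch below enumerates the truth-possibilities of two atomic propositions by brute force and computes the measure in question; the atomic propositions and the example sentences are illustrative choices, not anything from the text.

```python
# Brute-force sketch of the ratio Trs : Tr described in note 13.
# The atomic propositions and the example sentences are illustrative only.

from itertools import product

ATOMS = ("p", "q")  # atomic propositions

def truth_possibilities():
    """All assignments of truth-values to the atomic propositions."""
    for values in product((True, False), repeat=len(ATOMS)):
        yield dict(zip(ATOMS, values))

def measure(r, s):
    """Trs / Tr: Tr counts the truth-grounds of r, Trs those truth-grounds of s
    that are at the same time truth-grounds of r."""
    tr = [tp for tp in truth_possibilities() if r(tp)]
    trs = [tp for tp in tr if s(tp)]
    return len(trs) / len(tr)

# Example: r says 'p or q', s says 'p'.
r = lambda tp: tp["p"] or tp["q"]
s = lambda tp: tp["p"]
print(measure(r, s))  # 2/3: two of the three truth-grounds of 'p or q' also make 'p' true
```

For r = ‘p or q’ and s = ‘p’, two of the three truth-grounds of r are also truth-grounds of s, so the measure comes out 2/3.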

14 The results are as follows.

Therefore (according to (5) in §6):

The general theorem is as follows:

17 The general theorem is as follows:

18 Another theorem may be mentioned which deals with the case where, in distinction to the case just discussed, the evidence already gives some information about the individual c mentioned in h. Let M1 be a factual elementary property with the width w1 (w1 ≥ 2); thus M1 is a disjunction of w1 Q-properties. Let M2 be the disjunction of w2 among those w1 Q-properties (1 ≤ w2 < w1); hence M2 L-implies M1 and has the width w2. e specifies first how the s individuals of an observed sample are distributed among certain properties, and, in particular, it says that s1 of them have the property M1 and s2 of these s1 individuals have also the property M2; in addition, e says that c is M1; and h says that c is also M2. Then,

This is analogous to (1); but in the place of the whole sample we have here that part of it which shows the property M1.
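Given the stated analogy with (1) — which gives the singular predictive inference the value (s1 + w1)/(s + k), with k the number of Q-properties — the theorem presumably takes the relative-width form

\[
c^{*}(h, e) \;=\; \frac{s_2 + w_2}{s_1 + w_1},
\]

in which the M1-part of the sample plays the role of the whole sample (a reconstruction from the surrounding prose, not a quotation).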

This theorem shows that the ratio of the increase of c* is greater than 1, since w1 > w2.

20 Janina Lindenbaum-Hosiasson, “Induction et analogie: Comparaison de leur fondement,” Mind, vol. 50, 1941, pp. 351–365; see especially pp. 361–365.

21 The general theorem is as follows:

Other theorems, which cannot be stated here, concern the case where more than two properties are involved, or give approximations for the frequent case where the whole population is very large in relation to the sample.

22 The general theorem is as follows:

In the special case of a language containing 'M1' as the only primitive predicate, we have w1 = 1 and k = 2. The value for this special case is given by some authors as holding generally (see Jeffreys, op. cit., p. 106 (16)). However, it seems plausible that the degree of confirmation must be smaller for a stronger law and hence depend upon w1.

If s, and hence N, too, is very large in relation to k, the following holds as an approximation:

For the infinite language L∞ we obtain, according to definition (7) in §3:
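A reconstruction from the surrounding prose and the cited passage of Jeffreys (not a quotation of the theorem itself): in the special case w1 = 1, k = 2 the value referred to is presumably the familiar

\[
c^{*}(l, e) \;=\; \frac{s+1}{N+1},
\]

which is the value Jeffreys states as holding generally, while for the infinite language the degree of confirmation of the law comes out 0, the limiting result to which the instance confirmation of the following notes responds.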

23 The theorem is as follows:

24 In technical terms, the definition is as follows: the instance confirmation of l on the evidence e is c*(h, e), where h is an instance of l formed by the substitution of an individual constant not occurring in e.

25 The technical definition will be given here. Let l be 'for every x, if x is M, then x is M′'. Let e be non-L-false and without variables. Let 'c' be any individual constant not occurring in e; let j say that c is M, and h′ that c is M′. Then the qualified-instance confirmation of l with respect to 'M' and 'M′' on the evidence e is defined as c*(h′, e·j).

26 Some of the theorems may here be given. Let the law l say, as above, that all M are M′. Let 'M1' be defined, as earlier, by 'M·∼M′' (“non-white swan”) and 'M2' by 'M·M′' (“white swan”). Let the widths of M1 and M2 be w1 and w2 respectively. Let e be a report about s observed individuals saying that s1 of them are M1 and s2 are M2, while the remaining ones are ∼M and hence neither M1 nor M2. Then the following holds:

The values of the instance confirmation and of the qualified-instance confirmation of l, for the case that the observed sample does not contain any individuals violating the law l, can easily be obtained from the values stated in (1) and (2) by taking s1 = 0.
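A small computational sketch may make notes 24–26 concrete. It assumes the definitions given in the body of the paper — m* assigns every structure-description equal measure, divided equally among its state-descriptions, and c*(h, e) = m*(e·h)/m*(e) — and evaluates, for a toy language of four individuals and the two illustrative predicates 'swan' and 'white', the confirmation of the law, its instance confirmation, and its qualified-instance confirmation by brute-force enumeration. The domain size, predicates, and sentences are illustrative choices, not Carnap's.

```python
# Brute-force sketch of c* for a tiny monadic language (an illustration only).
# Assumed definitions: m* gives each structure-description equal measure,
# divided equally among its state-descriptions; c*(h, e) = m*(e.h) / m*(e).

from collections import defaultdict
from itertools import product

INDIVIDUALS = range(4)                  # a small finite domain
PREDICATES = ("swan", "white")          # primitive one-place predicates
Q_PROPERTIES = list(product((True, False), repeat=len(PREDICATES)))  # k = 4

# A state-description assigns one Q-property to every individual.
STATE_DESCRIPTIONS = list(product(Q_PROPERTIES, repeat=len(INDIVIDUALS)))

def structure(sd):
    """The structure-description: which Q-properties occur, and how often."""
    return tuple(sorted(sd))

# m*-weight of each state-description.
groups = defaultdict(list)
for sd in STATE_DESCRIPTIONS:
    groups[structure(sd)].append(sd)
WEIGHT = {sd: 1.0 / (len(groups) * len(members))
          for members in groups.values() for sd in members}

def m_star(sentence):
    """Measure of a sentence, given as a truth-condition on state-descriptions."""
    return sum(WEIGHT[sd] for sd in STATE_DESCRIPTIONS if sentence(sd))

def c_star(h, e):
    """Degree of confirmation of h on the evidence e."""
    return m_star(lambda sd: e(sd) and h(sd)) / m_star(e)

def has(sd, i, pred):
    return sd[i][PREDICATES.index(pred)]

# Evidence e: individuals 0 and 1 are white swans (nothing is said about 2 and 3).
e = lambda sd: all(has(sd, i, "swan") and has(sd, i, "white") for i in (0, 1))
# Law l (finite-domain version): every individual is white if it is a swan.
l = lambda sd: all(not has(sd, i, "swan") or has(sd, i, "white") for i in INDIVIDUALS)
# Instance confirmation of l: the instance for the new individual 3.
h_inst = lambda sd: not has(sd, 3, "swan") or has(sd, 3, "white")
# Qualified-instance confirmation: given also that 3 is a swan (j), is it white (h')?
j = lambda sd: has(sd, 3, "swan")
h_prime = lambda sd: has(sd, 3, "white")

print("c*(l, e)                      =", c_star(l, e))
print("instance confirmation of l    =", c_star(h_inst, e))
print("qualified-instance conf. of l =", c_star(h_prime, lambda sd: e(sd) and j(sd)))
```

Since the law L-implies each of its instances, the instance values can never be smaller than c*(l, e); the contrast becomes extreme in the infinite language, where the confirmation of the law itself vanishes while the instance confirmation remains substantial.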

27 E. Nagel, Principles of the Theory of Probability. Int. Encycl. of Unified Science, vol. I, No. 6, 1939; see pp. 68–71.

28 Hans Reichenbach, Experience and Prediction, 1938, §§38 ff., and earlier publications.