Published online by Cambridge University Press: 01 April 2022
This paper contrasts two information-theoretic approaches to statistical explanation: (1) an analysis, originating in my earlier research on problems of testing stochastic models of learning, based on an entropy-like measure of expected transmitted information (referred to here as the Expected-Information, or E-I, Model), and (2) the analysis proposed by James Greeno (and closely related to Wesley Salmon's Statistical Relevance Model), based on the information transmitted by a system. The substantial differences between these analyses can be traced to the following basic difference. On Greeno's view, the essence of explanation lies in the relevance relations expressed by the conditional probabilities that relate the explanans variables to the explanandum variables; on my view, in contrast, the essence of explanation lies in theories viewed as hypothetical structures which deductively entail conditional probability distributions linking the explanans variables to the explanandum variables. The explanatory power of a stochastic theory is identified with the information (regarding the values of the explanandum variables) that is “absorbed from” the explanans variables, whereas information “absorbed from” the explanandum variables themselves (through the process of parameter estimation, for example) reflects the descriptive power of the theory. I prove that Greeno's measure of transmitted information is a limiting special case of the E-I Model, but that the former, unlike the latter, makes no distinction between explanatory power and descriptive power.
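Neither model is defined formally in this abstract, but both build on the standard Shannon notion of transmitted (mutual) information between an explanans variable and an explanandum variable. The following sketch, with an illustrative function name and made-up joint distribution (neither is drawn from the paper), shows only that underlying quantity, computed in bits from a joint probability table:

```python
import math

def mutual_information(joint):
    """Transmitted information I(X;Y) in bits, where joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]          # marginal of the explanans variable X
    py = [sum(col) for col in zip(*joint)]    # marginal of the explanandum variable Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:  # terms with zero probability contribute nothing
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# Hypothetical example: a symmetric noisy coupling with P(Y = X) = 0.8.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(round(mutual_information(joint), 4))  # ≈ 0.2781 bits
```

When the variables are independent the measure is zero, and it grows as the conditional probabilities linking explanans to explanandum become more relevant, which is the intuition both analyses start from.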
I wish again to express my indebtedness to my teacher, Ernest Adams, for a conversation in the Spring of 1964 during which he suggested the possibility of using transmitted information as a measure of predictive success for stochastic theories. His suggestions were elaborated and extended as part of my Ph.D. dissertation [10], directed by Adams and submitted to the faculty of the program in Logic and Methodology of Science at the University of California, Berkeley. Further applications of transmitted information as a measure of predictive, descriptive, and explanatory power were reported in [6], [7], [9], and [11].