
On an entropy conservation principle

Published online by Cambridge University Press:  14 July 2016

Jérôme Manuceau, Marylène Troupé and Jean Vaillant
Affiliation: University of Antilles-Guyane
Postal address: UFR Sciences, Department of Mathematics, 97169 Pointe-à-Pitre, Guadeloupe, FWI.

Abstract

We present an entropy conservation principle, applicable to both discrete and continuous variables, which provides a useful tool for aggregating observations. The associated method of modality grouping transforms a variable Z1 into a new variable Z2 such that the mutual information I(Z2, Y) between Z2 and a variable of interest Y equals I(Z1, Y).
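A minimal numerical sketch of the idea (not the authors' construction): one sufficient condition for grouping modalities of Z1 without information loss is that the merged modalities share the same conditional distribution P(Y | Z1 = z), since the mutual information I(Z, Y) then has identical per-modality contributions before and after the merge. The joint table and the `mutual_information` helper below are illustrative assumptions.

```python
import numpy as np

def mutual_information(pzy):
    """I(Z;Y) in nats, computed from a joint probability table pzy[z, y]."""
    pz = pzy.sum(axis=1, keepdims=True)   # marginal P(Z)
    py = pzy.sum(axis=0, keepdims=True)   # marginal P(Y)
    mask = pzy > 0                        # skip zero cells (0 log 0 = 0)
    return float((pzy[mask] * np.log(pzy[mask] / (pz @ py)[mask])).sum())

# Joint distribution of (Z1, Y): rows are modalities of Z1.
# Rows 0 and 1 have the same conditional P(Y | Z1 = z) = (0.5, 0.5),
# so merging them should leave the mutual information unchanged.
p1 = np.array([[0.10, 0.10],
               [0.20, 0.20],
               [0.30, 0.10]])

# Z2: merge modalities 0 and 1 of Z1 into a single modality.
p2 = np.vstack([p1[0] + p1[1], p1[2]])

i1 = mutual_information(p1)
i2 = mutual_information(p2)
assert abs(i1 - i2) < 1e-12   # I(Z2, Y) = I(Z1, Y)
```

By contrast, merging two modalities with different conditionals (e.g. rows 1 and 2 above) would strictly decrease the mutual information, so the grouping in the paper is a genuine conservation statement rather than a triviality.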

Type: Short Communications
Copyright: © Applied Probability Trust 1999

