Information, Entropy and Inductive Logic
Published online by Cambridge University Press: 14 March 2022
Extract
It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = −KI has been termed “negentropy”, and it has been shown that the negentropy of information and the physical entropy S are mirror-like representations of the same train of events. In physical terminology, energy is degraded by an increase in entropy due to increased randomness in the positions or velocities of components, wave functions, or complexions in phase space; in informational terminology, some information about the same components has been lost, or the negentropy has been decreased. In equilibrium the system has, for a given energy content, maximum randomness (equivalently, minimum information is required to specify it). One consequence of this dual aspect was the idea of applying the methods of statistical mechanics to problems of communication, and Brillouin showed that Fermi-Dirac statistics, or generalized Fermi statistics, are applicable, for example, to the transmission of signals such as telegrams.
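The entropy–negentropy duality in the extract can be illustrated numerically. The following is a minimal sketch, not the article's own calculation: it uses Shannon's entropy measure over a discrete probability distribution, takes the uniform distribution as the "equilibrium" (maximum-randomness) case, and defines negentropy as the shortfall of a distribution's entropy from that maximum. The function names `entropy` and `negentropy`, and the choice of natural logarithms, are assumptions for illustration.

```python
import math

def entropy(p, K=1.0):
    # Shannon/Boltzmann-style entropy S = -K * sum(p_i * ln p_i);
    # K plays the role of the constant in the article's R = -K*I.
    return -K * sum(pi * math.log(pi) for pi in p if pi > 0)

def negentropy(p, K=1.0):
    # Negentropy as the entropy deficit relative to maximum randomness:
    # the uniform distribution over len(p) states maximizes S.
    s_max = K * math.log(len(p))
    return s_max - entropy(p, K)

# Equilibrium: maximum randomness, zero negentropy (no information held).
uniform = [0.25, 0.25, 0.25, 0.25]

# An "ordered" message: low entropy, hence positive negentropy.
peaked = [0.97, 0.01, 0.01, 0.01]
```

In this sketch `negentropy(uniform)` is zero, matching the extract's point that at equilibrium randomness is maximal and minimum information is required, while the peaked distribution carries positive negentropy.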
- Type: Research Article
- Copyright © Philosophy of Science Association 1954