Part I - Coding and information
Summary
The word “information” has several meanings, the simplest of which was formalized for communication by Hartley [17] as the logarithm of the number of elements in a finite set. Hence the information in the set A = {a, b, c, d} is log 4 = 2 bits, using the base 2 logarithm. Only two types of logarithm are used in this book: the base 2 logarithm, which we write simply as log, and the natural logarithm, written as ln. Likewise, the amount of information in the set B = {a, b, c, d, e, f, g, h} is log 8 = 3 bits, and so on. Such a formalization has nothing to do with any other meaning of the information communicated, such as its utility or quality. If we were asked to describe one element, say c, we could do it by saying that it is the third element, which could be written as the binary number 11 in both sets, but the element f, the sixth in B, would require three bits, namely 110. So we see that if the number of elements in a set is not a power of 2, we need either the maximum number of bits, ⌈log |A|⌉, or one less, as will be explained in the next section. Hence we begin to see that “information,” relative to a set, can be formalized and measured by the shortest code length with which any element of the set can be described.
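A minimal sketch in Python of this counting argument, assuming the 1-based element positions used in the text; the helper names `hartley_information` and `index_code` are illustrative, not from the book:

```python
from math import ceil, log2

def hartley_information(set_size: int) -> float:
    """Hartley's measure: the base 2 logarithm of the set's size."""
    return log2(set_size)

def index_code(position: int) -> str:
    """Minimal binary representation of a 1-based element position."""
    return format(position, "b")

A = ["a", "b", "c", "d"]
B = ["a", "b", "c", "d", "e", "f", "g", "h"]

print(hartley_information(len(A)))   # 2.0 bits
print(hartley_information(len(B)))   # 3.0 bits

# c is the third element: binary 11 (two bits) in both sets.
print(index_code(A.index("c") + 1))  # '11'
# f is the sixth element of B: binary 110 (three bits).
print(index_code(B.index("f") + 1))  # '110'

# A fixed-length code for a set of size n needs ceil(log2(n)) bits.
print(ceil(log2(len(B))))            # 3
```

As the output shows, positions with small indices admit shorter binary descriptions than the fixed-length bound ⌈log |A|⌉, which is the observation the next section develops.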