Book contents
- Frontmatter
- Contents
- Foreword
- Introduction
- Acknowledgments
- 1 Probability basics
- 2 Probability distributions
- 3 Measuring information
- 4 Entropy
- 5 Mutual information and more entropies
- 6 Differential entropy
- 7 Algorithmic entropy and Kolmogorov complexity
- 8 Information coding
- 9 Optimal coding and compression
- 10 Integer, arithmetic, and adaptive coding
- 11 Error correction
- 12 Channel entropy
- 13 Channel capacity and coding theorem
- 14 Gaussian channel and Shannon–Hartley theorem
- 15 Reversible computation
- 16 Quantum bits and quantum gates
- 17 Quantum measurements
- 18 Qubit measurements, superdense coding, and quantum teleportation
- 19 Deutsch–Jozsa, quantum Fourier transform, and Grover quantum database search algorithms
- 20 Shor's factorization algorithm
- 21 Quantum information theory
- 22 Quantum data compression
- 23 Quantum channel noise and channel capacity
- 24 Quantum error correction
- 25 Classical and quantum cryptography
- Appendix A (Chapter 4) Boltzmann's entropy
- Appendix B (Chapter 4) Shannon's entropy
- Appendix C (Chapter 4) Maximum entropy of discrete sources
- Appendix D (Chapter 5) Markov chains and the second law of thermodynamics
- Appendix E (Chapter 6) From discrete to continuous entropy
- Appendix F (Chapter 8) Kraft–McMillan inequality
- Appendix G (Chapter 9) Overview of data compression standards
- Appendix H (Chapter 10) Arithmetic coding algorithm
- Appendix I (Chapter 10) Lempel–Ziv distinct parsing
- Appendix J (Chapter 11) Error-correction capability of linear block codes
- Appendix K (Chapter 13) Capacity of binary communication channels
- Appendix L (Chapter 13) Converse proof of the channel coding theorem
- Appendix M (Chapter 16) Bloch sphere representation of the qubit
- Appendix N (Chapter 16) Pauli matrices, rotations, and unitary operators
- Appendix O (Chapter 17) Heisenberg uncertainty principle
- Appendix P (Chapter 18) Two-qubit teleportation
- Appendix Q (Chapter 19) Quantum Fourier transform circuit
- Appendix R (Chapter 20) Properties of continued fraction expansion
- Appendix S (Chapter 20) Computation of inverse Fourier transform in the factorization of N = 21 through Shor's algorithm
- Appendix T (Chapter 20) Modular arithmetic and Euler's theorem
- Appendix U (Chapter 21) Klein's inequality
- Appendix V (Chapter 21) Schmidt decomposition of joint pure states
- Appendix W (Chapter 21) State purification
- Appendix X (Chapter 21) Holevo bound
- Appendix Y (Chapter 25) Polynomial byte representation and modular multiplication
- Index
- References
4 - Entropy
Published online by Cambridge University Press: 05 June 2012
Summary
The concept of entropy is central to information theory (IT). The name, of Greek origin (entropia, from tropos), means a turning point or transformation. It was coined in 1864 by the physicist R. Clausius, who postulated the second law of thermodynamics. Among other implications, this law establishes the impossibility of perpetual motion, and also that the entropy of a thermally isolated system (such as our Universe) can only increase. Because of its universal implications and its conceptual subtlety, the word entropy has always been shrouded in some mystery, even today, for large and educated audiences.
The subsequent works of L. Boltzmann, which laid the grounds of statistical mechanics, made it possible to clarify the definition of entropy as a natural measure of disorder. The precursors and founders of the later information theory (L. Szilárd, H. Nyquist, R. Hartley, J. von Neumann, C. Shannon, E. Jaynes, and L. Brillouin) drew many parallels between the measure of information (the uncertainty in communication-source messages) and physical entropy (the disorder or chaos within material systems). Comparing information with disorder is not at all intuitive, because information (as we conceive it) is pretty much the conceptual opposite of disorder! Even more striking is the fact that the respective formulations of entropy successively arrived at in physics and IT happen to match exactly. Legend has it that Shannon chose the word “entropy” on the advice of his colleague von Neumann: “Call it entropy. It is already in use under that name in statistical mechanics and, more importantly, no one really knows what entropy is, so in a debate you will always have the advantage.”
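For concreteness, the two formulations the summary alludes to (developed in Appendices A and B) are Boltzmann's entropy in its Gibbs form and Shannon's entropy; the notation below is the standard textbook one, which may differ slightly from the chapter's own:

$$
S \;=\; -k_{\mathrm{B}} \sum_{i} p_i \ln p_i,
\qquad
H(X) \;=\; -\sum_{i} p_i \log_2 p_i .
$$

The two expressions coincide up to the multiplicative constant $k_{\mathrm{B}}$ and the choice of logarithm base; for $W$ equiprobable microstates ($p_i = 1/W$), the first reduces to Boltzmann's celebrated $S = k_{\mathrm{B}} \ln W$.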
- Type: Chapter
- Book: Classical and Quantum Information Theory: An Introduction for the Telecom Scientist, pp. 50–68
- Publisher: Cambridge University Press
- Print publication year: 2009