Book contents
- Frontmatter
- Contents
- Foreword
- Introduction
- Acknowledgments
- 1 Probability basics
- 2 Probability distributions
- 3 Measuring information
- 4 Entropy
- 5 Mutual information and more entropies
- 6 Differential entropy
- 7 Algorithmic entropy and Kolmogorov complexity
- 8 Information coding
- 9 Optimal coding and compression
- 10 Integer, arithmetic, and adaptive coding
- 11 Error correction
- 12 Channel entropy
- 13 Channel capacity and coding theorem
- 14 Gaussian channel and Shannon–Hartley theorem
- 15 Reversible computation
- 16 Quantum bits and quantum gates
- 17 Quantum measurements
- 18 Qubit measurements, superdense coding, and quantum teleportation
- 19 Deutsch–Jozsa, quantum Fourier transform, and Grover quantum database search algorithms
- 20 Shor's factorization algorithm
- 21 Quantum information theory
- 22 Quantum data compression
- 23 Quantum channel noise and channel capacity
- 24 Quantum error correction
- 25 Classical and quantum cryptography
- Appendix A (Chapter 4) Boltzmann's entropy
- Appendix B (Chapter 4) Shannon's entropy
- Appendix C (Chapter 4) Maximum entropy of discrete sources
- Appendix D (Chapter 5) Markov chains and the second law of thermodynamics
- Appendix E (Chapter 6) From discrete to continuous entropy
- Appendix F (Chapter 8) Kraft–McMillan inequality
- Appendix G (Chapter 9) Overview of data compression standards
- Appendix H (Chapter 10) Arithmetic coding algorithm
- Appendix I (Chapter 10) Lempel–Ziv distinct parsing
- Appendix J (Chapter 11) Error-correction capability of linear block codes
- Appendix K (Chapter 13) Capacity of binary communication channels
- Appendix L (Chapter 13) Converse proof of the channel coding theorem
- Appendix M (Chapter 16) Bloch sphere representation of the qubit
- Appendix N (Chapter 16) Pauli matrices, rotations, and unitary operators
- Appendix O (Chapter 17) Heisenberg uncertainty principle
- Appendix P (Chapter 18) Two-qubit teleportation
- Appendix Q (Chapter 19) Quantum Fourier transform circuit
- Appendix R (Chapter 20) Properties of continued fraction expansion
- Appendix S (Chapter 20) Computation of inverse Fourier transform in the factorization of N = 21 through Shor's algorithm
- Appendix T (Chapter 20) Modular arithmetic and Euler's theorem
- Appendix U (Chapter 21) Klein's inequality
- Appendix V (Chapter 21) Schmidt decomposition of joint pure states
- Appendix W (Chapter 21) State purification
- Appendix X (Chapter 21) Holevo bound
- Appendix Y (Chapter 25) Polynomial byte representation and modular multiplication
- Index
- References
7 - Algorithmic entropy and Kolmogorov complexity
Published online by Cambridge University Press: 05 June 2012
Summary
This chapter takes us into a world very different from everything we have seen so far concerning Shannon's information theory. As we shall see, it is a strange world of virtual computers (universal Turing machines) and abstract principles that can be established almost without mathematics, by the force of logic alone, alongside some relatively involved formalism. If the mere mention of Shannon, information theory, or entropy may raise eyebrows in one's professional circle, how much more so that of Kolmogorov complexity! This chapter removes some of the mystery surrounding "complexity," also called "algorithmic entropy," without pretending to uncover it all. Why address such a subject here, in the middle of our description of Shannon's information theory? Because, as we shall see, algorithmic entropy and Shannon entropy meet conceptually at some point, to the extent of being asymptotically bounded by one another, even though they start from entirely uncorrelated basic assumptions! This remarkable convergence between fields deserves to be an integral part of our IT culture, even if this chapter provides only a flavor of it. The chapter may be perceived as somewhat more difficult or demanding than the preceding ones, but the extra investment, we believe, is well worth it. In any case, it can be revisited later, should the reader prefer to stay focused on Shannon's theory and move directly to the next stage without venturing into the intriguing sidetracks of algorithmic information theory.
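The connection the summary alludes to can be made tangible with a small experiment. Kolmogorov complexity itself is uncomputable, but the length of the output of any lossless compressor gives a computable upper bound on it (up to an additive constant). The sketch below, an illustration of this idea rather than anything from the chapter itself, uses Python's standard `zlib` to compare a highly regular string with one drawn from a uniform random source:

```python
import random
import zlib


def compressed_length(s: bytes) -> int:
    """Length of the zlib-compressed string: a crude, computable
    upper bound on the Kolmogorov complexity K(s), up to an
    additive constant that depends on the compressor."""
    return len(zlib.compress(s, 9))


# A highly regular string: it has a short description
# ("the pair 'ab' repeated 5000 times"), so its algorithmic
# entropy is small and it should compress dramatically.
regular = b"ab" * 5000

# A string of uniformly random bytes: with overwhelming probability
# it admits no description much shorter than itself, so it is
# nearly incompressible (Shannon entropy ~ 8 bits per byte).
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(10000))

print(compressed_length(regular))  # far below 10000 bytes
print(compressed_length(noisy))    # close to 10000 bytes
```

The per-symbol compressed length of a long sequence drawn from a stationary source approaches the source's Shannon entropy rate, which is the asymptotic meeting point between the two notions of entropy that the chapter develops.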
- Type: Chapter
- Information: Classical and Quantum Information Theory: An Introduction for the Telecom Scientist, pp. 96–126. Publisher: Cambridge University Press. Print publication year: 2009.