Book contents
- Frontmatter
- Contents
- Foreword
- Introduction
- Acknowledgments
- 1 Probability basics
- 2 Probability distributions
- 3 Measuring information
- 4 Entropy
- 5 Mutual information and more entropies
- 6 Differential entropy
- 7 Algorithmic entropy and Kolmogorov complexity
- 8 Information coding
- 9 Optimal coding and compression
- 10 Integer, arithmetic, and adaptive coding
- 11 Error correction
- 12 Channel entropy
- 13 Channel capacity and coding theorem
- 14 Gaussian channel and Shannon–Hartley theorem
- 15 Reversible computation
- 16 Quantum bits and quantum gates
- 17 Quantum measurements
- 18 Qubit measurements, superdense coding, and quantum teleportation
- 19 Deutsch–Jozsa, quantum Fourier transform, and Grover quantum database search algorithms
- 20 Shor's factorization algorithm
- 21 Quantum information theory
- 22 Quantum data compression
- 23 Quantum channel noise and channel capacity
- 24 Quantum error correction
- 25 Classical and quantum cryptography
- Appendix A (Chapter 4) Boltzmann's entropy
- Appendix B (Chapter 4) Shannon's entropy
- Appendix C (Chapter 4) Maximum entropy of discrete sources
- Appendix D (Chapter 5) Markov chains and the second law of thermodynamics
- Appendix E (Chapter 6) From discrete to continuous entropy
- Appendix F (Chapter 8) Kraft–McMillan inequality
- Appendix G (Chapter 9) Overview of data compression standards
- Appendix H (Chapter 10) Arithmetic coding algorithm
- Appendix I (Chapter 10) Lempel–Ziv distinct parsing
- Appendix J (Chapter 11) Error-correction capability of linear block codes
- Appendix K (Chapter 13) Capacity of binary communication channels
- Appendix L (Chapter 13) Converse proof of the channel coding theorem
- Appendix M (Chapter 16) Bloch sphere representation of the qubit
- Appendix N (Chapter 16) Pauli matrices, rotations, and unitary operators
- Appendix O (Chapter 17) Heisenberg uncertainty principle
- Appendix P (Chapter 18) Two-qubit teleportation
- Appendix Q (Chapter 19) Quantum Fourier transform circuit
- Appendix R (Chapter 20) Properties of continued fraction expansion
- Appendix S (Chapter 20) Computation of inverse Fourier transform in the factorization of N = 21 through Shor's algorithm
- Appendix T (Chapter 20) Modular arithmetic and Euler's theorem
- Appendix U (Chapter 21) Klein's inequality
- Appendix V (Chapter 21) Schmidt decomposition of joint pure states
- Appendix W (Chapter 21) State purification
- Appendix X (Chapter 21) Holevo bound
- Appendix Y (Chapter 25) Polynomial byte representation and modular multiplication
- Index
21 - Quantum information theory
Published online by Cambridge University Press: 05 June 2012
Summary
This chapter lays the basis of quantum information theory (QIT). The central purpose of QIT is to quantify the transmission of either classical or quantum information over quantum channels. The starting point of the QIT description is the von Neumann entropy, S(ρ), the quantum counterpart of Shannon's classical entropy, H(X). This definition rests on that of the density operator (or density matrix) of a quantum system, ρ, which plays a role similar to that of the random-events source X in Shannon's theory. As we shall see, there also exists an elegant, one-to-one correspondence between the quantum and classical definitions of the entropy variants: relative entropy, joint entropy, conditional entropy, and mutual information. But the similarity is only apparent: a systematic analysis of the entropy's additivity rules rapidly shows that fundamental differences separate the two worlds. The quantum counterpart of the classical notion of information correlation between two event sources is referred to as quantum entanglement. We then define a quantum communication channel, which encodes and decodes classical information into or from quantum states. The analysis shows that the mutual information H(X;Y) between originator and recipient in such a channel cannot exceed a quantity χ, called the Holevo bound, which itself satisfies χ ≤ H(X), where H(X) is the entropy of the originator's classical information source.
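The von Neumann entropy mentioned above can be computed numerically from the eigenvalues of the density matrix, since S(ρ) = −Tr(ρ log₂ ρ) reduces to Shannon's formula applied to ρ's eigenvalue spectrum. A minimal sketch (the function name is illustrative, not from the book):

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log2 rho), evaluated via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)      # rho is Hermitian, so eigvalsh applies
    eigvals = eigvals[eigvals > 1e-12]     # drop zeros, using 0 log 0 = 0
    return float(-np.sum(eigvals * np.log2(eigvals)))

# A pure state |0><0| carries zero entropy:
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
# The maximally mixed single-qubit state I/2 carries the maximum, 1 bit:
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # ~0.0
print(von_neumann_entropy(mixed))  # ~1.0
```

This mirrors how H(X) is computed from a classical probability distribution; the eigenvalues of ρ play the role of the source probabilities.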
- Type: Chapter
- Information: Classical and Quantum Information Theory: An Introduction for the Telecom Scientist, pp. 431–456. Publisher: Cambridge University Press. Print publication year: 2009.