Book contents
- Frontmatter
- Dedication
- Contents
- Acknowledgements
- Note to the Reader
- Interdependence of Chapters
- Introduction
- 1 Fundamental Functional Equations
- 2 Shannon Entropy
- 3 Relative Entropy
- 4 Deformations of Shannon Entropy
- 5 Means
- 6 Species Similarity and Magnitude
- 7 Value
- 8 Mutual Information and Metacommunities
- 9 Probabilistic Methods
- 10 Information Loss
- 11 Entropy Modulo a Prime
- 12 The Categorical Origins of Entropy
- Appendix A The Categorical Origins of Entropy
- Appendix B Summary of Conditions
- References
- Index of Notation
- Index
3 - Relative Entropy
Published online by Cambridge University Press: 21 April 2021
Summary
We give an introduction to the concept of relative entropy (also called Kullback-Leibler divergence). We interpret relative entropy in terms of both coding and diversity, and sketch some connections with other subjects: Riemannian geometry (where relative entropy is infinitesimally a squared distance), measure theory, and statistics. We prove that relative entropy is uniquely characterized by a short list of properties.
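As a concrete illustration of the definition summarized above, the following minimal Python sketch (not from the book) computes the relative entropy D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ) of one finite probability distribution from another, here using base-2 logarithms (an assumption; the choice of base only rescales the result):

```python
import math

def relative_entropy(p, q):
    """Relative entropy (Kullback-Leibler divergence) D(p || q) in bits.

    p, q: finite probability distributions, given as sequences of
    nonnegative numbers summing to 1. By convention, terms with p_i = 0
    contribute 0, and D(p || q) is infinite if p_i > 0 while q_i = 0.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue          # 0 * log(0/q) is taken to be 0
        if qi == 0:
            return math.inf   # p assigns mass where q assigns none
        total += pi * math.log2(pi / qi)
    return total

# D(p || p) = 0, and D is asymmetric in general:
p = [0.5, 0.5]
q = [0.9, 0.1]
```

Note that relative entropy is not a metric: it is asymmetric and fails the triangle inequality, which is consistent with the summary's remark that it behaves infinitesimally like a *squared* distance rather than a distance.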
Type: Chapter
Information: Entropy and Diversity: The Axiomatic Approach, pp. 62-90
Publisher: Cambridge University Press
Print publication year: 2021