Book contents
- Frontmatter
- Dedication
- Contents
- Acknowledgements
- Note to the Reader
- Interdependence of Chapters
- Introduction
- 1 Fundamental Functional Equations
- 2 Shannon Entropy
- 3 Relative Entropy
- 4 Deformations of Shannon Entropy
- 5 Means
- 6 Species Similarity and Magnitude
- 7 Value
- 8 Mutual Information and Metacommunities
- 9 Probabilistic Methods
- 10 Information Loss
- 11 Entropy Modulo a Prime
- 12 The Categorical Origins of Entropy
- Appendix A The Categorical Origins of Entropy
- Appendix B Summary of Conditions
- References
- Index of Notation
- Index
10 - Information Loss
Published online by Cambridge University Press: 21 April 2021
Summary
Any deterministic process loses information, and the amount lost can be quantified. Information loss generalizes entropy and is in some ways a better-behaved quantity, being more functorial. We give a simple axiomatic characterization of information loss.
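The idea in the summary can be made concrete. For a deterministic map applied to a random input, the standard definition (as in the characterization the chapter refers to) takes the information loss to be the Shannon entropy of the input distribution minus the Shannon entropy of the induced output distribution. The following sketch assumes that definition; the function and variable names are illustrative, not from the book.

```python
from math import log2

def shannon_entropy(p):
    """Shannon entropy (in bits) of a finite probability distribution."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def pushforward(p, f, n_out):
    """Distribution on {0, ..., n_out-1} induced by the deterministic map f."""
    q = [0.0] * n_out
    for i, pi in enumerate(p):
        q[f(i)] += pi
    return q

def information_loss(p, f, n_out):
    """Information lost by applying f to a random input drawn from p:
    entropy of the input minus entropy of the output (always >= 0)."""
    q = pushforward(p, f, n_out)
    return shannon_entropy(p) - shannon_entropy(q)

# Example: four equally likely outcomes, with outcomes 2 and 3 merged.
p = [0.25, 0.25, 0.25, 0.25]          # H(p) = 2 bits
f = lambda i: min(i, 2)               # identifies outcomes 2 and 3
loss = information_loss(p, f, 3)      # H(p) - H(q) = 2 - 1.5 = 0.5 bits
```

Note that a bijection loses no information, while a constant map loses all of it (the full entropy of the input). Functoriality shows up as additivity: the loss of a composite map g∘f is the loss of f plus the loss of g applied to the pushed-forward distribution.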
Type: Chapter
Information: Entropy and Diversity: The Axiomatic Approach, pp. 329–342
Publisher: Cambridge University Press
Print publication year: 2021