Book contents
- Frontmatter
- Contents
- 0 Introduction
- 1 What is Fisher information?
- 2 Fisher information in a vector world
- 3 Extreme physical information
- 4 Derivation of relativistic quantum mechanics
- 5 Classical electrodynamics
- 6 The Einstein field equation of general relativity
- 7 Classical statistical physics
- 8 Power spectral 1/f noise
- 9 Physical constants and the 1/x probability law
- 10 Constrained-likelihood quantum measurement theory
- 11 Research topics
- 12 EPI and entangled realities: the EPR–Bohm experiment
- 13 Econophysics, with Raymond J. Hawkins
- 14 Growth and transport processes
- 15 Cancer growth, with Robert A. Gatenby
- 16 Summing up
- Appendix A Solutions common to entropy and Fisher I-extremization
- Appendix B Cramer–Rao inequalities for vector data
- Appendix C Cramer–Rao inequality for an imaginary parameter
- Appendix D EPI derivations of Schrödinger wave equation, Newtonian mechanics, and classical virial theorem
- Appendix E Factorization of the Klein–Gordon information
- Appendix F Evaluation of certain integrals
- Appendix G Schrödinger wave equation as a non-relativistic limit
- Appendix H Non-uniqueness of potential A for finite boundaries
- Appendix I Four-dimensional normalization
- Appendix J Transfer matrix method
- Appendix K Numerov method
- References
- Index
1 - What is Fisher information?
Published online by Cambridge University Press: 03 February 2010
Summary
Knowledge of Fisher information is not part of the educational background of most scientists. Why should they bother to learn about this concept? Surely the (related) concept of entropy is sufficient to describe the degree of disorder of a given phenomenon. These important questions may be answered as follows.
(a) The point made about entropy is true, but does not go far enough. Why not seek a measure of disorder whose variation derives the phenomenon? The concept of entropy cannot do this, for reasons discussed in Sec. 1.3. Fisher information will turn out to be the appropriate measure of disorder for this purpose.
(b) Why should scientists bother to learn this concept? Aside from the partial answer in (a): (i) Fisher information is a simple and intuitive concept. As theories go, it is quite elementary. Understanding it requires no mathematics beyond differential equations. Not even prior knowledge of statistics is needed: it is easy enough to learn “on the fly”. The derivation of the defining property of Fisher information, in Sec. 1.2.3, is readily understood. (ii) The subject has very little specialized jargon or notation. The beginner does not need a glossary of terms and symbols to aid in its understanding. (iii) Most importantly, once understood, the concept gives strong payoff – one might call it “phenomen-all” – in scope of application. It is simply worth learning.
Fisher information has two basic roles to play in theory. First, it is a measure of the ability to estimate a parameter; this makes it a cornerstone of the statistical field of study called parameter estimation.
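Fisher information's role in parameter estimation can be sketched numerically. For the mean of a normal distribution N(θ, σ²) estimated from n samples, the Fisher information is I = n/σ², and the Cramér–Rao inequality bounds the variance of any unbiased estimator by 1/I; the sample mean attains this bound. The sketch below is illustrative only — the specific values of θ, σ, and n are assumptions, not taken from the text.

```python
import numpy as np

# Illustrative example (assumed values, not from the text):
# estimate the mean theta of N(theta, sigma^2) from n samples.
rng = np.random.default_rng(0)
theta, sigma, n = 5.0, 2.0, 100   # true parameter, noise level, sample size
trials = 20000                    # Monte Carlo repetitions

# Each trial: draw n samples and estimate theta by the sample mean.
estimates = rng.normal(theta, sigma, size=(trials, n)).mean(axis=1)
empirical_var = estimates.var()

# Fisher information of n i.i.d. normal samples about the mean,
# and the resulting Cramer-Rao lower bound on estimator variance.
fisher_I = n / sigma**2
cramer_rao_bound = 1.0 / fisher_I   # = sigma^2 / n = 0.04 here

print(f"empirical variance:  {empirical_var:.4f}")
print(f"Cramer-Rao bound:    {cramer_rao_bound:.4f}")
```

The empirical variance of the sample-mean estimator comes out very close to the bound 1/I, illustrating why Fisher information measures the "ability to estimate a parameter": larger I means a tighter bound, hence potentially more precise estimation.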
- Type: Chapter
- Book: Science from Fisher Information: A Unification, pp. 23–57
- Publisher: Cambridge University Press
- Print publication year: 2004