
Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgements
- 1 Role of probability theory in science
- 2 Probability theory as extended logic
- 3 The how-to of Bayesian inference
- 4 Assigning probabilities
- 5 Frequentist statistical inference
- 6 What is a statistic?
- 7 Frequentist hypothesis testing
- 8 Maximum entropy probabilities
- 9 Bayesian inference with Gaussian errors
- 10 Linear model fitting (Gaussian errors)
- 11 Nonlinear model fitting
- 12 Markov chain Monte Carlo
- 13 Bayesian revolution in spectral analysis
- 14 Bayesian inference with Poisson sampling
- Appendix A Singular value decomposition
- Appendix B Discrete Fourier Transforms
- Appendix C Difference in two samples
- Appendix D Poisson ON/OFF details
- Appendix E Multivariate Gaussian from maximum entropy
- References
- Index
3 - The how-to of Bayesian inference
Published online by Cambridge University Press: 05 September 2012
Summary
Overview
The first part of this chapter gives a brief description of the methods and terminology employed in Bayesian inference and can be read as a stand-alone introduction to doing Bayesian analysis. Following a review of the basics in Section 3.2, we consider the two main inference problems: parameter estimation and model selection. This includes how to specify credible regions for parameters and how to eliminate nuisance parameters through marginalization. We also learn that Bayesian model comparison has a built-in “Occam's razor,” which automatically penalizes complicated models, assigning them high probabilities only if the complexity of the data justifies the additional complication. We see how this penalty arises through marginalization and depends on both the number of parameters and their prior ranges.
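To fix ideas, here is a minimal sketch of the quantities involved, in generic notation (the symbols below are illustrative and not necessarily those used in the chapter). Model comparison rests on the odds ratio between two models, and marginalization over the parameters \(\theta\) yields the global likelihood of each model:

\[
O_{12} = \frac{p(M_1 \mid D, I)}{p(M_2 \mid D, I)}
       = \frac{p(M_1 \mid I)}{p(M_2 \mid I)}\,
         \frac{p(D \mid M_1, I)}{p(D \mid M_2, I)},
\qquad
p(D \mid M_i, I) = \int p(\theta \mid M_i, I)\, p(D \mid \theta, M_i, I)\, d\theta .
\]

For a likelihood sharply peaked at \(\hat{\theta}\) with characteristic width \(\delta\theta\), and a uniform prior over a range \(\Delta\theta\), the integral is approximately

\[
p(D \mid M_i, I) \approx p(D \mid \hat{\theta}, M_i, I)\, \frac{\delta\theta}{\Delta\theta},
\]

where the factor \(\delta\theta/\Delta\theta \le 1\) is the Occam penalty: it shrinks as the prior range grows or as more parameters are marginalized over, which is how the razor described above emerges automatically.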
We illustrate these features with a detailed analysis of a toy spectral line problem, introducing the Jeffreys prior along the way and showing how different choices of prior affect our conclusions. We also examine a general argument for selecting priors for location and scale parameters in the early phases of an investigation, when our state of ignorance is greatest. The final section shows how Bayesian analysis provides valuable new insight into systematic errors and how to deal with them.
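As a further sketch, again in generic notation, the two ignorance priors mentioned above take simple normalized forms: a location parameter \(x\) confined to \([x_{\min}, x_{\max}]\) gets a uniform prior, while a scale parameter \(s\) gets the Jeffreys prior, which is uniform in \(\ln s\):

\[
p(x \mid I) = \frac{1}{x_{\max} - x_{\min}},
\qquad
p(s \mid I) = \frac{1}{s \,\ln(s_{\max}/s_{\min})}, \quad s_{\min} \le s \le s_{\max}.
\]

The Jeffreys form assigns equal probability to each decade of \(s\) and is unchanged by a rescaling of units, which is why it expresses ignorance about a scale rather than a location.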
I recommend that Sections 3.2 to 3.5 of this chapter be read twice: once quickly, and again after seeing these ideas applied in the detailed example treated in Sections 3.6 to 3.11.
- Type: Chapter
- Information: Bayesian Logical Data Analysis for the Physical Sciences: A Comparative Approach with Mathematica® Support, pp. 41–71
- Publisher: Cambridge University Press
- Print publication year: 2005