Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Introduction
- Part I Inverse Problems
- 1 Bayesian Inverse Problems and Well-Posedness
- 2 The Linear-Gaussian Setting
- 3 Optimization Perspective
- 4 Gaussian Approximation
- 5 Monte Carlo Sampling and Importance Sampling
- 6 Markov Chain Monte Carlo
- Exercises for Part I
- Part II Data Assimilation
- 7 Filtering and Smoothing Problems and Well-Posedness
- 8 The Kalman Filter and Smoother
- 9 Optimization for Filtering and Smoothing: 3DVAR and 4DVAR
- 10 The Extended and Ensemble Kalman Filters
- 11 Particle Filter
- 12 Optimal Particle Filter
- Exercises for Part II
- Part III Kalman Inversion
- 13 Blending Inverse Problems and Data Assimilation
- References
- Index
6 - Markov Chain Monte Carlo
Published online by Cambridge University Press: 27 July 2023
Summary
In this chapter we study Markov chain Monte Carlo (MCMC), a methodology that delivers approximate samples from a given target distribution π. The methodology applies to settings in which π is the posterior distribution in (1.2), but it is also widely used in numerous applications beyond Bayesian inference. As with Monte Carlo and importance sampling, MCMC may be viewed as approximating the target distribution by a sum of Dirac masses, thus allowing the approximation of expectations with respect to the target. Implementation of Monte Carlo presupposes that independent samples from the target can be obtained. Importance sampling and MCMC bypass this restrictive assumption: importance sampling by appropriately weighting independent samples from a proposal distribution, and MCMC by drawing correlated samples from a Markov kernel that has the target as invariant distribution.
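To make the last idea concrete, the following is a minimal sketch (not from the book) of random-walk Metropolis, the simplest MCMC kernel that leaves a target π invariant. The Gaussian-mixture target, the function names, and the step size are illustrative assumptions; any density known only up to a normalizing constant can be substituted for `log_target`.

```python
import numpy as np

# Hypothetical target: unnormalized log-density of a 1D Gaussian mixture.
# Only the ratio pi(v)/pi(u) is ever needed, so the normalizing constant
# of pi can be unknown, as is typical for Bayesian posteriors.
def log_target(u):
    return np.logaddexp(-0.5 * (u - 2.0) ** 2, -0.5 * (u + 2.0) ** 2)

def random_walk_metropolis(log_pi, u0, n_steps, step=1.0, rng=None):
    """Draw correlated samples from a Markov kernel with pi invariant."""
    rng = np.random.default_rng() if rng is None else rng
    u = u0
    log_pi_u = log_pi(u)
    samples = np.empty(n_steps)
    for k in range(n_steps):
        v = u + step * rng.standard_normal()  # symmetric Gaussian proposal
        log_pi_v = log_pi(v)
        # Metropolis accept/reject: accept with probability min(1, pi(v)/pi(u))
        if np.log(rng.uniform()) < log_pi_v - log_pi_u:
            u, log_pi_u = v, log_pi_v
        samples[k] = u  # on rejection, the current state is repeated
    return samples

samples = random_walk_metropolis(log_target, u0=0.0, n_steps=10_000)
# Expectations under pi are approximated by averages over the correlated
# samples, i.e., the sum-of-Dirac-masses approximation described above.
print(samples.mean(), samples.var())
```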
- Type: Chapter
- Book: Inverse Problems and Data Assimilation, pp. 73–90
- Publisher: Cambridge University Press
- Print publication year: 2023