Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- 1 Inference and estimation in probabilistic time series models
- I Monte Carlo
- II Deterministic approximations
- III Switching models
- IV Multi-object models
- V Nonparametric models
- 14 Markov chain Monte Carlo algorithms for Gaussian processes
- 15 Nonparametric hidden Markov models
- 16 Bayesian Gaussian process models for multi-sensor time series prediction
- VI Agent-based models
- Index
- Plate section
- References
14 - Markov chain Monte Carlo algorithms for Gaussian processes
from V - Nonparametric models
Published online by Cambridge University Press: 07 September 2011
Summary
Introduction
Gaussian processes (GPs) have a long history in statistical physics and mathematical probability. Two of the most well-studied stochastic processes, Brownian motion [12, 47] and the Ornstein–Uhlenbeck process [43], are instances of GPs. In the context of regression and statistical learning, GPs have been used extensively in applications that arise in geostatistics and experimental design [26, 45, 7, 40]. More recently, in the machine learning literature, GPs have been considered as general estimation tools for solving problems such as non-linear regression and classification [29]. In the context of machine learning, GPs offer a flexible nonparametric Bayesian framework for estimating latent functions from data and they share similarities with neural networks [23] and kernel methods [35].
In standard GP regression, where the likelihood is Gaussian, the posterior over the latent function (given data and hyperparameters) is itself a GP that can be obtained analytically. In all other cases, where the likelihood is non-Gaussian, exact inference is intractable and approximate inference methods are needed. Deterministic approximation methods are currently widely used for inference in GP models [48, 16, 8, 29, 19, 34]. However, they are limited by the assumption that the likelihood function factorises. In addition, these methods usually treat the hyperparameters of the model (the parameters that appear in the likelihood and the kernel function) in a non-fully-Bayesian way, providing only point estimates.
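The analytic posterior in the Gaussian-likelihood case can be sketched as follows. This is a minimal illustration, not code from the chapter; the squared-exponential kernel, its hyperparameters, and the noise variance are all assumed choices made for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_posterior(X, y, X_star, noise_var=0.1):
    """Exact GP posterior mean and covariance under a Gaussian likelihood,
    computed via a Cholesky factorisation of the noisy train covariance."""
    K = rbf_kernel(X, X) + noise_var * np.eye(len(X))   # K(X,X) + sigma^2 I
    K_s = rbf_kernel(X, X_star)                         # K(X, X*)
    K_ss = rbf_kernel(X_star, X_star)                   # K(X*, X*)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_s.T @ alpha                                # posterior mean
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                                # posterior covariance
    return mean, cov

# Noisy observations of sin(x); predict at a single test input.
X = np.linspace(0.0, 2 * np.pi, 10)
y = np.sin(X) + 0.05 * np.random.default_rng(0).normal(size=10)
mean, cov = gp_posterior(X, y, np.array([np.pi / 2]))
```

When the Gaussian likelihood is replaced by a non-Gaussian one (e.g. for classification), the integral defining the posterior no longer has this closed form, which is what motivates the sampling and deterministic approximation schemes the chapter discusses.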
- Bayesian Time Series Models, pp. 295–316. Publisher: Cambridge University Press. Print publication year: 2011.