
Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits

Published online by Cambridge University Press:  21 June 2016

Gareth O. Roberts*
Affiliation:
University of Warwick
Jeffrey S. Rosenthal*
Affiliation:
University of Toronto
* Postal address: Department of Statistics, University of Warwick, Coventry CV4 7AL, UK. Email address: [email protected]
** Postal address: Department of Statistics, University of Toronto, Toronto, Ontario, M5S 3G3, Canada. Email address: [email protected]

Abstract

We connect known results about diffusion limits of Markov chain Monte Carlo (MCMC) algorithms to the computer science notion of algorithm complexity. Our main result states that any weak limit of a Markov process implies a corresponding complexity bound (in an appropriate metric). We then combine this result with previously known MCMC diffusion limit results to prove that, under appropriate assumptions, the random-walk Metropolis algorithm in d dimensions takes O(d) iterations to converge to stationarity, while the Metropolis-adjusted Langevin algorithm takes O(d^{1/3}) iterations to converge to stationarity.
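For readers unfamiliar with the first algorithm discussed, the following is a minimal illustrative sketch (not the paper's construction) of the random-walk Metropolis algorithm in d dimensions, run on a standard Gaussian target. The proposal scale 2.38/sqrt(d) and the target density are illustrative choices; the classical optimal-scaling literature cited in the abstract motivates a scale of this order, giving an asymptotic acceptance rate near 0.234.

```python
import math
import random

def random_walk_metropolis(log_density, x0, n_iters, scale, seed=0):
    """Random-walk Metropolis sketch: propose y = x + scale * N(0, I_d),
    accept with probability min(1, pi(y)/pi(x))."""
    rng = random.Random(seed)
    x = list(x0)
    logp = log_density(x)
    accepts = 0
    for _ in range(n_iters):
        y = [xi + scale * rng.gauss(0.0, 1.0) for xi in x]
        logq = log_density(y)
        # Accept with probability min(1, exp(logq - logp)).
        if logq >= logp or rng.random() < math.exp(logq - logp):
            x, logp = y, logq
            accepts += 1
    return x, accepts / n_iters

# Illustrative target: standard normal in d = 20 dimensions.
# Proposal scale 2.38/sqrt(d) is the classical optimal-scaling choice.
d = 20
log_density = lambda x: -0.5 * sum(xi * xi for xi in x)
state, acc_rate = random_walk_metropolis(
    log_density, [0.0] * d, n_iters=5000, scale=2.38 / math.sqrt(d))
```

Under this scaling, the chain's acceptance rate settles near the well-known 0.234 optimum as d grows, and the O(d) complexity bound in the abstract refers to the number of such iterations needed to approach stationarity.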

Type
Research Papers
Copyright
Copyright © Applied Probability Trust 2016 

