
A note on acceptance rate criteria for CLTS for Metropolis–Hastings algorithms

Published online by Cambridge University Press:  14 July 2016

G. O. Roberts
Affiliation: Cambridge University
Postal address: Department of Mathematics and Statistics, Lancaster University, Lancaster LA1 4YF, UK. Email address: [email protected].

Abstract

This paper considers positive recurrent Markov chains in which the probability of remaining in the current state can be arbitrarily close to 1. Specifically, conditions are given which ensure the non-existence of central limit theorems for ergodic averages of functionals of the chain. The results are motivated by applications to Metropolis–Hastings algorithms, which are constructed in terms of a rejection probability (a rejection meaning that the chain remains at its current state). Two examples involving commonly used algorithms are given: the independence sampler and the Metropolis-adjusted Langevin algorithm. Although the examples are rather specialized, in both cases the problems that arise are typical of those commonly encountered with the particular algorithm being used.
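The rejection mechanism at issue can be seen in a minimal sketch of the independence sampler, one of the two algorithms the abstract discusses. This is a generic illustration, not code from the paper: the function names (`log_target`, `log_proposal`, `sample_proposal`) and the Gaussian example below are assumptions chosen for the sketch. A rejection leaves the chain at its current state, and when the rejection probability can be arbitrarily close to 1 the ergodic averages may fail to satisfy a central limit theorem.

```python
import math
import random


def independence_sampler(log_target, log_proposal, sample_proposal, n_steps, x0):
    """Metropolis-Hastings independence sampler (illustrative sketch).

    Proposals are drawn independently of the current state. A rejected
    proposal leaves the chain where it is, which is exactly the
    "remaining in the current state" event the CLT conditions concern.
    """
    x = x0
    chain = [x]
    for _ in range(n_steps):
        y = sample_proposal()
        # Acceptance ratio for an independence proposal q:
        #   a(x, y) = [pi(y) q(x)] / [pi(x) q(y)]
        log_a = (log_target(y) - log_target(x)) + (log_proposal(x) - log_proposal(y))
        if random.random() < math.exp(min(0.0, log_a)):
            x = y  # accept the proposal
        # else: reject, and the chain stays at x for this step
        chain.append(x)
    return chain
```

For instance, with a standard normal target and an N(0, 4) proposal (a heavier-tailed proposal, so the importance ratio pi/q stays bounded), the sampler mixes well; a thinner-tailed proposal would make the acceptance probability degenerate in the tails, which is the kind of pathology the paper's examples exhibit.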

Type
Short Communications
Copyright
Copyright © Applied Probability Trust 1999 

