
Stability of Markovian processes I: criteria for discrete-time chains

Published online by Cambridge University Press: 01 July 2016

Sean P. Meyn* (University of Illinois)
R. L. Tweedie (Bond University)

* Postal address: Coordinated Science Laboratory, University of Illinois, 1101 West Springfield Ave, Urbana, IL 61801, USA.

Abstract

In this paper we connect various topological and probabilistic forms of stability for discrete-time Markov chains. These include tightness on the one hand and Harris recurrence and ergodicity on the other. We show that these concepts of stability are largely equivalent for a major class of chains (chains with continuous components), or if the state space has a sufficiently rich class of appropriate sets (‘petite sets’).

We use a discrete formulation of Dynkin's formula to establish unified criteria for these stability concepts, through bounding of moments of first entrance times to petite sets. This gives a generalization of Lyapunov–Foster criteria for the various stability conditions to hold. Under these criteria, ergodic theorems are shown to be valid even in the non-irreducible case. These results allow a more general test function approach for determining rates of convergence of the underlying distributions of a Markov chain, and provide strong mixing results and new versions of the central limit theorem and the law of the iterated logarithm.
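
For orientation, here is a minimal sketch of the kind of criterion the abstract alludes to, written in standard textbook notation rather than the paper's own (the symbols P, V, C, b, Φ and τ_C below are generic, not quoted from this article). For a chain Φ on a state space X with transition kernel P, a measurable function V ≥ 0, a set C and a constant b < ∞, the prototypical Foster–Lyapunov drift condition is

\[
  \Delta V(x) \;:=\; \int_X P(x,\mathrm{d}y)\,V(y) \;-\; V(x) \;\le\; -1 + b\,\mathbf{1}_C(x), \qquad x \in X.
\]

Writing \(\tau_C := \min\{n \ge 1 : \Phi_n \in C\}\) for the first entrance time to C, a discrete form of Dynkin's formula gives, under suitable integrability conditions,

\[
  \mathsf{E}_x\bigl[V(\Phi_{\tau_C})\bigr] \;=\; V(x) \;+\; \mathsf{E}_x\Bigl[\,\sum_{k=0}^{\tau_C-1} \Delta V(\Phi_k)\Bigr],
\]

and combining the two yields the moment bound \(\mathsf{E}_x[\tau_C] \le V(x) + b\). Bounds of this type on first entrance times to petite sets are the kind of criterion described above.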

Type: Research Article
Copyright © Applied Probability Trust 1992


Footnotes

** Present address: Department of Statistics, Colorado State University, Fort Collins, CO 80523, USA.

This work was begun while the first author was visiting Bond University and was developed there, at the Australian National University, and at the University of Illinois. Research supported in part by NSF initiation grant No. ECS 8910088.
