In this paper we connect various topological and probabilistic forms of stability for discrete-time Markov chains. These include tightness on the one hand and Harris recurrence and ergodicity on the other. We show that these concepts of stability are largely equivalent for a major class of chains (chains with continuous components), or for chains whose state space admits a sufficiently rich class of appropriate sets (‘petite sets'); the usual form of the petite-set condition is sketched below.
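For orientation only (the abstract itself does not fix notation), a set C is commonly called petite when, for some sampling distribution a on the non-negative integers and some non-trivial measure ν_a, the chain sampled according to a is minorized uniformly over C. With P^n denoting the n-step transition kernel, a sketch of that standard condition is
\[
\sum_{n \ge 0} a(n)\, P^{n}(x, B) \;\ge\; \nu_a(B), \qquad x \in C,\ B \in \mathcal{B}(\mathsf{X}).
\]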
We use a discrete formulation of Dynkin's formula to establish unified criteria for these stability concepts, by bounding the moments of first entrance times to petite sets. This gives a generalization of Lyapunov–Foster criteria for the various stability conditions to hold. Under these criteria, ergodic theorems are shown to be valid even in the non-irreducible case. These results allow a more general test function approach for determining rates of convergence of the underlying distributions of a Markov chain, and provide strong mixing results and new versions of the central limit theorem and the law of the iterated logarithm.
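To indicate the flavour of these criteria (the symbols V, f, b, C and τ_C below are illustrative, not the paper's fixed notation), a typical Foster–Lyapunov drift condition and the discrete form of Dynkin's formula linking it to entrance-time moments can be sketched as
\[
\Delta V(x) \;:=\; \int P(x, dy)\, V(y) \;-\; V(x) \;\le\; -f(x) + b\,\mathbf{1}_C(x),
\]
\[
\mathsf{E}_x\big[ V(\Phi_{\tau \wedge n}) \big] \;=\; V(x) \;+\; \mathsf{E}_x\Big[ \sum_{k=0}^{\tau \wedge n - 1} \Delta V(\Phi_k) \Big],
\]
so that, taking τ = τ_C to be the first entrance time to a petite set C and f, V ≥ 0, one obtains moment bounds of the form \(\mathsf{E}_x\big[\sum_{k=0}^{\tau_C - 1} f(\Phi_k)\big] \le V(x) + b\), which is the kind of bound on entrance times referred to above.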