Consider a Markov chain with a finite number of states E_1, …, E_n and a transition matrix (P_ij) of probabilities of transition from state E_j to state E_i, such that after some fixed finite number of steps any state is accessible from any initial state. Such a chain is said to be completely regular, and it is easy to show that, starting from any initial state or distribution over states, the probabilities p_i(t) that the system is in state E_i after t steps converge to non-zero quantities p_i which are independent of the initial state. This condition of accessibility corresponds, in statistical mechanics, to the condition of ‘metrical transitivity’.
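As a concrete illustration of this convergence, the sketch below iterates the evolution p(t+1) = P p(t) for a hypothetical three-state chain. The matrix and the two initial distributions are illustrative assumptions, not taken from the text; following the convention above, P[i, j] is the probability of passing from state E_j to state E_i, so each column of P sums to one.

```python
# Minimal numerical sketch of convergence to the limiting probabilities p_i,
# assuming a hypothetical 3-state completely regular chain (illustrative values).
import numpy as np

# Column-stochastic transition matrix: P[i, j] = Pr(E_j -> E_i).
# Every state is accessible from every other, so the chain is completely regular.
P = np.array([
    [0.5, 0.2, 0.3],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
])

# Two different initial distributions over the states E_1, E_2, E_3.
p_a = np.array([1.0, 0.0, 0.0])   # start surely in E_1
p_b = np.array([0.1, 0.2, 0.7])   # an arbitrary mixed start

# Iterate p(t+1) = P p(t) for a fixed number of steps.
for t in range(50):
    p_a = P @ p_a
    p_b = P @ p_b

# Both runs approach the same non-zero quantities p_i,
# independent of the initial state or distribution.
print(np.round(p_a, 6))
print(np.round(p_b, 6))
```

Running the sketch, both printed vectors agree to the displayed precision, which is the sense in which the p_i(t) become independent of the starting condition.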