
On the time a Markov chain spends in a lumped state

Published online by Cambridge University Press:  14 July 2016

Tommy Norberg*
Affiliation: Chalmers University

* Postal address: Department of Mathematics, Chalmers University of Technology and Göteborg University, S-41296 Göteborg, Sweden.

Abstract

The sojourn time that a Markov chain spends in a subset E of its state space has a distribution that depends on the hitting distribution on E and the probabilities (resp. rates in the continuous-time case) that govern the transitions within E. In this note we characterise the set of all hitting distributions for which the sojourn time distribution is geometric (resp. exponential).
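The discrete-time case can be illustrated numerically. Write E for the lumped subset, P_EE for the sub-stochastic matrix of transition probabilities within E, and α for the hitting distribution on E; then P(T > n) = α P_EE^n 1, where T is the sojourn time in E. The sketch below (with an illustrative 2×2 matrix, not taken from the paper) checks one direction of the characterisation: when α is the normalised left Perron eigenvector of P_EE, the sojourn time is geometric with ratio equal to the Perron root ρ.

```python
import numpy as np

# Illustrative sub-stochastic matrix of transitions *within* a lumped
# subset E = {0, 1} of some larger chain (values are assumptions).
P_EE = np.array([[0.5, 0.2],
                 [0.1, 0.6]])

def survival(alpha, n):
    # Survival function of the sojourn time T in E for a hitting
    # distribution alpha on E:  P(T > n) = alpha @ P_EE^n @ 1.
    return alpha @ np.linalg.matrix_power(P_EE, n) @ np.ones(2)

# Left Perron eigenvector of P_EE, normalised to a probability vector.
eigvals, eigvecs = np.linalg.eig(P_EE.T)
i = np.argmax(eigvals.real)
rho = eigvals[i].real                      # Perron root of P_EE
alpha = np.abs(eigvecs[:, i].real)
alpha /= alpha.sum()

# For this alpha, P(T > n) = rho**n, i.e. T is geometric.
for n in range(5):
    assert abs(survival(alpha, n) - rho ** n) < 1e-12
```

For other hitting distributions the survival function is generally a mixture of powers of the eigenvalues of P_EE, hence not geometric; the note characterises exactly which hitting distributions yield a geometric (resp. exponential) sojourn time.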

Type: Research Article

Copyright © Applied Probability Trust 1997


Footnotes

Research supported in part by the Swedish Natural Science Research Council.

Partly completed while visiting the Center for Stochastic Processes, University of North Carolina at Chapel Hill.
