
A NOTE ON RÉNYI'S ENTROPY RATE FOR TIME-INHOMOGENEOUS MARKOV CHAINS

Published online by Cambridge University Press:  05 December 2018

Wenxi Li
Affiliation:
School of Mathematics and Physics, Anhui University of Technology, Ma'anshan, 243002, China. E-mail: [email protected]
Zhongzhi Wang
Affiliation:
School of Mathematics and Physics, Anhui University of Technology, Ma'anshan, 243002, China. E-mail: [email protected]

Abstract

In this note, we use the Perron–Frobenius theorem to obtain the Rényi entropy rate for a time-inhomogeneous Markov chain whose transition matrices converge to a primitive matrix. As direct corollaries, we also obtain the Rényi entropy rate for an asymptotic circular Markov chain and the Rényi divergence rate between two time-inhomogeneous Markov chains.
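The computation underlying the homogeneous case can be sketched numerically. For a finite chain with primitive transition matrix P, the Rényi entropy rate of order α (α ≠ 1) equals log λ(P_α) / (1 − α), where λ(P_α) is the Perron–Frobenius root of the matrix with entries p_ij^α; as α → 1 this recovers the Shannon entropy rate (this is the Rached–Alajaji–Campbell formula for finite-alphabet Markov sources, not the inhomogeneous result of the present note). A minimal sketch, with an illustrative 2-state chain chosen here as an example:

```python
import numpy as np

def renyi_entropy_rate(P, alpha):
    """Renyi entropy rate (in nats) of order alpha for a homogeneous
    Markov chain with primitive transition matrix P, via the Perron root
    of the entrywise power matrix P_alpha = (p_ij ** alpha)."""
    if abs(alpha - 1.0) < 1e-12:
        # alpha = 1: Shannon entropy rate -sum_i pi_i sum_j p_ij log p_ij,
        # with pi the stationary distribution (left Perron eigenvector).
        evals, evecs = np.linalg.eig(P.T)
        pi = np.real(evecs[:, np.argmax(np.real(evals))])
        pi = pi / pi.sum()
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(P > 0, P * np.log(P), 0.0)
        return -float(pi @ terms.sum(axis=1))
    # Perron-Frobenius root = spectral radius of the primitive matrix P_alpha
    P_alpha = P ** alpha
    lam = max(abs(np.linalg.eigvals(P_alpha)))
    return float(np.log(lam) / (1.0 - alpha))

# Illustrative 2-state primitive chain
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
h_half = renyi_entropy_rate(P, 0.5)      # order-1/2 rate
h_shannon = renyi_entropy_rate(P, 1.0)   # Shannon rate (alpha = 1 limit)
print(h_half, h_shannon)
```

Since the Rényi entropy is non-increasing in α, the order-1/2 rate exceeds the Shannon rate here; evaluating the formula at α close to 1 approaches the Shannon value, consistent with the limit.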

Type
Research Article
Copyright
Copyright © Cambridge University Press 2018 

