
ON GENERALIZED CUMULATIVE ENTROPIES

Published online by Cambridge University Press:  13 June 2016

Suchandan Kayal*
Affiliation:
Department of Mathematics, National Institute of Technology Rourkela, Rourkela-769008, India. E-mail: [email protected], [email protected]

Abstract

In this paper, we introduce a generalization of the cumulative entropy proposed by Di Crescenzo and Longobardi [8]. The new notion is related to lower records and the reversed relevation transform. A dynamic version of the proposed measure is also considered. Several properties of the generalized cumulative entropy (GCE) are studied, including the effect of linear transformations, a two-dimensional version, a normalized version, bounds, and stochastic ordering. Similar results are obtained for the dynamic GCE, and various relationships with other functions are derived. A class of distributions is introduced and several of its properties are studied. Finally, an empirical GCE is proposed to estimate the new information measure.
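
As a rough illustration of how a measure of this type can be estimated from a sample, the sketch below computes a plug-in, spacings-based estimate in Python. For order k = 1 it reduces to the empirical cumulative entropy of Di Crescenzo and Longobardi [8]; the order-k weighting (1/k!)(j/n)[-log(j/n)]^k is an assumed plug-in form for a generalized measure of this kind and need not coincide with the empirical GCE defined in the paper. The function name empirical_gce and the simulated exponential data are illustrative only.

import numpy as np
from math import factorial

def empirical_gce(sample, k=1):
    # Sort the sample to obtain the order statistics x_(1) <= ... <= x_(n).
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    j = np.arange(1, n)        # ranks 1, ..., n-1
    u = j / n                  # empirical distribution function at x_(j)
    spacings = np.diff(x)      # spacings x_(j+1) - x_(j)
    # Assumed order-k weighting; for k = 1 this is -u*log(u), giving the
    # empirical cumulative entropy of Di Crescenzo and Longobardi [8].
    weights = u * (-np.log(u)) ** k / factorial(k)
    return float(np.sum(spacings * weights))

# Illustrative usage on simulated standard exponential data.
rng = np.random.default_rng(0)
data = rng.exponential(size=1000)
print(empirical_gce(data, k=1))
print(empirical_gce(data, k=2))

Because the empirical distribution function is piecewise constant between order statistics, the plug-in integral collapses to a finite sum over sample spacings, which is what the sketch exploits.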

Type: Research Article

Copyright © Cambridge University Press 2016


References

1. Arnold, B.C., Balakrishnan, N., & Nagaraja, H.N. (1998). Records. New York: Wiley.
2. Asadi, M. & Zohrevand, Y. (2007). On the dynamic cumulative residual entropy. Journal of Statistical Planning and Inference 137: 1931–1941.
3. Belzunce, F., Navarro, J., Ruiz, J.M., & Aguila, Y.D. (2004). Some results on residual entropy function. Metrika 59: 147–161.
4. Block, H.W., Savits, T.H., & Singh, H. (1998). The reversed hazard rate function. Probability in the Engineering and Informational Sciences 12: 69–90.
5. Cover, T.M. & Thomas, J.A. (2006). Elements of information theory. New York: Wiley.
6. Di Crescenzo, A. (2000). Some results on the proportional reversed hazards model. Statistics and Probability Letters 50: 313–321.
7. Di Crescenzo, A. & Longobardi, M. (2002). Entropy-based measure of uncertainty in past lifetime distributions. Journal of Applied Probability 39: 434–440.
8. Di Crescenzo, A. & Longobardi, M. (2009). On cumulative entropies. Journal of Statistical Planning and Inference 139: 4072–4087.
9. Di Crescenzo, A. & Longobardi, M. (2012). Neuronal data analysis based on the empirical cumulative entropy. In Computer Aided Systems Theory – EUROCAST 2011. Berlin, Heidelberg: Springer, pp. 72–79.
10. Di Crescenzo, A. & Toomaj, A. (2015). Extensions of the past lifetime and its connections to the cumulative entropy. Journal of Applied Probability 52: 1156–1174.
11. Ebrahimi, N. (1996). How to measure uncertainty about residual life time. Sankhya 58(A): 48–57.
12. Ebrahimi, N. & Kirmani, S.N.U.A. (1996). Some results on ordering of survival functions through uncertainty. Statistics and Probability Letters 29: 167–176.
13. Ebrahimi, N. & Pellerey, F. (1995). New partial ordering of survival functions based on notion of uncertainty. Journal of Applied Probability 32: 202–211.
14. Gupta, R.C. & Gupta, R.D. (2007). Proportional reversed hazard rate model and its applications. Journal of Statistical Planning and Inference 137: 3525–3536.
15. Kapur, J.N. (1967). Generalized entropy of order α and type β. The Mathematics Seminar 4: 78–96.
16. Kass, R.E., Ventura, V., & Cai, C. (2003). Statistical smoothing of neural data. Network: Computation in Neural Systems 14: 5–15.
17. Krakowski, M. (1973). The relevation transform and a generalization of the Gamma distribution function. Revue Française d'Automatique, Informatique et Recherche Opérationnelle 7(2): 107–120.
18. Kundu, C., Nanda, A.K., & Maiti, S.S. (2010). Some distributional results through past entropy. Journal of Statistical Planning and Inference 140: 1280–1291.
19. Nanda, A.K. & Paul, P. (2006). Some results on generalized past entropy. Journal of Statistical Planning and Inference 136: 3659–3674.
20. Navarro, J., Del Aguila, Y., & Asadi, M. (2010). Some new results on the cumulative residual entropy. Journal of Statistical Planning and Inference 140: 310–322.
21. Psarrakos, G. & Navarro, J. (2013). Generalized cumulative residual entropy and record values. Metrika 76: 623–640.
22. Pyke, R. (1965). Spacings. Journal of the Royal Statistical Society, Series B 27: 395–449.
23. Rao, M. (2005). More on a new concept of entropy and information. Journal of Theoretical Probability 18: 967–981.
24. Rao, M., Chen, Y., Vemuri, B.C., & Wang, F. (2004). Cumulative residual entropy: a new measure of information. IEEE Transactions on Information Theory 50: 1220–1228.
25. Raqab, M.Z. & Asadi, M. (2010). Some results on the mean residual waiting time of records. Statistics 44: 493–504.
26. Renyi, A. (1961). On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1960. Berkeley: University of California Press, vol. 1, pp. 547–561.
27. Sachlas, A. & Papaioannou, T. (2014). Residual and past entropy in actuarial science and survival models. Methodology and Computing in Applied Probability 16: 79–99.
28. Sengupta, D. & Nanda, A.K. (1999). Log-concave and concave distributions in reliability. Naval Research Logistics 46: 419–433.
29. Shaked, M. & Shanthikumar, J.G. (2007). Stochastic orders. New York: Springer-Verlag.
30. Shannon, C. (1948). A mathematical theory of communication. Bell System Technical Journal 27: 379–423.
31. Tsallis, C. (1988). Possible generalization of Boltzmann–Gibbs statistics. Journal of Statistical Physics 52: 479–487.
32. Varma, R.S. (1966). Generalization of Renyi's entropy of order α. Journal of Mathematical Sciences 1: 34–48.
33. Wang, F. & Vemuri, B.C. (2007). Non-rigid multi-modal image registration using cross-cumulative residual entropy. International Journal of Computer Vision 74: 201–215.
34. Wang, F., Vemuri, B.C., Rao, M., & Chen, Y. (2003a). A new and robust information theoretic measure and its application to image alignment. In Proceedings of the 18th International Conference on Information Processing in Medical Imaging (IPMI '03), vol. 2732 of Lecture Notes in Computer Science. Ambleside, UK: Springer, July 2003, pp. 388–400.
35. Wang, F., Vemuri, B.C., Rao, M., & Chen, Y. (2003b). Cumulative residual entropy, a new measure of information & its application to image alignment. In Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV '03). IEEE Computer Society, vol. 1, pp. 548–553.