
ON CUMULATIVE RESIDUAL EXTROPY

Published online by Cambridge University Press: 17 May 2019

S. M. A. Jahanshahi
Affiliation:
Department of Statistics, University of Sistan and Baluchestan, Zahedan, Iran E-mail: [email protected]
H. Zarei
Affiliation:
Department of Statistics, University of Sistan and Baluchestan, Zahedan, Iran E-mail: [email protected]
A. H. Khammar
Affiliation:
Department of Statistics, University of Birjand, Iran

Abstract

Recently, Lad et al. [12] proposed an alternative measure of uncertainty called extropy, a complementary dual of entropy that has since received attention from researchers. In this article, we introduce an alternative measure of the uncertainty of a random variable, which we call cumulative residual extropy. This measure is based on the cumulative distribution function F. We study some properties of the proposed measure, including its estimation and applications. Finally, some numerical examples illustrating the theory are included.
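In the literature following Lad et al. [12], extropy is defined as J(X) = -(1/2) ∫ f²(x) dx, and the cumulative residual version is obtained by replacing the density f with the survival function, giving ξJ(X) = -(1/2) ∫₀^∞ (1 - F(x))² dx for a nonnegative random variable X. As a minimal illustration only (a sketch under that assumed definition, not the authors' estimator; the function name and sample size below are ours), the following Python snippet computes a plug-in estimate by integrating the squared empirical survival function:

```python
import numpy as np

def empirical_cre_extropy(sample):
    """Plug-in estimate of cumulative residual extropy,
    xi_J(X) = -(1/2) * int_0^inf (1 - F(x))^2 dx,
    using the empirical survival function of a nonnegative sample.
    (Illustrative sketch; not the estimator proposed in the paper.)"""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    # Empirical survival function is (n - i)/n on [x_(i), x_(i+1)).
    surv = (n - np.arange(1, n)) / n
    gaps = np.diff(x)          # widths of the intervals between order statistics
    head = x[0]                # on [0, x_(1)) the survival function equals 1
    return -0.5 * (head + np.sum(surv**2 * gaps))

# Example: standard exponential sample.
rng = np.random.default_rng(0)
print(empirical_cre_extropy(rng.exponential(size=100_000)))
```

For the standard exponential distribution, ∫₀^∞ e^{-2x} dx = 1/2, so ξJ(X) = -1/4; the simulated value should match this to within sampling error.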

Type: Research Article
Copyright © Cambridge University Press 2019


References

1. Alizadeh, R., Alizadeh, H., & Ebrahimi, A. (2014). An entropy test for the Rayleigh distribution and power comparison. Journal of Statistical Computation and Simulation 84: 151–158.
2. Arnold, B.C., Balakrishnan, N., & Nagaraja, H.N. (1992). A first course in order statistics. New York: John Wiley and Sons.
3. Asadi, M. & Zohrevand, Y. (2007). On the dynamic cumulative residual entropy. Journal of Statistical Planning and Inference 137: 1931–1941.
4. Balakrishnan, N., Leiva, V., Sanhueza, A., & Cabrera, E. (2009). Mixture inverse Gaussian distributions and its transformations, moments and applications. Statistics 43: 91–104.
5. Best, D.J., Rayner, J.C.W., & Thas, O. (2010). Easily applied tests of fit for the Rayleigh distribution. Sankhya B 72: 254–263.
6. Chhikara, R.S. & Folks, J.L. (1989). The inverse Gaussian distribution: theory, methodology, and applications. New York: Marcel Dekker.
7. David, H.A. & Nagaraja, H.N. (2003). Order statistics, 3rd ed. New York: Wiley.
8. Furuichi, S. & Mitroi, F.C. (2012). Mathematical inequalities for some divergences. Physica A 391: 388–400.
9. Gneiting, T. & Raftery, A.E. (2007). Strictly proper scoring rules, prediction, and estimation. Journal of the American Statistical Association 102: 359–378.
10. Gupta, R.C. & Gupta, R.D. (2007). Proportional reversed hazard rate model and its applications. Journal of Statistical Planning and Inference 137: 3525–3536.
11. Jaynes, E.T. (2003). Probability theory: the logic of science. Cambridge: Cambridge University Press.
12. Lad, F., Sanfilippo, G., & Agro, G. (2015). Extropy: complementary dual of entropy. Statistical Science 30: 40–58.
13. Qiu, G. (2017). The extropy of order statistics and record values. Statistics and Probability Letters 120: 52–60.
14. Qiu, G. & Jia, K. (2018a). The residual extropy of order statistics. Statistics and Probability Letters 133: 15–22.
15. Qiu, G. & Jia, K. (2018b). Extropy estimators with applications in testing uniformity. Journal of Nonparametric Statistics 30: 182–196.
16. Qiu, G., Wang, L., & Wang, X. (2018). On extropy properties of mixed systems. Probability in the Engineering and Informational Sciences 42: 1–16.
17. Ramsay, C.M. (1993). Loading gross premiums for risk without using utility theory. Transactions of the Society of Actuaries XLV: 305–349.
18. Rao, C.R. (1965). On discrete distributions arising out of methods of ascertainment. Sankhya A 27: 311–324.
19. Rao, C.R. (1985). Weighted distributions arising out of methods of ascertainment: what population does a sample represent? New York: Springer-Verlag, 543–569.
20. Rao, M., Chen, Y., Vemuri, B.C., & Wang, F. (2004). Cumulative residual entropy: a new measure of information. IEEE Transactions on Information Theory 50: 1220–1228.
21. Rothschild, M. & Stiglitz, J.E. (1970). Increasing risk: I. A definition. Journal of Economic Theory 2: 225–243.
22. Schechtman, E. & Yitzhaki, S. (1987). A measure of association based on Gini's mean difference. Communications in Statistics - Theory and Methods 16: 207–231.
23. Shaked, M. & Shanthikumar, J.G. (2007). Stochastic orders. New York: Springer-Verlag.
24. Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal 27: 379–423.
25. Vontobel, P.O. (2013). The Bethe permanent of a nonnegative matrix. IEEE Transactions on Information Theory 59: 1866–1901.
26. Wang, S. (1998). An actuarial index of the right-tail risk. North American Actuarial Journal 2: 88–101.
27. Yang, V. (2012). Study on cumulative residual entropy and variance as risk measure. In Fifth International Conference on Business Intelligence and Financial Engineering. IEEE, 4 pages.
28. Yang, J., Xia, W., & Hu, T. (2018). Bounds on extropy with variational distance constraint. Probability in the Engineering and Informational Sciences 33: 1–19.
29. Yitzhaki, S. (2003). Gini's mean difference: a superior measure of variability for non-normal distributions. Metron 61: 285–316.
30. Yitzhaki, S. & Schechtman, E. (2013). The Gini methodology: a primer on a statistical methodology. New York: Springer-Verlag.