
A DE BRUIJN'S IDENTITY FOR DEPENDENT RANDOM VARIABLES BASED ON COPULA THEORY

Published online by Cambridge University Press: 14 October 2015

Nayereh Bagheri Khoolenjani
Affiliation:
Department of Statistics, University of Isfahan, Isfahan 81746-73441, Iran. E-mail: [email protected]; [email protected].
Mohammad Hossein Alamatsaz
Affiliation:
Department of Statistics, University of Isfahan, Isfahan 81746-73441, Iran. E-mail: [email protected]; [email protected].

Abstract

De Bruijn's identity links two fundamental concepts in information theory: entropy and Fisher information. In the literature, De Bruijn's identity has been stated under the assumption that the input signal and the additive noise are independent. In the real world, however, the noise can be highly dependent on the main signal. The main aim of this paper is, first, to extend De Bruijn's identity to signal-dependent noise channels and, second, to study how the Stein and heat-equation identities are related to De Bruijn's identity. New versions of De Bruijn's identity are thus introduced for the case where the input signal and the additive noise are dependent and are jointly distributed according to Archimedean and Gaussian copulas. It is shown that, in this generalized model, the derivatives of the differential entropy can be expressed in terms of a function of the Fisher information. Our results subsume the conventional De Bruijn identity as special cases, noted as remarks. We then establish the equivalence among the new De Bruijn-type identity, Stein's identity, and the heat-equation identity. The paper concludes with an application of the developed results in information theory.
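For orientation, the conventional De Bruijn identity referred to above (the independent-noise case that the paper generalizes) admits the following standard statement; the notation h for differential entropy, J for Fisher information, and the Gaussian perturbation X + √t Z are standard conventions assumed here rather than taken from the paper itself:

\[
\frac{d}{dt}\, h\bigl(X + \sqrt{t}\,Z\bigr) \;=\; \frac{1}{2}\, J\bigl(X + \sqrt{t}\,Z\bigr), \qquad Z \sim N(0,1)\ \text{independent of}\ X,
\]

where \( h(Y) = -\int f_Y(y)\,\log f_Y(y)\,dy \) is the differential entropy and \( J(Y) = \int f_Y(y)\,\bigl(\partial_y \log f_Y(y)\bigr)^2\,dy \) is the Fisher information of Y. The copula coupling used in the dependent case rests on Sklar's theorem: the joint distribution of the signal X and noise N can be written as \( F_{X,N}(x,n) = C\bigl(F_X(x), F_N(n)\bigr) \) for some copula C, with the Gaussian and Archimedean families supplying the particular forms of C studied in the paper.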

Type
Research Article
Copyright
Copyright © Cambridge University Press 2015 
