
Concentration functions and entropy bounds for discrete log-concave distributions

Published online by Cambridge University Press: 27 May 2021

Sergey G. Bobkov
Affiliation:
School of Mathematics, University of Minnesota, Minneapolis, MN 55455, USA
Arnaud Marsiglietti*
Affiliation:
University of Florida, Department of Mathematics, Gainesville, FL 32611, USA
James Melbourne
Affiliation:
Department of Electrical and Computer Engineering, University of Minnesota, Minneapolis, MN 55455, USA
*Corresponding author. Email: [email protected]

Abstract

Two-sided bounds are explored for concentration functions and Rényi entropies in the class of discrete log-concave probability distributions. They are used to derive certain variants of the entropy power inequality.
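
For reference, the quantities named in the abstract are recalled below in their standard forms; these are the usual definitions in this literature, and the paper's own normalizations may differ in inessential ways. A probability mass function p on the integers is log-concave if its support is an integer interval and

\[ p(k)^2 \ \ge\ p(k-1)\,p(k+1) \quad \text{for all } k \in \mathbb{Z}. \]

The concentration function of a random variable X is

\[ Q(X;\lambda) \ =\ \sup_{x \in \mathbb{R}} \mathbb{P}(x \le X \le x + \lambda), \qquad \lambda \ge 0, \]

and the Rényi entropy of order \alpha \in (0,1)\cup(1,\infty) of a discrete X with mass function p is

\[ H_\alpha(X) \ =\ \frac{1}{1-\alpha}\,\log \sum_{k} p(k)^\alpha, \]

with the limiting cases H_1(X) = -\sum_k p(k)\log p(k) (Shannon entropy) and H_\infty(X) = -\log \max_k p(k) (min-entropy). The classical entropy power inequality, discrete variants of which are the subject of the paper, states that N(X+Y) \ge N(X) + N(Y) for independent continuous X, Y, where N(X) = \frac{1}{2\pi e}\, e^{2H_1(X)} is the entropy power.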

Type
Paper
Copyright
© The Author(s), 2021. Published by Cambridge University Press

References

Baillon, J.-B., Cominetti, R. and Vaisman, J. (2016) A sharp uniform bound for the distribution of sums of Bernoulli trials. Combin. Probab. Comput. 25 352–361.
Bobkov, S. G. and Chistyakov, G. P. (2012) Bounds for the maximum of the density of the sum of independent random variables. (Russian) Zap. Nauchn. Sem. S.-Peterburg. Otdel. Mat. Inst. Steklov. (POMI) 408, Veroyatnost i Statistika 18 62–73, 324; translation in J. Math. Sci. (N.Y.) 199 (2014) 100–106.
Bobkov, S. G. and Chistyakov, G. P. (2015) Entropy power inequality for the Rényi entropy. IEEE Trans. Inform. Theory 61 708–714.
Bobkov, S. G. and Chistyakov, G. P. (2015) On concentration functions of random variables. J. Theor. Probab. 28 976–988.
Bobkov, S. G. and Marsiglietti, A. (2017) Variants of the entropy power inequality. IEEE Trans. Inform. Theory 63 7747–7752.
Costa, J., Hero, A. and Vignat, C. (2003) On solutions to multivariate maximum α-entropy problems. In International Workshop on Energy Minimization Methods in Computer Vision and Pattern Recognition, EMMCVPR 2003, pp. 211–226.
Johnson, O. and Goldschmidt, C. (2006) Preservation of log-concavity on summation. ESAIM Probab. Stat. 10 206–215 (electronic).
Johnson, O. and Yu, Y. (2010) Monotonicity, thinning, and discrete versions of the entropy power inequality. IEEE Trans. Inform. Theory 56 5387–5395.
Haghighatshoar, S., Abbe, E. and Telatar, I. E. (2014) A new entropy power inequality for integer-valued random variables. IEEE Trans. Inform. Theory 60 3787–3796.
Harremoës, P. and Vignat, C. (2003) An entropy power inequality for the binomial family. J. Inequal. Pure Appl. Math. 4 115.
Hoggar, S. G. (1974) Chromatic polynomials and logarithmic concavity. J. Combin. Theory Ser. B 16 248–254.
Kapur, J. N. (1988) Generalised Cauchy and Student's distributions as maximum-entropy distributions. Proc. Nat. Acad. Sci. India Sect. A 58 235–246.
Li, J. (2018) Rényi entropy power inequality and a reverse. Studia Math. 242 303–319.
Li, J., Marsiglietti, A. and Melbourne, J. (2020) Further investigations of Rényi entropy power inequalities and an entropic characterization of s-concave densities. In Geometric Aspects of Functional Analysis: GAFA Israel Seminar (2017–2019), Lecture Notes in Mathematics (Klartag, B. and Milman, E., eds), Springer.
Madiman, M. and Barron, A. R. (2007) Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inform. Theory 53 2317–2329.
Madiman, M., Melbourne, J. and Xu, P. (2017) Rogozin's convolution inequality for locally compact groups. arXiv:1705.00642.
Madiman, M., Wang, L. and Woo, J. O. (2019) Majorization and Rényi entropy inequalities via Sperner theory. Discrete Math. 342 2911–2923 (AEGT 2017 special issue, edited by S. Cioaba, R. Coulter, E. Fiorini, Q. Xiang and F. Pfender).
Marsiglietti, A. and Melbourne, J. (2019) On the entropy power inequality for the Rényi entropy of order [0, 1]. IEEE Trans. Inform. Theory 65 1387–1396.
Melbourne, J. and Tkocz, T. (2020) On the Rényi entropy of log-concave sequences. In IEEE International Symposium on Information Theory (ISIT), pp. 2292–2296.
Melbourne, J. and Tkocz, T. (2020) Reversal of Rényi entropy inequalities under log-concavity. IEEE Trans. Inform. Theory 67 45–51.
Madiman, M., Melbourne, J. and Roberto, C. (2020) Bernoulli sums and Rényi entropy inequalities. Preprint.
Moriguti, S. (1952) A lower bound for a probability moment of any absolutely continuous distribution with finite variance. Ann. Math. Statistics 23 286–289.
Petrov, V. V. (1975) Sums of Independent Random Variables. Ergebnisse der Mathematik und ihrer Grenzgebiete, Band 82 (translated from the Russian by A. A. Brown), Springer-Verlag, New York-Heidelberg, x+346 pp. Russian ed.: Moscow, Nauka, 1972, 414 pp.
Petrov, V. V. (1987) Limit Theorems for Sums of Independent Random Variables. (Russian) Moscow, Nauka, 320 pp.
Ram, E. and Sason, I. (2016) On Rényi entropy power inequalities. IEEE Trans. Inform. Theory 62 6800–6815.
Stanley, R. (1989) Log-concave and unimodal sequences in algebra, combinatorics, and geometry. Ann. New York Acad. Sci. 576 500–535.