
RATE OF CONVERGENCE AND EXPANSION OF RÉNYI ENTROPIC CENTRAL LIMIT THEOREM

  • Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences and University of Chinese Academy of Sciences, Wuhan 430071, China
  • Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences, Wuhan 430071, China
  • National Center for Mathematics and Interdisciplinary Sciences, Chinese Academy of Sciences, Beijing 100190, China

Received date: 2014-02-27

  Revised date: 2014-10-13

  Online published: 2015-01-20

Supported by

This work is supported by the National Basic Research Program of China (973 Program) (2011CB707802, 2013CB910200) and the National Natural Science Foundation of China (11126180).

Abstract

We obtain the expansion of the Rényi divergence of order α (0 < α < 1) between the normalized sum of IID continuous random variables and the Gaussian limit under minimal moment conditions, via an Edgeworth-type expansion. The rate is faster than that in the Shannon case, which can be used to improve the rate of convergence in the total variation norm.

Cite this article

SUN Jian Qiang, DING Yi Ming. RATE OF CONVERGENCE AND EXPANSION OF RÉNYI ENTROPIC CENTRAL LIMIT THEOREM[J]. Acta Mathematica Scientia, Series B, 2015, 35(1): 79-88. DOI: 10.1016/S0252-9602(14)60140-5

References

[1] Artstein S, Ball K M, Barthe F, et al. Solution of Shannon's problem on the monotonicity of entropy. J Amer Math Soc, 2004, 17: 975–982
[2] Artstein S, Ball K M, Barthe F, et al. On the rate of convergence in the entropic central limit theorem. Probab Theory Relat Fields, 2004, 129: 381–390
[3] Aubrun G, Szarek S, Werner E. Nonadditivity of Rényi entropy and Dvoretzky's theorem. J Math Phys, 2010, 51: 022102
[4] Barron A R. Entropy and the central limit theorem. Ann Probab, 1986, 14: 336–342
[5] Bhattacharya R N, Ranga Rao R. Normal Approximation and Asymptotic Expansions. New York: John Wiley & Sons, Inc, 1976
[6] Bobkov S G, Chistyakov G P, Götze F. Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem. Ann Probab, 2013, 41(4): 2479–2512
[7] van Erven T, Harremoës P. Rényi divergence and Kullback-Leibler divergence. 2012, arXiv: 1206.2459
[8] Johnson O. Information Theory and the Central Limit Theorem. Imperial College Press, 2004
[9] Johnson O, Barron A. Fisher information inequalities and the central limit theorem. Probab Theory Relat Fields, 2004, 129: 391–409
[10] Johnson O, Vignat C. Some results concerning maximum Rényi entropy distributions. Ann Inst H Poincaré Probab Statist, 2007, 43: 339–351
[11] Linnik J V. An information theoretic proof of the central limit theorem with Lindeberg conditions. Theory Probab Appl, 1959, 4: 288–299
[12] Lutwak E, Yang D, Zhang G. Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized Fisher information. IEEE Trans Inform Theory, 2005, 51: 473–478
[13] Madiman M, Barron A R. Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans Inform Theory, 2007, 53: 2317–2329
[14] Petrov V V. Sums of Independent Random Variables. Springer-Verlag, 1975
[15] Rényi A. On measures of information and entropy//Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability. 1961: 547–561
[16] Shannon C E, Weaver W. The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press, 1949
[17] Tulino A M, Verdu S. Monotonic decrease of the non-Gaussianness of the sum of independent random variables: a simple proof. IEEE Trans Inform Theory, 2006, 52: 4295–4297
