C. Klaassen (1985)
On an Inequality of Chernoff. Annals of Probability, 13
Robert Ackermann (2003)
Statistical Inference. Technometrics, 45
(1982)
A proof of the Central Limit Theorem motivated by the Cramér-Rao inequality
A. Holevo (2003)
Asymptotic Estimation of a Shift Parameter of a Quantum State. Theory of Probability and Its Applications, 49
L. Gross (1975)
Logarithmic Sobolev Inequalities. American Journal of Mathematics, 97
W. Hoeffding, B. Gnedenko, A. Kolmogorov, K. Chung, J. Doob (1955)
Limit Distributions for Sums of Independent Random Variables
O. Johnson (2000)
Entropy inequalities and the Central Limit Theorem. Stochastic Processes and their Applications, 88
H. Chernoff (1981)
A Note on an Inequality Involving the Normal Distribution. Annals of Probability, 9
V. Fabian, J. Hannan (1977)
On the Cramér-Rao Inequality. Annals of Statistics, 5
J. Linnik (1959)
An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions. Theory of Probability and Its Applications, 4
O. Johnson, Y. Suhov (2001)
Entropy and Random Vectors. Journal of Statistical Physics, 104
N. Blachman (1965)
The convolution inequality for entropy powers. IEEE Transactions on Information Theory, 11
E. Ziegel, E. Lehmann, G. Casella (1950)
Theory of Point Estimation
J. Nash (1958)
Continuity of Solutions of Parabolic and Elliptic Equations. American Journal of Mathematics, 80
R. Shimizu (1975)
On Fisher's Amount of Information for Location Family
(1998)
Theory of Point Estimation. Springer Texts in Statistics. Springer-Verlag, New York, second edition
A. Stam (1959)
Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon. Information and Control, 2
T. Cacoullos (1982)
On Upper and Lower Bounds for the Variance of a Function of a Random Variable. Annals of Probability, 10
L. Brown, L. Gajek (1990)
Information Inequalities for the Bayes Risk. Annals of Statistics, 18
K. Ball, F. Barthe, A. Naor (2003)
Entropy jumps in the presence of a spectral gap. Duke Mathematical Journal, 119
A. Barron (1986)
Entropy and the Central Limit Theorem. Annals of Probability, 14
A. Borovkov, S. Utev (1984)
On an Inequality and a Related Characterization of the Normal Distribution. Theory of Probability and Its Applications, 28
We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L² spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of Barron and Brown. We show that if the standardized Fisher information ever becomes finite, then it converges to zero.
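The O(1/n) decay described in the abstract can be seen explicitly in a worked example that is not from the paper itself: for i.i.d. Exp(1) variables, the sum S_n is Gamma(n, 1), whose Fisher information is 1/(n-2) in closed form; the standardized sum Y_n = (S_n - n)/√n then has I(Y_n) = n/(n-2) (using I(aX+b) = I(X)/a²), so the standardized Fisher information J(Y_n) = I(Y_n) - 1 = 2/(n-2), an exact O(1/n) rate. A minimal sketch, assuming the Gamma example above (the function name `gamma_fisher_numeric` is my own), checks the closed form by a crude Riemann sum of I = ∫ (f'/f)² f dx:

```python
import numpy as np
from math import lgamma

def gamma_fisher_numeric(n, grid_pts=200_001):
    """Numerically integrate the Fisher information of Gamma(n, 1)."""
    # Grid covering essentially all of the Gamma(n, 1) mass.
    x = np.linspace(1e-6, n + 20 * np.sqrt(n), grid_pts)
    # log density of Gamma(n, 1): (n-1) log x - x - log Gamma(n)
    log_f = (n - 1) * np.log(x) - x - lgamma(n)
    f = np.exp(log_f)
    score = (n - 1) / x - 1.0            # (log f)'(x)
    # Riemann-sum approximation of I = E[score(X)^2]
    return float(np.sum(score**2 * f) * (x[1] - x[0]))

if __name__ == "__main__":
    for n in (10, 50, 200):
        # I(Y_n) = n * I(S_n); standardized Fisher information J = I(Y_n) - 1
        J = n * gamma_fisher_numeric(n) - 1.0
        print(f"n={n:4d}  J(Y_n)={J:.5f}  2/(n-2)={2/(n-2):.5f}")
```

The printed columns agree, confirming that J(Y_n) = 2/(n-2) for this family, i.e. n·J(Y_n) → 2, a concrete instance of the O(1/n) convergence rate the paper establishes under Poincaré-type conditions.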
Probability Theory and Related Fields – Springer Journals
Published: Apr 29, 2004