Richard Nickl, B. Pötscher (2007)
Bracketing Metric Entropy Rates and Empirical Central Limit Theorems for Function Classes of Besov- and Sobolev-Type. Journal of Theoretical Probability, 20
Hideatsu Tsukahara (2009)
[Book review] Aad W. van der Vaart and Jon A. Wellner: Weak Convergence and Empirical Processes: With Applications to Statistics, Springer, 1996, xvi + 508 pages. 61
J. Doob (1953)
Stochastic processes
Fuchang Gao, J. Wellner (2012)
Global rates of convergence of the MLE for multivariate interval censoring. Electronic Journal of Statistics, 7
Fuchang Gao (2008)
Entropy Estimate for $k$-Monotone Functions via Small Ball Probability of Integrated Brownian Motions. Electronic Communications in Probability, 13
D. Bilyk, M. Lacey, A. Vagharshakyan (2007)
On the Small Ball Inequality in All Dimensions. arXiv: Classical Analysis and ODEs
Fuchang Gao, Wenbo Li, J. Wellner (2010)
How many Laplace transforms of probability measures are there? Proceedings of the American Mathematical Society, 138(12)
T. Dunker, W. Linde, T. Kühn, M. Lifshits (1999)
Metric Entropy of Integration Operators and Small Ball Probabilities for the Brownian Sheet. Journal of Approximation Theory, 101
Fuchang Gao, J. Wellner (2005)
Entropy estimate for high-dimensional monotonic functions. Journal of Multivariate Analysis, 98
William Chen (1980)
On irregularities of distribution. Mathematika, 27
A. van der Vaart (1994)
Bracketing smooth functions. Stochastic Processes and their Applications, 52
Fuchang Gao, J. Wellner (2009)
On the rate of convergence of the maximum likelihood estimator of a k-monotone density. Science in China Series A: Mathematics, 52
T. Mikosch, A. van der Vaart, J. Wellner (1996)
Weak Convergence and Empirical Processes: With Applications to Statistics
J. Kuelbs, Wenbo Li (1993)
Metric entropy and the small ball problem for Gaussian measures. Journal of Functional Analysis, 116
R. Blei, Fuchang Gao, Wenbo Li (2007)
Metric entropy of high dimensional distributions, 135
Let $\mathcal{F}_{d}$ be the class of probability distribution functions on $[0,1]^{d}$, $d \geq 2$. The following estimate for the bracketing entropy of $\mathcal{F}_{d}$ in the $L^{p}$ norm, $1 \leq p < \infty$, is obtained:
$$\log N_{[\,]}\bigl(\varepsilon, \mathcal{F}_{d}, \|\cdot\|_{p}\bigr) = O\bigl(\varepsilon^{-1}\,|\log \varepsilon|^{2(d-1)}\bigr).$$
Based on this estimate, a general relation between bracketing entropy in the $L^{p}$ norm and metric entropy in the $L^{1}$ norm for multivariate smooth functions is established.
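As a quick illustration (a direct substitution into the abstract's estimate, not a statement from the source), setting $d = 2$ shows how mildly the dimension enters the rate:

```latex
% Specializing the bracketing entropy bound to d = 2
% (my own substitution into the stated estimate, not from the source):
\[
  \log N_{[\,]}\bigl(\varepsilon, \mathcal{F}_{2}, \|\cdot\|_{p}\bigr)
    = O\bigl(\varepsilon^{-1}\,|\log \varepsilon|^{2}\bigr),
\]
% i.e. the univariate-style rate \varepsilon^{-1} is degraded only by the
% polylogarithmic factor |\log \varepsilon|^{2(d-1)}, here |\log \varepsilon|^{2}.
```

The exponent $2(d-1)$ vanishes at $d = 1$, so the polylogarithmic correction is the entire dimensional cost in this bound.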
Published: Apr 1, 2013
Keywords: Bracketing entropy; metric entropy; high dimensional distribution