Approximation by log-concave distributions, with applications to regression


Abstract

We study the approximation of arbitrary distributions P on d-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback-Leibler-type functional. We show that such an approximation exists if and only if P has finite first moments and is not supported by some hyperplane. Furthermore, we show that this approximation depends continuously on P with respect to the Mallows distance D1(·, ·). This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response Y = μ(X) + ε, where X and ε are independent, μ(·) belongs to a certain class of regression functions, and ε is a random error with log-concave density and mean zero. © Institute of Mathematical Statistics, 2011.
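As a rough illustration of the regression model described above, the following Python sketch simulates Y = μ(X) + ε with a Laplace error, a standard example of a log-concave, mean-zero error law, and measures the empirical Mallows distance D1 (the Wasserstein-1 distance) between the residuals and a fresh sample from the true error law. The regression function mu, the sample size, and the use of known μ are assumptions chosen for illustration only; this is not the estimator studied in the paper.

```python
# Illustrative sketch only: simulate the regression model Y = mu(X) + eps
# with a log-concave (Laplace) error and compare the empirical residual
# distribution to the true error law in Mallows D1 (Wasserstein-1) distance.
# The function mu and the sample size are hypothetical choices, not taken
# from the paper.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

def mu(x):
    # A hypothetical regression function for illustration.
    return 2.0 * x + np.sin(x)

n = 2000
X = rng.uniform(-3.0, 3.0, size=n)
eps = rng.laplace(loc=0.0, scale=1.0, size=n)  # Laplace density is log-concave with mean zero
Y = mu(X) + eps

# Here mu is treated as known so we can form residuals directly; in the
# paper, mu and the log-concave error density are estimated jointly.
residuals = Y - mu(X)

# Empirical Mallows D1 distance between the residuals and a fresh sample
# from the true error distribution (D1 coincides with Wasserstein-1).
reference = rng.laplace(loc=0.0, scale=1.0, size=n)
d1 = wasserstein_distance(residuals, reference)
print(f"empirical Mallows D1 between residuals and true error law: {d1:.4f}")
```

With growing sample size the printed distance shrinks toward zero, which is the kind of D1-convergence under which the paper's continuity result guarantees convergence of the log-concave approximation.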

Cite

APA

Dümbgen, L., Samworth, R., & Schuhmacher, D. (2011). Approximation by log-concave distributions, with applications to regression. Annals of Statistics, 39(2), 702–730. https://doi.org/10.1214/10-AOS853
