We study the approximation of arbitrary distributions P on d-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback-Leibler-type functional. We show that such an approximation exists if, and only if, P has finite first moments and is not supported by some hyperplane. Furthermore, we show that this approximation depends continuously on P with respect to Mallows distance D1(·, ·). This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response Y = μ(X) + ε, where X and ε are independent, μ(·) belongs to a certain class of regression functions, and ε is a random error with log-concave density and mean zero.
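For readers who want the objects behind this statement made explicit, the following is a minimal LaTeX sketch using standard definitions of the log-likelihood functional and of Mallows distance D1 (the Wasserstein-1 distance). The exact functional and conditions in the paper may differ in detail, so this is an illustration of the usual setup rather than a restatement of the authors' results.

% Sketch under standard definitions (not verbatim from the paper; requires amsmath, amssymb).
% The log-concave approximation f* of P is typically defined as a maximizer of the
% functional L(f, P) = \int \log f \, dP over all log-concave probability densities f.
% If P has a density p, maximizing L(f, P) is equivalent to minimizing the
% Kullback-Leibler divergence D(P || f) = \int p \log(p / f), which is why the
% abstract speaks of a "Kullback-Leibler-type" functional.
\[
  f^{*} \;\in\; \operatorname*{arg\,max}_{f \,\text{log-concave density on }\mathbb{R}^{d}}
    \int_{\mathbb{R}^{d}} \log f(x)\, P(dx),
  \qquad
  D_{1}(P, Q) \;=\; \inf_{X \sim P,\; Y \sim Q} \mathbb{E}\,\lVert X - Y \rVert .
\]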
Dümbgen, L., Samworth, R., & Schuhmacher, D. (2011). Approximation by log-concave distributions, with applications to regression. Annals of Statistics, 39(2), 702–730. https://doi.org/10.1214/10-AOS853