Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions

25 citations · 18 Mendeley readers

Abstract

We obtain estimation error rates and sharp oracle inequalities for regularization procedures of the form

$$\hat{f} \in \operatorname{argmin}_{f \in F} \left( \frac{1}{N} \sum_{i=1}^{N} \ell_f(X_i, Y_i) + \lambda \|f\| \right)$$

when $\|\cdot\|$ is any norm, $F$ is a convex class of functions and $\ell$ is a Lipschitz loss function satisfying a Bernstein condition over $F$. We explore both the bounded and sub-Gaussian stochastic frameworks for the distribution of the $f(X_i)$'s, with no assumption on the distribution of the $Y_i$'s. The general results rely on two main objects: a complexity function and a sparsity equation, which depend on the specific setting at hand (loss $\ell$ and norm $\|\cdot\|$). As a proof of concept, we obtain minimax rates of convergence in the following problems: (1) matrix completion with any Lipschitz loss function, including the hinge and logistic losses for the so-called 1-bit matrix completion instance of the problem, and quantile losses for the general case, which makes it possible to estimate any quantile of the entries of the matrix; (2) the logistic LASSO and variants such as the logistic SLOPE, as well as shape-constrained logistic regression; (3) kernel methods, where the loss is the hinge loss and the regularization function is the RKHS norm. © 2019 Institute of Mathematical Statistics.
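As an illustration of the estimator studied in the abstract, the following is a minimal sketch of one of its instances, the logistic LASSO from example (2): the loss is the (1-Lipschitz) logistic loss, the norm is the $\ell_1$ norm, and the minimization is carried out by proximal gradient descent (ISTA). The function name, step size, and iteration count are illustrative choices, not part of the paper; the paper concerns the statistical properties of the minimizer, not any particular optimization algorithm.

```python
import numpy as np

def logistic_lasso(X, y, lam, step=0.1, iters=500):
    """Sketch of the penalized procedure with logistic loss and l1 norm:
    minimize (1/N) * sum_i log(1 + exp(-y_i <x_i, w>)) + lam * ||w||_1,
    with labels y_i in {-1, +1}, via proximal gradient descent (ISTA)."""
    N, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = y * (X @ w)
        # gradient of the averaged logistic loss at w
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / N
        w = w - step * grad
        # soft-thresholding: proximal operator of w -> step * lam * ||w||_1
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

# Illustrative use on synthetic sparse data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
w_true = np.zeros(10)
w_true[:2] = [2.0, -2.0]
y = np.sign(X @ w_true + 0.1 * rng.normal(size=200))
w_hat = logistic_lasso(X, y, lam=0.05)
```

The soft-thresholding step is what produces the sparsity that the paper's "sparsity equation" quantifies: coordinates whose gradient signal falls below the threshold `step * lam` are set exactly to zero.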

Citation (APA)

Alquier, P., Cottet, V., & Lecué, G. (2019). Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions. Annals of Statistics, 47(4), 2117–2144. https://doi.org/10.1214/18-AOS1742
