Complexity regularization via localized random penalties


Abstract

This article studies model selection via penalized empirical loss minimization in nonparametric classification problems. Data-dependent penalties are constructed from estimates of the complexity of a small subclass of each model class, containing only those functions with small empirical loss. These penalties are novel in that the penalties considered in the literature are typically based on the entire model class. Oracle inequalities using the new penalties are established, and their advantage over penalties based on the complexity of the whole model class is demonstrated.
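As a schematic illustration (a sketch in generic notation, not the paper's exact definitions or constants), the penalized selection rule and the kind of oracle inequality at issue can be written as follows. Let $\hat L_n(f)$ denote the empirical loss of a classifier $f$ on $n$ training points, $L(f)$ its expected loss, and $\mathcal{F}_1, \mathcal{F}_2, \dots$ the model classes; the penalty $\mathrm{pen}_n(k)$, the localization radius $r$, and the constants $C$, $c$ below are placeholders.

\[
\hat f_k \in \arg\min_{f \in \mathcal{F}_k} \hat L_n(f),
\qquad
\hat k \in \arg\min_{k \ge 1} \bigl\{ \hat L_n(\hat f_k) + \mathrm{pen}_n(k) \bigr\},
\]
where the data-dependent penalty $\mathrm{pen}_n(k)$ estimates the complexity (for instance, a Rademacher average) of the localized subclass
\[
\hat{\mathcal{F}}_k(r) = \bigl\{ f \in \mathcal{F}_k : \hat L_n(f) \le r \bigr\}
\]
rather than of the whole class $\mathcal{F}_k$. Oracle inequalities for such a rule are typically of the form
\[
\mathbf{E}\, L\bigl(\hat f_{\hat k}\bigr) \;\le\; \min_{k \ge 1} \Bigl\{ \inf_{f \in \mathcal{F}_k} L(f) + C\, \mathbf{E}\, \mathrm{pen}_n(k) \Bigr\} + \frac{c}{\sqrt{n}},
\]
so that a smaller, localized penalty yields a sharper bound than one based on the complexity of all of $\mathcal{F}_k$.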

Citation (APA)

Lugosi, G., & Wegkamp, M. (2004). Complexity regularization via localized random penalties. Annals of Statistics, 32(4), 1679–1697. https://doi.org/10.1214/009053604000000463
