Localization of VC classes: Beyond local Rademacher complexities


Abstract

In statistical learning, the excess risk of empirical risk minimization (ERM) is controlled by $\left(\frac{\mathrm{COMP}_n(\mathcal{F})}{n}\right)^{\alpha}$, where $n$ is the size of the learning sample, $\mathrm{COMP}_n(\mathcal{F})$ is a complexity term associated with a given class $\mathcal{F}$, and $\alpha \in [1/2, 1]$ interpolates between slow and fast learning rates. In this paper we introduce an alternative localization approach for binary classification that leads to a novel complexity measure: fixed points of the local empirical entropy. We show that this complexity measure gives tight control over $\mathrm{COMP}_n(\mathcal{F})$ in the upper bounds under bounded noise. Our results are accompanied by a novel minimax lower bound that involves the same quantity. In particular, we practically answer the question of the optimality of ERM under bounded noise for general VC classes.
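
As an illustration (not drawn from the abstract itself), the interpolation can be instantiated for a VC class of dimension $d$, for which the complexity term is typically of order $d\log(n/d)$; the endpoints $\alpha = 1/2$ and $\alpha = 1$ then recover the familiar slow and fast rates. A minimal sketch, assuming this standard instantiation:

% Illustration only: COMP_n(F) \asymp d log(n/d) is an assumed, standard choice
% for a VC class of dimension d, not a quantity quoted from the paper.
\[
  \mathbb{E}\big[R(\hat f_{\mathrm{ERM}}) - R(f^\ast)\big]
  \;\lesssim\;
  \left(\frac{\mathrm{COMP}_n(\mathcal{F})}{n}\right)^{\alpha},
  \qquad \alpha \in \left[\tfrac{1}{2},\, 1\right],
\]
\[
  \alpha = \tfrac{1}{2}:\ \sqrt{\frac{d\log(n/d)}{n}}\ \ \text{(slow rate)},
  \qquad
  \alpha = 1:\ \frac{d\log(n/d)}{n}\ \ \text{(fast rate, e.g. under bounded noise)}.
\]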

Citation (APA)

Zhivotovskiy, N., & Hanneke, S. (2016). Localization of VC classes: Beyond local Rademacher complexities. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9925 LNAI, pp. 18–33). Springer Verlag. https://doi.org/10.1007/978-3-319-46379-7_2
