The true sample complexity of active learning

Abstract

We describe and explore a new perspective on the sample complexity of active learning. In many situations where it was generally believed that active learning does not help, we show that active learning does help in the limit, often with exponential improvements in sample complexity. This contrasts with the traditional analysis of active learning problems such as non-homogeneous linear separators or depth-limited decision trees, in which Ω(1/ε) lower bounds are common. Such lower bounds should be interpreted carefully; indeed, we prove that it is always possible to learn an ε-good classifier with a number of samples asymptotically smaller than this. These new insights arise from a subtle variation on the traditional definition of sample complexity, not previously recognized in the active learning literature. © 2010 The Author(s).
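To make the abstract's contrast concrete, below is a minimal sketch (not from the paper) of the textbook one-dimensional threshold class, the simplest setting exhibiting the kind of exponential improvement the authors refer to: a passive learner needs on the order of 1/ε random labels to reach error ε under a uniform marginal, while an active learner can binary-search the unit interval using only about log₂(1/ε) label queries. The names `active_learn_threshold` and `query_label` are illustrative choices, not anything defined in the paper.

```python
# Illustrative sketch: active learning of a 1-D threshold by binary search.
# Target concept: x -> 1 if x >= t else 0, for an unknown t in [0, 1].
# Uses ~log2(1/eps) label queries, versus ~1/eps labels for passive learning.

import math
import random


def active_learn_threshold(query_label, eps):
    """Binary-search for the threshold t to within eps using O(log 1/eps) labels."""
    lo, hi = 0.0, 1.0
    queries = 0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        queries += 1
        if query_label(mid) == 1:
            hi = mid  # threshold lies at or below mid
        else:
            lo = mid  # threshold lies above mid
    return (lo + hi) / 2, queries


if __name__ == "__main__":
    random.seed(0)
    true_threshold = random.random()
    oracle = lambda x: int(x >= true_threshold)

    eps = 1e-4
    estimate, n_queries = active_learn_threshold(oracle, eps)
    print(f"estimate={estimate:.6f}  true={true_threshold:.6f}  "
          f"labels used={n_queries} (vs ~{math.ceil(1 / eps)} for passive)")
```

The paper's point goes beyond this easy case: as the abstract states, even for classes where Ω(1/ε) lower bounds have traditionally been proven, an ε-good classifier can always be learned from asymptotically fewer samples once sample complexity is defined as in the paper.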

Cite

APA

Balcan, M. F., Hanneke, S., & Vaughan, J. W. (2010). The true sample complexity of active learning. Machine Learning, 80(2–3), 111–139. https://doi.org/10.1007/s10994-010-5174-y
