Noise-Adaptive Margin-Based Active Learning and Lower Bounds under Tsybakov Noise Condition

Abstract

We present a simple noise-robust margin-based active learning algorithm that finds homogeneous (passing through the origin) linear separators, and we analyze its error convergence when labels are corrupted by noise. We show that when the noise satisfies the Tsybakov low noise condition (Mammen, Tsybakov, and others 1999; Tsybakov 2004), the algorithm adapts to the unknown noise level and achieves the optimal statistical rate up to polylogarithmic factors. We also derive lower bounds for margin-based active learning algorithms under the Tsybakov noise condition (TNC) in the membership query synthesis scenario (Angluin 1988). Our result implies lower bounds for the stream-based selective sampling scenario (Cohn 1990) under TNC for some fairly simple data distributions. Quite surprisingly, we show that the sample complexity cannot be improved even when the underlying data distribution is as simple as the uniform distribution on the unit ball. Our proof constructs a well-separated hypothesis set on the d-dimensional unit ball along with carefully designed label distributions for the Tsybakov noise condition. Our analysis may provide insights for other forms of lower bounds as well.
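To make the idea concrete, the following is a minimal, hypothetical sketch of generic margin-based active learning for a homogeneous linear separator, not the paper's exact algorithm: after an initial random batch of label queries, each round queries only points inside a shrinking margin band around the current estimate and refits. The halving schedule, batch size, and the simple averaging estimator `w ∝ Σ y_i x_i` are illustrative assumptions.

```python
import numpy as np

def margin_based_active_learning(X, query_label, rounds=5, batch=50, seed=0):
    """Sketch of margin-based active learning for a linear separator
    through the origin.  `query_label(i)` returns the (possibly noisy)
    ±1 label of row i of X; labels are only requested for queried points."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    labeled_x, labeled_y = [], []

    def fit():
        # Averaging estimator w ∝ Σ y_i x_i; consistent for symmetric
        # distributions such as the uniform distribution on the unit ball.
        w = np.sum(np.array(labeled_y)[:, None] * np.array(labeled_x), axis=0)
        nrm = np.linalg.norm(w)
        return w / nrm if nrm > 0 else w

    # Round 0: query a random batch to obtain an initial direction.
    for i in rng.choice(n, size=min(batch, n), replace=False):
        labeled_x.append(X[i])
        labeled_y.append(query_label(i))
    w = fit()

    margin = 0.5
    for _ in range(rounds):
        # Query only points falling inside the current margin band.
        near = np.flatnonzero(np.abs(X @ w) <= margin)
        if near.size == 0:
            break
        for i in rng.choice(near, size=min(batch, near.size), replace=False):
            labeled_x.append(X[i])
            labeled_y.append(query_label(i))
        w = fit()
        margin /= 2.0  # shrink the band so queries concentrate near the boundary
    return w
```

The key property this illustrates is that label queries are spent only where the current hypothesis is uncertain (small |w·x|), which is what yields the improved label complexity relative to passive learning.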

Cite

CITATION STYLE

APA

Wang, Y., & Singh, A. (2016). Noise-adaptive margin-based active learning and lower bounds under Tsybakov noise condition. In 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (pp. 2180–2186). AAAI Press. https://doi.org/10.1609/aaai.v30i1.10206
