On the robustness of active learning


Abstract

Active Learning is concerned with identifying the most useful samples with which to train a Machine Learning algorithm. When applied correctly, it can be a powerful tool to counteract the immense data requirements of Artificial Neural Networks. However, we find that it is often applied without sufficient care and domain knowledge. As a consequence, unrealistic hopes are raised and the transfer of experimental results from one dataset to another becomes unnecessarily hard. In this work we analyse the robustness of different Active Learning methods with respect to classifier capacity, exchangeability and type, as well as hyperparameters and falsely labelled data. Experiments reveal possible biases towards the architecture used for sample selection, resulting in suboptimal performance for other classifiers. We further propose the new "Sum of Squared Logits" method based on the Simpson diversity index and investigate the effect of using the confusion matrix for class balancing during sample selection.
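The abstract does not give the exact formulation of the proposed "Sum of Squared Logits" score, but since it is described as based on the Simpson diversity index, a plausible sketch is to score each unlabelled sample by the sum of its squared softmax outputs and query the samples with the lowest score (a near-uniform prediction yields the minimum 1/K, a one-hot prediction yields 1.0). The function names and the use of softmax probabilities rather than raw logits are assumptions for illustration, not the paper's definition:

```python
import numpy as np

def sum_of_squared_scores(probs):
    """Simpson-index-style confidence score: sum_k p_k^2 per sample.

    Assumed interpretation: probs holds per-sample class probabilities.
    A uniform prediction gives the minimum 1/K (most uncertain);
    a one-hot prediction gives 1.0 (most confident).
    """
    probs = np.asarray(probs, dtype=float)
    return (probs ** 2).sum(axis=1)

def select_most_uncertain(probs, n):
    """Return indices of the n samples with the lowest confidence score,
    i.e. the candidates an active-learning loop would query next."""
    scores = sum_of_squared_scores(probs)
    return np.argsort(scores)[:n]

# Example: three unlabelled samples, 3-class softmax outputs.
probs = np.array([
    [0.98, 0.01, 0.01],   # confident -> high score
    [0.34, 0.33, 0.33],   # near-uniform -> lowest score, queried first
    [0.60, 0.30, 0.10],
])
print(select_most_uncertain(probs, 2))  # → [1 2]
```

Ranking by this score is equivalent in spirit to other diversity-index acquisition functions (e.g. entropy), but uses only squared sums, which are cheap to compute over large unlabelled pools.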

Citation (APA)

Hahn, L., Roese-Koerner, L., Cremer, P., Zimmermann, U., Maoz, O., & Kummert, A. (2019). On the robustness of active learning. In EPiC Series in Computing (Vol. 65, pp. 152–162). EasyChair. https://doi.org/10.29007/thws
