Extended Margin and Soft Balanced Strategies in Active Learning


Abstract

Active learning is gaining increasing interest in the computer vision community, especially for images. The most commonly used query strategy framework is uncertainty sampling, usually in a pool-based sampling scenario. In this paper we propose two query strategies for image classification under the uncertainty sampling framework, both improvements of existing techniques. The first strategy, called Extended Margin, incorporates all possible class labels when calculating the informativeness values of unlabeled instances. The second strategy, called the Soft Balanced approach, improves the recently published BAL method: we suggest a new final informativeness score that combines an uncertainty measure with a novel penalty metric. We used the least margin criterion for the former; the latter is calculated from the categorical penalty scores by soft assignment. We conducted experiments on 60 different test image sets, each a randomly selected subset of the Caltech101 image collection. The experiments were performed in an extended active learning environment, and the results show that Extended Margin outperforms the least margin approach and that the Soft Balanced method outperforms all other competitor methods.
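The abstract only sketches the two query strategies. As an illustration of pool-based uncertainty sampling, the following is a minimal NumPy sketch: `least_margin` is the standard least margin criterion (gap between the two highest class posteriors), while `extended_margin` is a hypothetical reading of how all class labels could contribute via down-weighted consecutive gaps; the weighting shown (and the `decay` parameter) is an illustrative assumption, not the paper's exact formula.

```python
import numpy as np

def least_margin(probs):
    """Least margin informativeness: 1 minus the gap between the two
    highest class posteriors (larger value = more uncertain)."""
    top2 = np.sort(probs)[-2:]
    return 1.0 - (top2[1] - top2[0])

def extended_margin(probs, decay=0.5):
    """Hypothetical extended margin: accumulate the gaps between all
    consecutive sorted posteriors, geometrically down-weighted, so every
    class label influences the score. Illustrative assumption only."""
    s = np.sort(probs)[::-1]           # posteriors in descending order
    gaps = s[:-1] - s[1:]              # consecutive gaps over all classes
    weights = decay ** np.arange(len(gaps))
    return 1.0 - float(np.dot(weights, gaps))

# Pool-based scenario: score every unlabeled instance, query the most
# informative one (highest score) for labeling.
pool = np.array([[0.50, 0.45, 0.05],   # ambiguous between two classes
                 [0.90, 0.05, 0.05]])  # confident prediction
scores = [extended_margin(p) for p in pool]
query_idx = int(np.argmax(scores))     # the ambiguous instance is queried
```

In a full active learning loop, the queried instance would be labeled by an oracle, moved from the pool to the training set, and the classifier retrained before the next query round.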

Citation (APA)

Papp, D., & Szűcs, G. (2018). Extended Margin and Soft Balanced Strategies in Active Learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11019 LNCS, pp. 69–81). Springer Verlag. https://doi.org/10.1007/978-3-319-98398-1_5
