Towards Label-Free Few-Shot Learning: How Far Can We Go?


Abstract

Few-shot learners aim to recognize new categories given only a small number of training samples. The core challenge is to avoid overfitting to the limited data while ensuring good generalization to novel classes. Existing literature makes use of vast amounts of annotated data by simply shifting the label requirement from novel classes to base classes. Since data annotation is time-consuming and costly, reducing the label requirement even further is an important goal. To that end, our paper presents a more challenging few-shot setting with almost no class label access. By leveraging self-supervision to learn image representations and similarity for classification at test time, we achieve competitive baselines while using almost zero (0–5) class labels. Compared to existing state-of-the-art approaches which use 60,000 labels, this is a four orders of magnitude (10,000 times) difference. This work is a step towards developing few-shot learning methods that do not depend on annotated data. Our code is publicly released at https://github.com/adbugger/FewShot. (This work was supported by the IMPRINT program.)
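The abstract's test-time recipe (a frozen self-supervised encoder plus similarity-based classification over the few labeled shots) can be illustrated with a nearest-prototype sketch. This is not the paper's exact method; `support_feats`, `support_labels`, and `query_feats` are hypothetical encoder outputs, and cosine similarity is one common choice of metric.

```python
import numpy as np

def prototype_classify(support_feats, support_labels, query_feats):
    """Assign each query to the class whose prototype is most similar.

    support_feats:  (N, D) embeddings of the few labeled shots
                    (assumed to come from a self-supervised encoder)
    support_labels: (N,)   integer class labels of the shots
    query_feats:    (Q, D) embeddings of the test images
    """
    classes = np.unique(support_labels)
    # One prototype per class: the mean embedding of its support shots.
    protos = np.stack(
        [support_feats[support_labels == c].mean(axis=0) for c in classes]
    )
    # L2-normalize so the dot product below is cosine similarity.
    protos = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    queries = query_feats / np.linalg.norm(query_feats, axis=1, keepdims=True)
    sims = queries @ protos.T  # (Q, C) cosine similarities
    return classes[sims.argmax(axis=1)]
```

With 5-way 1-shot input, this uses exactly five labels at test time and none during representation learning, which is the label budget the abstract refers to.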

Citation (APA)

Bharti, A., Vineeth, N. B., & Jawahar, C. V. (2022). Towards Label-Free Few-Shot Learning: How Far Can We Go? In Communications in Computer and Information Science (Vol. 1567 CCIS, pp. 256–268). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-11346-8_23
