Generally Boosting Few-Shot Learning with HandCrafted Features

3 Citations · 9 Readers

Abstract

Existing Few-Shot Learning (FSL) methods predominantly focus on developing sophisticated models to extract transferable prior knowledge for recognizing novel classes, while paying far less attention to the feature learning part of FSL, which often simply adopts a well-known CNN as the feature learner. However, features are the core medium for encoding such transferable knowledge. Feature learning is easily trapped in over-fitting, particularly when training data are scarce, which in turn degrades FSL performance. Handcrafted features, such as the Histogram of Oriented Gradients (HOG) and Local Binary Pattern (LBP), place no requirement on the amount of training data and used to perform quite well in many small-scale data scenarios, since their extraction involves no learning process and relies mainly on empirically observed and summarized prior feature-engineering knowledge. In this paper, we develop a simple and general approach for boosting FSL by exploiting such prior knowledge in the feature learning phase. To this end, we introduce two novel handcrafted feature regression modules, namely HOG regression and LBP regression, into the feature learning parts of deep learning-based FSL models. These two modules are plugged into different convolutional layers of the backbone according to the characteristics of the corresponding handcrafted features, guiding backbone optimization at different feature granularities and ensuring that the learned features encode handcrafted feature knowledge, which improves their generalization ability and alleviates model over-fitting. Three recent state-of-the-art FSL approaches are used to examine the effectiveness of our method. Extensive experiments on the miniImageNet, CIFAR-FS and FC100 datasets show that the performance of all these FSL approaches is consistently boosted by applying our method. Our code and models have been released.
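
To make the described architecture concrete, below is a minimal PyTorch-style sketch; it is not the authors' released code. The toy two-stage backbone, head shapes, target dimensions, and loss weights are illustrative assumptions. It shows how LBP and HOG regression heads attached to different convolutional stages could contribute auxiliary losses alongside the main FSL objective; the handcrafted targets could be precomputed offline, e.g. with skimage.feature.hog and skimage.feature.local_binary_pattern.

# Minimal sketch (assumptions noted above), not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HandcraftedRegressionHead(nn.Module):
    """Regresses a flattened handcrafted descriptor (e.g. HOG vector or
    LBP histogram) from an intermediate convolutional feature map."""
    def __init__(self, in_channels: int, target_dim: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Conv2d(in_channels, in_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_channels, target_dim),
        )

    def forward(self, feat_map: torch.Tensor) -> torch.Tensor:
        return self.proj(feat_map)

class ToyBackbone(nn.Module):
    """Stand-in for the FSL backbone, exposing two stages so that the LBP head
    (fine-grained, shallow) and HOG head (gradient structure, deeper) can tap
    different feature granularities, as the abstract suggests."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())

    def forward(self, x):
        f1 = self.stage1(x)   # shallow features -> LBP regression
        f2 = self.stage2(f1)  # deeper features  -> HOG regression
        return f1, f2

def auxiliary_handcrafted_loss(f1, f2, lbp_head, hog_head, lbp_target, hog_target,
                               lambda_lbp=0.1, lambda_hog=0.1):
    """MSE regression against precomputed handcrafted descriptors; the loss
    weights here are hypothetical."""
    loss_lbp = F.mse_loss(lbp_head(f1), lbp_target)
    loss_hog = F.mse_loss(hog_head(f2), hog_target)
    return lambda_lbp * loss_lbp + lambda_hog * loss_hog

if __name__ == "__main__":
    images = torch.randn(4, 3, 84, 84)    # miniImageNet-sized inputs
    hog_target = torch.randn(4, 1764)     # flattened HOG vector (dimension is illustrative)
    lbp_target = torch.randn(4, 59)       # uniform-LBP histogram (dimension is illustrative)

    backbone = ToyBackbone()
    lbp_head = HandcraftedRegressionHead(64, 59)
    hog_head = HandcraftedRegressionHead(128, 1764)

    f1, f2 = backbone(images)
    aux = auxiliary_handcrafted_loss(f1, f2, lbp_head, hog_head, lbp_target, hog_target)
    total_loss = aux  # in practice: total_loss = fsl_loss + aux
    total_loss.backward()
    print(float(total_loss))

In a full training loop, the auxiliary term would simply be added to the FSL classification loss, so the regression heads guide the backbone without changing the episodic training protocol.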

Cite (APA)

Zhang, Y., Huang, S., & Zhou, F. (2021). Generally Boosting Few-Shot Learning with HandCrafted Features. In MM 2021 - Proceedings of the 29th ACM International Conference on Multimedia (pp. 3143–3152). Association for Computing Machinery, Inc. https://doi.org/10.1145/3474085.3475459
