Few-Shot Transfer Learning for Text Classification with Lightweight Word Embedding Based Models

Abstract

Many deep learning architectures have been employed to model semantic compositionality in text sequences, but they require large amounts of labeled data to train their parameters, which makes them impractical when annotated samples are scarce or unavailable. In contrast to such data-hungry deep models, lightweight word-embedding-based models can represent text sequences in a plug-and-play way because they are parameter-free. In this paper, a modified hierarchical pooling strategy over pre-trained word embeddings is proposed for text classification in a few-shot transfer learning setting. The model leverages knowledge obtained from source domains and transfers it to recognize and classify unseen text sequences using only a handful of support examples in the target domain. Extensive experiments on five datasets covering both English and Chinese text demonstrate that simple word-embedding-based models (SWEMs) with parameter-free pooling operations can abstract and represent the semantics of text. The proposed modified hierarchical pooling method achieves significantly better classification performance on few-shot transfer learning tasks than alternative methods.
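As a rough illustration of the kind of parameter-free pooling the abstract describes, the sketch below average-pools pre-trained word vectors within a sliding local window, max-pools across windows to get a sequence representation, and then classifies a query by cosine similarity to class prototypes averaged from a few support examples. The window size, the prototype-based classifier, and all function names are assumptions made for illustration, not the authors' exact pipeline.

```python
import numpy as np

def hierarchical_pooling(word_vectors: np.ndarray, window: int = 3) -> np.ndarray:
    """Hierarchical pooling over a (seq_len, dim) array of word embeddings:
    average-pool each contiguous window of words, then take an element-wise
    max over all window averages."""
    seq_len, _ = word_vectors.shape
    if seq_len <= window:
        # Degenerate case: one window covers the whole sequence.
        return word_vectors.mean(axis=0)
    window_means = np.stack([
        word_vectors[i:i + window].mean(axis=0)
        for i in range(seq_len - window + 1)
    ])
    return window_means.max(axis=0)

def classify_few_shot(query_vec, support_vecs, support_labels):
    """Nearest-prototype few-shot classification: each class prototype is
    the mean of its support embeddings; the query gets the label of the
    prototype with the highest cosine similarity."""
    labels = sorted(set(support_labels))
    prototypes = np.stack([
        np.mean([v for v, y in zip(support_vecs, support_labels) if y == c], axis=0)
        for c in labels
    ])
    q = query_vec / (np.linalg.norm(query_vec) + 1e-8)
    p = prototypes / (np.linalg.norm(prototypes, axis=1, keepdims=True) + 1e-8)
    return labels[int(np.argmax(p @ q))]

# Hypothetical usage with random stand-ins for pre-trained embeddings.
rng = np.random.default_rng(0)
support = [rng.normal(size=50) for _ in range(4)]
support_labels = ["pos", "pos", "neg", "neg"]
doc = rng.normal(size=(12, 50))  # 12 words, 50-dimensional embeddings
print(classify_few_shot(hierarchical_pooling(doc, window=3), support, support_labels))
```

Note that with window = 1 this reduces to plain max pooling and with a window spanning the whole sequence it reduces to average pooling, so the window size trades off sensitivity to local word order against smoothing.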

Citation (APA)

Pan, C., Huang, J., Gong, J., & Yuan, X. (2019). Few-Shot Transfer Learning for Text Classification with Lightweight Word Embedding Based Models. IEEE Access, 7, 53296–53304. https://doi.org/10.1109/ACCESS.2019.2911850
