Regularisation for Efficient Softmax Parameter Generation in Low-Resource Text Classifiers

Abstract

Meta-learning has made tremendous progress in recent years and has been demonstrated to be particularly suitable in low-resource settings where training data is very limited. However, meta-learning models still require large numbers of training tasks to achieve good generalisation. Since labelled training data may be sparse, self-supervision-based approaches can further improve performance on downstream tasks. Although no labelled data is necessary for such training, a large corpus of unlabelled text needs to be available. In this paper, we improve on recent advances in meta-learning for natural language models that allow training on a diverse set of training tasks for few-shot, low-resource target tasks. We introduce a way to generate new training data without the need for additional supervised or unsupervised datasets. We evaluate the method on a diverse set of NLP tasks and show that the model decreases in performance when trained on this data without further adjustments. We therefore introduce and evaluate two methods for regularising the training process and show that they not only improve performance when used in conjunction with the new training data, but also improve average performance when training only on the original data, compared to the baseline.

Citation (APA)

Grießhaber, D., Maucher, J., & Vu, N. T. (2023). Regularisation for Efficient Softmax Parameter Generation in Low-Resource Text Classifiers. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2023-August, pp. 5058–5066). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2023/562
