Meta-training with Demonstration Retrieval for Efficient Few-shot Learning

Citations: 1 · Mendeley readers: 14

Abstract

Large language models show impressive results on few-shot NLP tasks. However, these models are memory- and computation-intensive. Meta-training allows one to leverage smaller models for few-shot generalization in a domain-general and task-agnostic manner (Min et al., 2022a; Wei et al., 2022; Chen et al., 2022); however, these methods alone result in models that may not have sufficient parameterization or knowledge to adapt quickly to a large variety of tasks. To overcome this issue, we propose meta-training with demonstration retrieval, where we use a dense passage retriever to retrieve labeled demonstrations that are semantically similar to each example, providing more varied supervision. By separating external knowledge from model parameters, we can use meta-training to train parameter-efficient models that generalize well on a larger variety of tasks. We construct a meta-training set from UNIFIEDQA and CROSSFIT, and propose a demonstration bank based on UNIFIEDQA tasks. To our knowledge, our work is the first to combine retrieval with meta-training, to use DPR models to retrieve demonstrations, and to leverage demonstrations from many tasks simultaneously, rather than randomly sampling demonstrations from the training set of the target task. Our approach outperforms a variety of targeted parameter-efficient and retrieval-augmented few-shot methods on QA, NLI, and text classification tasks (including SQuAD, QNLI, and TREC). Our approach can be meta-trained and fine-tuned quickly on a single GPU.
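
The retrieval step described above lends itself to a short illustration. The sketch below is a minimal, hypothetical rendering of DPR-based demonstration retrieval, not the authors' released code: the pretrained DPR checkpoints, the FAISS inner-product index, the toy two-item demonstration bank, and the `retrieve_demonstrations` helper are all illustrative assumptions. It embeds a bank of labeled demonstrations with the DPR context encoder, embeds each incoming example with the DPR question encoder, and returns the top-k most similar demonstrations, which would then be concatenated with the example as additional supervision.

```python
# Minimal sketch of DPR-based demonstration retrieval (assumptions noted above).
import faiss
import torch
from transformers import (
    DPRContextEncoder, DPRContextEncoderTokenizer,
    DPRQuestionEncoder, DPRQuestionEncoderTokenizer,
)

# Illustrative DPR checkpoints; the paper's own encoders may differ.
ctx_tok = DPRContextEncoderTokenizer.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
ctx_enc = DPRContextEncoder.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
q_tok = DPRQuestionEncoderTokenizer.from_pretrained("facebook/dpr-question_encoder-single-nq-base")
q_enc = DPRQuestionEncoder.from_pretrained("facebook/dpr-question_encoder-single-nq-base")

# Toy demonstration bank: (input, label) pairs drawn from many tasks at once,
# standing in for the UNIFIEDQA-based bank described in the abstract.
bank = [
    ("What is the capital of France?", "Paris"),
    ("Premise: A man sleeps. Hypothesis: A man is awake.", "contradiction"),
]

# Embed every demonstration once with the context encoder.
with torch.no_grad():
    enc = ctx_tok([inp for inp, _ in bank], padding=True, truncation=True, return_tensors="pt")
    bank_emb = ctx_enc(**enc).pooler_output  # shape: (num_demos, hidden_dim)

# Inner-product index, matching DPR's dot-product similarity.
index = faiss.IndexFlatIP(bank_emb.shape[1])
index.add(bank_emb.numpy())

def retrieve_demonstrations(example: str, k: int = 2):
    """Return the k demonstrations most similar to `example`."""
    with torch.no_grad():
        q = q_enc(**q_tok(example, return_tensors="pt")).pooler_output
    _, ids = index.search(q.numpy(), k)
    return [bank[i] for i in ids[0]]

# The retrieved (input, label) pairs would be concatenated with the example
# before it is fed to the meta-trained model.
demos = retrieve_demonstrations("What is the capital of Germany?")
```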

Cite

APA

Mueller, A., Narang, K., Mathias, L., Wang, Q., & Firooz, H. (2023). Meta-training with Demonstration Retrieval for Efficient Few-shot Learning. In Findings of the Association for Computational Linguistics: ACL 2023 (pp. 6049–6064). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-acl.376
