Generation-distillation for efficient natural language understanding in low-data settings


Abstract

Over the past year, the emergence of transfer learning with large-scale language models (LMs) has led to dramatic performance improvements across a broad range of natural language understanding tasks. However, the size and memory footprint of these large LMs make them difficult to deploy in many scenarios (e.g. on mobile phones). Recent research points to knowledge distillation as a potential solution, showing that when training data for a given task is abundant, it is possible to distill a large (teacher) LM into a small task-specific (student) network with minimal loss of performance. However, when such data is scarce, there remains a significant performance gap between large pretrained LMs and smaller task-specific models, even when training via distillation. In this paper, we bridge this gap with a novel training approach, called generation-distillation, that leverages large finetuned LMs in two ways: (1) to generate new (unlabeled) training examples, and (2) to distill their knowledge into a small network using these examples. Across three low-resource text classification datasets, we achieve comparable performance to BERT while using 300× fewer parameters, and we outperform prior approaches to distillation for text classification while using 3× fewer parameters.
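The two-step procedure described above can be illustrated with a short sketch. This is an assumption-laden illustration rather than the authors' code: it presumes Hugging Face-style model and tokenizer interfaces and the standard temperature-scaled soft-label loss of Hinton et al. (2015); the paper's exact generation strategy and distillation objective may differ, and all function names and hyperparameters here are hypothetical.

```python
# Sketch of generation-distillation: (1) a finetuned LM synthesizes new
# unlabeled training texts; (2) the teacher's soft predictions on those
# texts are distilled into a small student. Interfaces and settings are
# illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn.functional as F

def generate_examples(generator_lm, tokenizer, prompts, n_per_prompt=4, max_len=64):
    """Step 1: use a finetuned LM to sample new (unlabeled) training texts."""
    texts = []
    for prompt in prompts:
        inputs = tokenizer(prompt, return_tensors="pt")
        outputs = generator_lm.generate(
            **inputs,
            do_sample=True,
            top_k=50,
            max_length=max_len,
            num_return_sequences=n_per_prompt,
        )
        texts.extend(tokenizer.decode(o, skip_special_tokens=True) for o in outputs)
    return texts

def distillation_step(student, teacher, batch_inputs, optimizer, temperature=2.0):
    """Step 2: match the student's distribution to the teacher's soft labels."""
    with torch.no_grad():
        teacher_logits = teacher(**batch_inputs).logits  # teacher is frozen
    student_logits = student(**batch_inputs).logits
    # Temperature-scaled KL divergence, scaled by T^2 as in Hinton et al. (2015).
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The point of step 1 in the low-data setting is that the teacher's soft predictions on synthetic text stand in for gold labels, so the student sees far more training signal than the scarce labeled set alone provides.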

Cite

APA

Melas-Kyriazi, L., Han, G., & Liang, C. (2019). Generation-distillation for efficient natural language understanding in low-data settings. In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo @ EMNLP-IJCNLP 2019) (pp. 124–131). Association for Computational Linguistics. https://doi.org/10.18653/v1/d19-6114
