A Multitask Active Learning Framework for Natural Language Understanding

3 Citations · 60 Mendeley Readers

Abstract

Natural language understanding (NLU) aims to identify user intent and extract semantic slots. This requires sufficient annotated data to achieve strong performance in real-world settings. Active learning (AL) has been well studied as a way to reduce the amount of annotated data required, and has been successfully applied to NLU. However, no prior work has investigated how the relational information between intents and slots can improve the efficiency of AL algorithms. In this paper, we propose a multitask AL framework for NLU. Our framework enables pool-based AL algorithms to exploit the relational information between sub-tasks provided by a joint model, and we propose an efficient computation for the entropy of a joint model. Experimental results show that our framework achieves competitive performance with less training data than baseline methods on all datasets. We also demonstrate that, when entropy is used as the query strategy, the model with complete relational information outperforms those with only partial information. Additionally, we show that the active learning algorithms in our framework remain effective when combined with Bidirectional Encoder Representations from Transformers (BERT).
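
The abstract does not spell out the paper's efficient joint-entropy computation, but the following minimal sketch illustrates the general idea of pool-based uncertainty sampling for a joint intent–slot model. It assumes a hypothetical `model.predict_proba(x)` that returns an intent distribution and per-token slot distributions; the names `select_batch`, `intent_entropy`, and `slot_entropy` are illustrative, and the token-wise entropy sum is a common tractable surrogate rather than the authors' exact method.

```python
import numpy as np

def intent_entropy(p_intent):
    """Shannon entropy of the intent distribution (shape: [num_intents])."""
    p = np.clip(p_intent, 1e-12, 1.0)
    return -np.sum(p * np.log(p))

def slot_entropy(p_slots):
    """Sum of per-token entropies of the slot distributions
    (shape: [seq_len, num_slots]); a tractable stand-in for the
    entropy of the full slot-label sequence distribution."""
    p = np.clip(p_slots, 1e-12, 1.0)
    return -np.sum(p * np.log(p))

def select_batch(pool, model, k):
    """Pool-based AL step: score each unlabeled utterance by the
    joint model's uncertainty over both sub-tasks, then return the
    k highest-scoring utterances for annotation."""
    scores = []
    for x in pool:
        p_intent, p_slots = model.predict_proba(x)  # hypothetical joint-model API
        scores.append(intent_entropy(p_intent) + slot_entropy(p_slots))
    top = np.argsort(scores)[-k:][::-1]
    return [pool[i] for i in top]
```

In this sketch the two sub-task entropies are simply summed; a model exposing the full joint distribution over intents and slot sequences, as the paper advocates, would let the query strategy score examples by a single joint entropy instead.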

Cite

APA

Zhu, H., Ye, W., Luo, S., & Zhang, X. (2020). A Multitask Active Learning Framework for Natural Language Understanding. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 4900–4914). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.430
