Attention based joint model with negative sampling for new slot values recognition

Abstract

Natural Language Understanding (NLU) is an important component of a task-oriented dialogue system, responsible for obtaining slot values from user utterances. In many real-world dialogue applications, such as restaurant booking, the NLU module is required both to return standard slot values and to recognize new slot values at the same time. Neither previous sequence labeling models nor classifiers can satisfy both requirements on their own. To address this problem, the paper proposes an attention-based joint model with negative sampling. It combines a sequence tagger with a classifier through an attention mechanism: the tagger identifies slot values in raw text, while the classifier simultaneously maps them to standard slot values or to a symbol denoting new values. Negative sampling is used to construct negative samples from existing values for training the model. Experimental results on two datasets show that the model outperforms previous methods: the negative samples contribute to new slot value identification, and the attention mechanism discovers important information and boosts performance.
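The abstract does not specify how the negative samples are constructed; a common approach, sketched below under that assumption, is to swap a known slot value in an utterance for a different value and label the result with the new-value symbol, so the classifier learns to reject values that do not fit their context. The function name, the `<NEW>` symbol, and the swap strategy are illustrative, not taken from the paper.

```python
import random

def make_negative_samples(utterance, slot_value, value_vocab, n=2, seed=0):
    """Build negative training samples for new-slot-value recognition.

    Replaces `slot_value` in `utterance` with other values from the
    existing-value vocabulary and labels each result with the symbol
    '<NEW>' (a hypothetical marker for unseen values).
    """
    rng = random.Random(seed)
    candidates = [v for v in value_vocab if v != slot_value]
    negatives = []
    for v in rng.sample(candidates, min(n, len(candidates))):
        negatives.append((utterance.replace(slot_value, v), "<NEW>"))
    return negatives
```

For example, swapping the restaurant name in "book a table at sushi bar" yields utterances whose slot value no longer matches the original standard value, giving the classifier supervised examples of the new-value class.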

Citation (APA)

Hou, M., Wang, X., Yuan, C., Yang, G., Hu, S., & Shi, Y. (2019). Attention based joint model with negative sampling for new slot values recognition. In Lecture Notes in Electrical Engineering (Vol. 579, pp. 3–15). Springer. https://doi.org/10.1007/978-981-13-9443-0_1
