Natural language understanding (NLU) technologies for human-computer conversation are becoming a hot topic in the Internet of Things (IoT). Intent detection and slot filling are two fundamental NLU subtasks, which current approaches handle either as a pipeline or through joint training. Whether the two subtasks are treated separately or trained jointly with neural networks, most methods fail to build a complete correlation between the intent and the slots. Studies indicate that intent and slots are strongly related: slots depend heavily on the intent and, in turn, give clues to it. Recent joint models therefore connect the two subtasks by sharing an intermediate network representation, but we argue that precise label information from one task is more helpful for improving the performance of the other. Achieving complete information interaction between intent and slots is difficult because the features extracted by existing methods do not carry sufficient label information. We therefore propose a novel bidirectional information transfer model that creates sufficient interaction between intent detection and slot filling through type-aware information enhancement. The framework collects explicit label information from the top layer of the network and learns discriminative features from the labels. Experimental results show that our model significantly outperforms previous models and achieves state-of-the-art performance on two benchmark datasets, ATIS and SNIPS.
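To make the idea of bidirectional label-information transfer concrete, the following is a minimal sketch in PyTorch. It is not the authors' exact architecture: the class name, layer sizes, the BiLSTM encoder, and the two-pass decoding scheme (preliminary label distributions for each task are embedded and fed into the other task's final classifier) are all illustrative assumptions used only to show how explicit label information from one subtask can condition the other.

```python
# Illustrative sketch only; not the paper's exact model.
import torch
import torch.nn as nn


class BiTransferNLU(nn.Module):
    def __init__(self, vocab_size, num_intents, num_slots, emb_dim=128, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Pass 1: preliminary predictions from the shared encoding.
        self.intent_init = nn.Linear(2 * hidden, num_intents)
        self.slot_init = nn.Linear(2 * hidden, num_slots)
        # Label embeddings carry explicit type information across tasks.
        self.intent_label_emb = nn.Linear(num_intents, hidden)
        self.slot_label_emb = nn.Linear(num_slots, hidden)
        # Pass 2: each task's final head is conditioned on the other task's labels.
        self.intent_final = nn.Linear(2 * hidden + hidden, num_intents)
        self.slot_final = nn.Linear(2 * hidden + hidden, num_slots)

    def forward(self, tokens):
        h, _ = self.encoder(self.embed(tokens))        # (B, T, 2H) token states
        utter = h.mean(dim=1)                          # (B, 2H) utterance summary

        # Pass 1: preliminary label distributions for both subtasks.
        intent_p = torch.softmax(self.intent_init(utter), dim=-1)   # (B, I)
        slot_p = torch.softmax(self.slot_init(h), dim=-1)           # (B, T, S)

        # Exchange explicit label information between the two subtasks.
        intent_info = self.intent_label_emb(intent_p)                # (B, H)
        slot_info = self.slot_label_emb(slot_p.mean(dim=1))          # (B, H)

        # Pass 2: slot tagging sees the intent labels, intent detection sees the slot labels.
        slot_logits = self.slot_final(
            torch.cat([h, intent_info.unsqueeze(1).expand(-1, h.size(1), -1)], dim=-1))
        intent_logits = self.intent_final(torch.cat([utter, slot_info], dim=-1))
        return intent_logits, slot_logits


# Example usage with dummy data (shapes only).
model = BiTransferNLU(vocab_size=5000, num_intents=21, num_slots=120)
intent_logits, slot_logits = model(torch.randint(0, 5000, (4, 16)))
```

In this sketch, the key design choice is that the cross-task signal is a distribution over labels rather than a hidden state, which mirrors the paper's argument that precise label information is more useful than a shared intermediate representation.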
CITATION STYLE
Sun, R., Rao, L., & Zhou, X. (2022). A Joint Model of Natural Language Understanding for Human-Computer Conversation in IoT. Wireless Communications and Mobile Computing, 2022. https://doi.org/10.1155/2022/2074035