Get the point of my utterance! Learning towards effective responses with multi-head attention mechanism


Abstract

The attention mechanism has become a popular and widely used component in sequence-to-sequence models. However, previous neural generative dialogue systems tend to generate universal responses, because the attention distribution learned by the model attends to the same semantic aspect of the input. To address this problem, we propose a novel Multi-Head Attention Mechanism (MHAM) for generative dialogue systems, which aims at capturing multiple semantic aspects of the user utterance. Further, a regularizer is formulated to force different attention heads to concentrate on distinct aspects. The proposed mechanism leads to more informative, diverse, and relevant response generation. Experimental results show that our proposed model outperforms several strong baselines.
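The abstract describes two ideas: per-head attention distributions over the encoded utterance, and a regularizer that penalizes overlap between heads so each covers a different semantic aspect. A minimal NumPy sketch of that combination follows; the projection matrices `W_q`/`W_k` and the Frobenius-norm disagreement penalty (in the style of self-attentive sentence embeddings) are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(H, q, W_q, W_k):
    """Per-head attention over encoder states.

    H:   encoder hidden states, shape (T, d)
    q:   decoder query vector, shape (d,)
    W_q: per-head query projections, shape (num_heads, d_k, d)
    W_k: per-head key projections,   shape (num_heads, d_k, d)
    Returns: attention distributions A, shape (num_heads, T).
    """
    dists = []
    for h in range(len(W_q)):
        K = H @ W_k[h].T                 # (T, d_k) projected keys
        qh = W_q[h] @ q                  # (d_k,)   projected query
        dists.append(softmax(K @ qh / np.sqrt(qh.shape[0])))
    return np.stack(dists)               # each row sums to 1

def disagreement_penalty(A):
    """Regularizer pushing heads toward distinct aspects:
    penalize ||A A^T - I||_F so head distributions overlap little
    (off-diagonal terms) and stay peaked (diagonal terms)."""
    G = A @ A.T                          # (num_heads, num_heads) overlap
    return np.linalg.norm(G - np.eye(A.shape[0]))
```

During training, this penalty would be added to the generation loss with a small weight, so the heads are jointly encouraged to attend to different parts of the utterance.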

Citation (APA)

Tao, C., Gao, S., Shang, M., Wu, W., Zhao, D., & Yan, R. (2018). Get the point of my utterance! Learning towards effective responses with multi-head attention mechanism. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 4418–4424). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/614
