Hierarchical recurrent attention network for response generation


Abstract

We study multi-turn response generation in chatbots, where a response is generated according to a conversation context. Existing work models the hierarchy of the context but pays insufficient attention to the fact that words and utterances in the context are differentially important. As a result, such models may lose important information in the context and generate irrelevant responses. We propose a hierarchical recurrent attention network (HRAN) to model both the hierarchy and the variance in importance within a unified framework. In HRAN, a hierarchical attention mechanism attends to important parts within and among utterances via word-level attention and utterance-level attention, respectively. Empirical studies with both automatic evaluation and human judgment show that HRAN significantly outperforms state-of-the-art models for context-based response generation.
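The two-level attention idea can be illustrated with a minimal NumPy sketch. This is not the paper's model (HRAN uses learned attention over recurrent hidden states); it is a simplified stand-in using dot-product attention, and all function names here are hypothetical:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(vectors, query):
    # Score each row against the query, then return the softmax-weighted sum.
    scores = vectors @ query                 # (n,)
    weights = softmax(scores)                # (n,)
    return weights @ vectors                 # (d,)

def hierarchical_attention(utterances, query):
    """utterances: list of (num_words, d) arrays of word vectors.
    Word-level attention compresses each utterance into one vector;
    utterance-level attention then compresses those into a context vector."""
    utt_vecs = np.stack([attend(u, query) for u in utterances])  # (num_utt, d)
    return attend(utt_vecs, query)           # (d,)

rng = np.random.default_rng(0)
context = [rng.normal(size=(5, 8)), rng.normal(size=(3, 8))]  # two utterances
query = rng.normal(size=8)                   # e.g. decoder state
ctx_vec = hierarchical_attention(context, query)
print(ctx_vec.shape)  # (8,)
```

In the full model, the query would come from the decoder state at each generation step, so the attention weights over words and utterances change as the response is generated.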

Citation (APA)

Xing, C., Wu, Y., Wu, W., Huang, Y., & Zhou, M. (2018). Hierarchical recurrent attention network for response generation. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 5610–5617). AAAI press. https://doi.org/10.1609/aaai.v32i1.11965
