Natural Answer Generation with Attention over Instances


Abstract

Natural answer generation (NAG) is increasingly popular in real-world knowledge base question answering (KBQA) systems because it can automatically generate natural-language answers from a structured knowledge base (KB). Large-scale community QA pairs crawled from the Internet can be used directly to train NAG models. However, in these datasets one question often has multiple answers of varied quality, and NAG models suffer when these answers are simply treated as equal. To address this problem, we propose two kinds of attention-based algorithms that handle all answers to a question at once. Selective attention and self-attention mechanisms dynamically weight the answers to one question during training. Specifically, the selective attention methods weight the answers using the relationships between the generated answers and all the KB objects the question requires, while the self-attention methods weight them according to their generating difficulty. Experiments on a public open-domain community QA dataset demonstrate that Selective-ATT outperforms the state of the art by 10.53% in entity accuracy, 9.34% in BLEU score, and 1.19% in ROUGE score.
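The core idea can be pictured as an instance-weighted training loss: rather than averaging the per-answer losses of a question uniformly, each answer's loss is scaled by an attention weight. The sketch below is a minimal illustration in PyTorch, not the authors' implementation; the use of per-answer negative log-likelihood as the "generating difficulty" signal (loosely mirroring the self-attention variant), the softmax temperature, and all tensor shapes are assumptions.

```python
import torch
import torch.nn.functional as F

def instance_weighted_loss(per_answer_nll: torch.Tensor) -> torch.Tensor:
    """Weight the answers to one question by attention over instances.

    per_answer_nll: shape (num_answers,), the negative log-likelihood the
    decoder assigns to each reference answer of the same question. In this
    sketch, answers that are easier to generate (lower NLL) receive higher
    weight; the sign convention and the lack of a temperature term are
    simplifying assumptions, not the paper's exact formulation.
    """
    # Softmax over negated difficulty: easier answers get larger weights.
    # detach() keeps the weighting from feeding back into the gradient.
    weights = F.softmax(-per_answer_nll.detach(), dim=0)
    # Attention-weighted sum replaces the uniform mean over answers.
    return (weights * per_answer_nll).sum()

# Usage: suppose a question has three reference answers with these NLLs.
nll = torch.tensor([2.3, 0.9, 4.1], requires_grad=True)
loss = instance_weighted_loss(nll)
loss.backward()  # gradients emphasize the easier-to-generate answers
```

A selective-attention variant would replace the NLL-based score with a relevance score between each candidate answer and the KB objects the question requires, but the weighted-sum structure of the loss stays the same.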

Citation (APA)

Wei, M., & Zhang, Y. (2019). Natural Answer Generation with Attention over Instances. IEEE Access, 7, 61008–61017. https://doi.org/10.1109/ACCESS.2019.2904337
