Generating natural answers on knowledge bases and text by sequence-to-sequence learning

Abstract

Generative question answering systems aim to produce more contentful responses and more natural answers. Existing systems for knowledge-grounded conversation generate natural answers from either a knowledge base (KB) or raw text; however, their performance often suffers from the incompleteness of the KB or text facts. In this paper, we propose an end-to-end generative question answering model. We combine unstructured text and structured KBs into a universal schema that serves as a large external fact library. Each word of a natural answer is dynamically either predicted from the common vocabulary or retrieved from the corresponding external facts, and the model can generate natural answers containing an arbitrary number of knowledge entities by selecting among multiple relevant external facts with a dynamic knowledge enquirer. An empirical study shows that our model is efficient and significantly outperforms baseline methods in both automatic and human evaluation.
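The per-word decision described above, generating from the common vocabulary versus retrieving from external facts, can be illustrated as a gated mixture of two output distributions. The following Python sketch is a hypothetical simplification for illustration only, not the paper's implementation; the names decode_step, W_vocab, W_facts, and w_gate are our own, and the real dynamic knowledge enquirer scores multiple retrieved facts rather than a fixed entity list.

    import numpy as np

    rng = np.random.default_rng(0)

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def decode_step(state, W_vocab, W_facts, w_gate):
        """One decoding step of a gated generate-vs-retrieve mixture.

        Hypothetical simplification of the dynamic knowledge enquirer:
        a scalar gate splits probability mass between the common
        vocabulary and entities retrieved from the external fact
        library (the universal schema built from KB + text).
        """
        p_vocab = softmax(W_vocab @ state)           # distribution over common words
        p_facts = softmax(W_facts @ state)           # distribution over fact entities
        g = 1.0 / (1.0 + np.exp(-(w_gate @ state)))  # gate: generate vs. retrieve
        return np.concatenate([g * p_vocab, (1.0 - g) * p_facts])

    # Toy dimensions: 8-dim decoder state, 10 common words, 4 fact entities.
    state = rng.standard_normal(8)
    p = decode_step(state,
                    rng.standard_normal((10, 8)),
                    rng.standard_normal((4, 8)),
                    rng.standard_normal(8))
    print(p.sum())  # ~1.0: one valid distribution over words plus entities

Concatenating the two gated distributions yields a single distribution over vocabulary words and fact entities, which is what lets a decoder of this kind mention KB entities that are absent from the common vocabulary.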

Cite

APA

Ye, Z., Cai, R., Liao, Z., Hao, Z., & Li, J. (2018). Generating natural answers on knowledge bases and text by sequence-to-sequence learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11139 LNCS, pp. 447–455). Springer Verlag. https://doi.org/10.1007/978-3-030-01418-6_44
