Semantic parsing-based RDF question answering (QA) systems interpret users' natural language questions as query graphs and return answers over an RDF repository. However, because linking natural language phrases to specific RDF items (e.g., entities and predicates) is difficult, such systems often fail to understand users' questions precisely; as a result, they may not meet users' expectations, returning wrong answers and missing correct ones. In this paper, we design an Interactive Mechanism aiming for PROmotion Via users' feedback to QA systems (IMPROVE-QA), a complete framework that not only enables existing QA systems to return more precise answers based on a small amount of user feedback over the original answers, but also enriches their paraphrasing dictionaries, giving RDF QA systems a continuous-learning capability. To provide better interactivity and online performance, we design a holistic graph mining algorithm (HWspan) to automatically refine the query graph. Extensive experiments on both Freebase and DBpedia confirm the effectiveness and superiority of our approach.
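To make the query-graph formulation concrete, here is a minimal, hand-written sketch (not the paper's IMPROVE-QA or HWspan implementation): the question "What is the capital of France?" is mapped to a one-edge query graph, expressed as SPARQL, and executed against the public DBpedia endpoint via the SPARQLWrapper Python library. The entity and predicate links (dbr:France, dbo:capital) are chosen by hand here; automatically choosing such links is precisely where QA systems err and where the paper's feedback mechanism refines the query graph.

```python
# Illustrative sketch only: a hand-built query graph for
# "What is the capital of France?", expressed as SPARQL and run
# against the public DBpedia endpoint. The entity link (dbr:France)
# and predicate link (dbo:capital) are manual assumptions; getting
# these links right automatically is the hard step discussed above.
from SPARQLWrapper import SPARQLWrapper, JSON

QUERY = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>
SELECT ?capital WHERE {
  dbr:France dbo:capital ?capital .
}
"""

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Prints the bound answer URI, e.g. http://dbpedia.org/resource/Paris
    print(binding["capital"]["value"])
```

If the system had linked the phrase "capital" to the wrong predicate (say, a population or leader property), the returned answers would be wrong or empty; user feedback on those answers is what IMPROVE-QA uses to correct the query graph.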
Zhang, X., Zou, L., & Hu, S. (2019). An interactive mechanism to improve question answering systems via feedback. In International Conference on Information and Knowledge Management, Proceedings (pp. 1381–1390). Association for Computing Machinery. https://doi.org/10.1145/3357384.3358059