A Three-step Method for Multi-Hop Inference Explanation Regeneration

Citations: 1
Readers (Mendeley): 45

Abstract

Multi-hop inference for explanation regeneration combines two or more facts to support an inference. The task focuses on generating explanations for elementary science questions, where the relevance between candidate explanations and the QA pairs is of vital importance. To address the task, a three-step framework is proposed. First, vector distance between two texts is used to recall the top-K relevant explanations for each question, reducing computational cost. Second, a selection module chooses the most relevant facts in an autoregressive manner, giving a preliminary ordering of the retrieved facts. Third, a re-ranking module re-orders the retrieved candidate explanations according to the relevance between each fact and the QA pair. Experimental results demonstrate the effectiveness of the proposed framework, with an improvement of 39.78% in NDCG over the official baseline.
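The three-step pipeline described above (recall, autoregressive selection, re-ranking) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes dense vector representations are already available, uses plain cosine similarity in place of the paper's trained modules, and the function names are hypothetical.

```python
import numpy as np

def recall_top_k(question_vec, fact_vecs, k=3):
    """Step 1: recall the K facts closest to the question by cosine similarity."""
    q = question_vec / np.linalg.norm(question_vec)
    f = fact_vecs / np.linalg.norm(fact_vecs, axis=1, keepdims=True)
    sims = f @ q
    top = np.argsort(-sims)[:k]
    return top, sims[top]

def select_autoregressive(question_vec, fact_vecs, candidates, steps=2):
    """Step 2 (toy stand-in for the trained selection module): greedily pick
    facts one at a time, each pick conditioning on the question vector plus
    the facts already chosen -- the autoregressive aspect."""
    context = question_vec.astype(float).copy()
    chosen, pool = [], list(candidates)
    for _ in range(min(steps, len(pool))):
        sims = [float(fact_vecs[i] @ context)
                / (np.linalg.norm(fact_vecs[i]) * np.linalg.norm(context))
                for i in pool]
        best = pool[int(np.argmax(sims))]
        chosen.append(best)
        pool.remove(best)
        context = context + fact_vecs[best]  # condition the next pick on picks so far
    return chosen

def rerank(question_vec, fact_vecs, chosen):
    """Step 3 (toy stand-in for the re-ranking module): re-order the selected
    facts by direct question-fact relevance."""
    def score(i):
        return float(fact_vecs[i] @ question_vec) / (
            np.linalg.norm(fact_vecs[i]) * np.linalg.norm(question_vec))
    return sorted(chosen, key=score, reverse=True)
```

In the paper, steps 2 and 3 are learned models scoring fact/QA-pair relevance; the cosine heuristics here merely show how the three stages compose, with step 1 shrinking the candidate pool before the more expensive modules run.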


Citation (APA)

Xiang, Y., Zhang, Y., Shi, X., Bo, L., Xu, W., & Xi, C. (2021). A Three-step Method for Multi-Hop Inference Explanation Regeneration. In TextGraphs 2021 - Graph-Based Methods for Natural Language Processing, Proceedings of the 15th Workshop - in conjunction with the 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics, NAACL 2021 (pp. 171–175). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.textgraphs-1.19

Readers' Seniority

PhD / Postgrad / Masters / Doc: 8 (57%)
Researcher: 3 (21%)
Lecturer / Post doc: 2 (14%)
Professor / Associate Prof.: 1 (7%)

Readers' Discipline

Computer Science: 13 (68%)
Linguistics: 4 (21%)
Neuroscience: 1 (5%)
Social Sciences: 1 (5%)
