Commonsense Question Answering is an important natural language processing (NLP) task that aims to predict the correct answer to a question through commonsense reasoning. Previous studies either rely on models pre-trained on large-scale corpora, such as BERT, or perform reasoning over knowledge graphs. However, these methods do not explicitly model the relations that connect entities, which are informative and can be used to enhance reasoning. To address this issue, we propose a relation-aware reasoning method. Our method uses a relation-aware graph neural network to capture rich contextual information from both entities and relations. In contrast to methods that use fixed relation embeddings from pre-trained models, our model dynamically updates relation representations with contextual information from a multi-source subgraph built from multiple external knowledge sources. The enhanced relation representations are then fed to a bidirectional reasoning module, which applies bidirectional attention between the question sequence and the paths connecting entities, making the reasoning process transparent and interpretable. Experimental results on the CommonsenseQA dataset show that our method significantly outperforms the baselines while also providing clear reasoning paths.
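As a rough illustration of the relation-aware update the abstract describes, the sketch below shows one graph message-passing layer in which relation states are updated dynamically from their endpoint entities instead of being kept as fixed pre-trained embeddings. This is a minimal sketch under stated assumptions, not the authors' implementation: the class name RelationAwareLayer, the GRU-based update rules, and all dimensions are illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationAwareLayer(nn.Module):
    """One message-passing step that updates BOTH entity and relation states.

    Hypothetical sketch of the idea in the abstract; the exact update
    rules used in the paper may differ.
    """

    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(3 * dim, dim)           # message from (head, relation, tail)
        self.ent_update = nn.GRUCell(dim, dim)       # entity state update
        self.rel_update = nn.GRUCell(2 * dim, dim)   # relation update from its endpoints

    def forward(self, ent, rel, edges):
        # ent:   (num_entities, dim) entity states
        # rel:   (num_edges, dim)    per-edge relation states
        # edges: (num_edges, 2) long tensor of (head, tail) entity indices
        h, t = edges[:, 0], edges[:, 1]

        # Messages conditioned on the relation, aggregated at the tail entity.
        m = F.relu(self.msg(torch.cat([ent[h], rel, ent[t]], dim=-1)))
        agg = torch.zeros_like(ent).index_add_(0, t, m)

        ent = self.ent_update(agg, ent)
        # Relations are refreshed from their (updated) endpoint entities,
        # rather than held as fixed pre-trained embeddings.
        rel = self.rel_update(torch.cat([ent[h], ent[t]], dim=-1), rel)
        return ent, rel
```

Stacking a few such layers over the multi-source subgraph would give context-enriched relation states; in the paper's pipeline these enhanced representations then feed the bidirectional reasoning module, which attends between the question sequence and entity-connecting paths.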
Citation
Wang, J., Li, X., Tan, Z., Zhao, X., & Xiao, W. (2021). Relation-aware Bidirectional Path Reasoning for Commonsense Question Answering. In Proceedings of the 25th Conference on Computational Natural Language Learning (CoNLL 2021) (pp. 445–453). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.conll-1.35