Knowledge-Enhanced Natural Language Inference Based on Knowledge Graphs

12 citations · 82 Mendeley readers

Abstract

Natural Language Inference (NLI) is a vital task in natural language processing that aims to identify the logical relationship between two sentences. Most existing approaches perform this inference using only semantic knowledge learned from the training corpus; background knowledge is rarely used, or its use is limited to a few specific types. In this paper, we propose a novel Knowledge Graph-enhanced NLI (KGNLI) model that leverages background knowledge stored in knowledge graphs for NLI. The KGNLI model consists of three components: a semantic-relation representation module, a knowledge-relation representation module, and a label prediction module. Unlike previous methods, the proposed KGNLI model can flexibly combine various kinds of background knowledge. Experiments on four benchmarks, SNLI, MultiNLI, SciTail, and BNLI, validate the effectiveness of our model.
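To give a concrete picture of the three-component design described above, the sketch below is a minimal, hypothetical PyTorch rendering, not the authors' implementation. It assumes pre-computed word embeddings for the premise and hypothesis, a fixed-size vector of KG-derived pair features (e.g., synonymy or hypernymy indicators), and illustrative layer choices (a BiLSTM encoder, linear projections, and an MLP classifier). All class, parameter, and dimension names are assumptions made for illustration.

```python
# Hypothetical sketch of a KGNLI-style architecture (not the paper's code).
import torch
import torch.nn as nn


class KGNLISketch(nn.Module):
    """Fuses a semantic-relation representation of the sentence pair with a
    knowledge-relation representation from KG features, then predicts the
    entailment / contradiction / neutral label."""

    def __init__(self, embed_dim=300, kg_feat_dim=64, hidden_dim=128, num_labels=3):
        super().__init__()
        # Semantic-relation representation: encode both sentences and compare.
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.semantic_proj = nn.Linear(4 * 2 * hidden_dim, hidden_dim)
        # Knowledge-relation representation: project KG-derived pair features
        # (e.g., indicators of lexical relations between aligned words).
        self.knowledge_proj = nn.Linear(kg_feat_dim, hidden_dim)
        # Label prediction: classify the fused representation.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_labels),
        )

    def encode(self, x):
        out, _ = self.encoder(x)       # (batch, seq_len, 2 * hidden_dim)
        return out.max(dim=1).values   # max-pool over time steps

    def forward(self, premise_emb, hypothesis_emb, kg_features):
        p, h = self.encode(premise_emb), self.encode(hypothesis_emb)
        # Standard matching features: concatenation, difference, product.
        semantic = self.semantic_proj(torch.cat([p, h, torch.abs(p - h), p * h], dim=-1))
        knowledge = self.knowledge_proj(kg_features)
        return self.classifier(torch.cat([semantic, knowledge], dim=-1))


# Toy usage with random tensors standing in for embeddings and KG features.
model = KGNLISketch()
logits = model(torch.randn(2, 12, 300), torch.randn(2, 10, 300), torch.randn(2, 64))
print(logits.shape)  # torch.Size([2, 3])
```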

Cite (APA)

Wang, Z., Li, L., & Zeng, D. (2020). Knowledge-Enhanced Natural Language Inference Based on Knowledge Graphs. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 6498–6508). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.571
