Knowledge base completion by inference from both relational and literal facts

Abstract

Knowledge base (KB) completion predicts new facts in a KB by performing inference over the existing facts, which is important for expanding KBs. Most previous KB completion approaches infer new facts only from the relational facts (facts containing object properties) in a KB. However, most KBs also contain a large number of literal facts (facts containing datatype properties) besides the relational ones, and these literal facts are ignored by previous approaches. This paper studies how to take literal facts into account during inference, aiming to further improve the performance of KB completion. We propose a new approach that consumes both relational and literal facts to predict new facts. Our approach extracts literal features from the literal facts and combines them with path-based features extracted from the relational facts; a predictive model is then trained on all the features to infer new facts. Experiments on the YAGO KB show that our approach outperforms compared approaches that take only relational facts as input.
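To make the idea concrete, here is a minimal, hypothetical sketch of the feature construction the abstract describes: boolean path-based features drawn from relational facts are concatenated with numeric features derived from literal facts, yielding one feature vector per candidate entity pair. The toy KB, the feature names, and the specific literal feature (absolute difference of a shared datatype property) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: combine path-based features (from relational facts)
# with literal features (from datatype-property facts) for KB completion.
# All data and feature definitions below are illustrative assumptions.
from itertools import product

# Toy KB: relational facts (object properties) ...
relational = {
    ("alice", "worksAt", "acme"),
    ("bob", "worksAt", "acme"),
    ("alice", "knows", "bob"),
}
# ... and literal facts (datatype properties)
literal = {
    ("alice", "birthYear", 1980),
    ("bob", "birthYear", 1982),
}

def path_features(h, t):
    """Boolean features: does a relation path of length 1 or 2 link h to t?"""
    out = {}
    rels = sorted({r for (_, r, _) in relational})
    nodes = {s for (s, _, _) in relational} | {o for (_, _, o) in relational}
    for r in rels:                      # length-1 paths
        out[f"path:{r}"] = float((h, r, t) in relational)
    for r1, r2 in product(rels, repeat=2):  # length-2 paths via any node
        hit = any((h, r1, m) in relational and (m, r2, t) in relational
                  for m in nodes)
        out[f"path:{r1}/{r2}"] = float(hit)
    return out

def literal_features(h, t):
    """Numeric features from datatype properties shared by both entities."""
    vals = {(e, p): v for (e, p, v) in literal}
    out = {}
    for p in sorted({p for (_, p, _) in literal}):
        if (h, p) in vals and (t, p) in vals:
            out[f"lit:absdiff:{p}"] = abs(vals[(h, p)] - vals[(t, p)])
    return out

def features(h, t):
    """Concatenate both feature groups into one vector for a classifier."""
    f = path_features(h, t)
    f.update(literal_features(h, t))
    return f
```

A downstream predictive model (e.g., logistic regression) would then be trained on such vectors to score candidate facts; pairs with close literal values, such as similar birth years, can receive evidence that purely relational paths miss.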

APA

Wang, Z., & Huang, Y. (2019). Knowledge base completion by inference from both relational and literal facts. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11441 LNAI, pp. 501–513). Springer Verlag. https://doi.org/10.1007/978-3-030-16142-2_39
