Improving neural knowledge base completion with cross-lingual projections


Abstract

In this paper we present a cross-lingual extension of a neural tensor network model for knowledge base completion. We exploit multilingual synsets from BabelNet to translate English triples to other languages and then augment the reference knowledge base with cross-lingual triples. We project monolingual embeddings of different languages to a shared multilingual space and use them for network initialization (i.e., as initial concept embeddings). We then train the network with triples from the cross-lingually augmented knowledge base. Results on WordNet link prediction show that leveraging cross-lingual information yields significant gains over exploiting only monolingual triples.
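The abstract mentions projecting monolingual embeddings of different languages into a shared multilingual space before using them to initialize the network. One common way to obtain such a projection (a sketch, not necessarily the authors' exact method) is to learn a least-squares linear map from a seed dictionary of translation pairs; all names and the toy data below are illustrative:

```python
import numpy as np

def learn_projection(X_src, X_tgt):
    """Learn a linear map W with X_src @ W ~= X_tgt, where row i of
    X_src and X_tgt holds the embeddings of a translation pair from
    a seed bilingual dictionary (hypothetical setup)."""
    W, *_ = np.linalg.lstsq(X_src, X_tgt, rcond=None)
    return W

# Toy example: map 5 "German" vectors into a 3-d "English" space.
rng = np.random.default_rng(0)
X_en = rng.normal(size=(5, 3))           # English embeddings (targets)
W_true = rng.normal(size=(3, 3))
X_de = X_en @ np.linalg.inv(W_true)      # source embeddings, exact by construction

W = learn_projection(X_de, X_en)
projected = X_de @ W                     # German vectors in the shared space
print(np.allclose(projected, X_en, atol=1e-6))
```

After projection, concept embeddings from all languages live in one vector space, so cross-lingually augmented triples can share a single initialization.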

Citation (APA)

Klein, P., Ponzetto, S. P., & Glavaš, G. (2017). Improving neural knowledge base completion with cross-lingual projections. In 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference (Vol. 2, pp. 516–522). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/e17-2083
