In this paper we present a cross-lingual extension of a neural tensor network model for knowledge base completion. We exploit multilingual synsets from BabelNet to translate English triples to other languages and then augment the reference knowledge base with cross-lingual triples. We project monolingual embeddings of different languages to a shared multilingual space and use them for network initialization (i.e., as initial concept embeddings). We then train the network with triples from the cross-lingually augmented knowledge base. Results on WordNet link prediction show that leveraging cross-lingual information yields significant gains over exploiting only monolingual triples.
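The cross-lingual projection step described above can be illustrated with a minimal sketch. This is an assumption-laden toy example, not the paper's implementation: it learns a linear map between two monolingual embedding spaces by least squares over a seed dictionary (in the spirit of Mikolov-style projection), then applies it to map a new source-language vector into the shared space.

```python
import numpy as np

# Hedged sketch of a linear cross-lingual projection.
# Assumptions (not from the paper): a seed dictionary of
# translation pairs with aligned embeddings, and synthetic
# toy data generated from a known ground-truth map.

rng = np.random.default_rng(0)

d = 4         # embedding dimensionality (toy value)
n_pairs = 50  # number of seed translation pairs

# Source-language embeddings for the seed dictionary.
X = rng.normal(size=(n_pairs, d))

# Simulate target-language embeddings as a fixed linear map
# of the source vectors plus small noise.
W_true = rng.normal(size=(d, d))
Y = X @ W_true + 0.01 * rng.normal(size=(n_pairs, d))

# Fit projection W minimizing ||XW - Y||_F by least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Project a new source-language embedding into the shared space;
# such projected vectors could then initialize the network's
# concept embeddings.
x_new = rng.normal(size=(1, d))
projected = x_new @ W
```

In practice one would estimate such a map from bilingual seed pairs (e.g., derived from multilingual synsets) rather than synthetic data.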
Klein, P., Ponzetto, S. P., & Glavaš, G. (2017). Improving neural knowledge base completion with cross-lingual projections. In 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference (Vol. 2, pp. 516–522). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/e17-2083