Neural relation extraction with multi-lingual attention


Abstract

Relation extraction has been widely used to find unknown relational facts in plain text. Most existing methods focus on exploiting mono-lingual data for relation extraction, ignoring the massive information available in texts in other languages. To address this issue, we introduce a multi-lingual neural relation extraction framework, which employs mono-lingual attention to utilize the information within mono-lingual texts and proposes cross-lingual attention to capture the consistency and complementarity of information among cross-lingual texts. Experimental results on real-world datasets show that our model takes advantage of multi-lingual texts and consistently achieves significant improvements on relation extraction compared with baselines. The source code of this paper can be obtained from https://github.com/thunlp/MNRE.
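To make the idea concrete, the sketch below illustrates one plausible reading of the framework: for an entity pair, sentences in each language are attention-pooled with a relation-specific query vector, where matching query and sentence languages give mono-lingual attention and mismatched ones give cross-lingual attention. This is a minimal illustration under assumed names and dimensions, not the authors' MNRE implementation.

```python
# Hedged sketch of mono-/cross-lingual attention pooling for bag-level
# relation extraction. Function names, shapes, and the random data are
# illustrative assumptions, not the paper's actual code.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(sentence_vecs, relation_query):
    """Attention-pool a bag of sentence vectors with a relation query."""
    scores = sentence_vecs @ relation_query   # (n,) relevance scores
    weights = softmax(scores)                 # (n,) attention weights
    return weights @ sentence_vecs            # (d,) pooled bag vector

def multilingual_bag_repr(bags, relation_queries):
    """bags: {lang: (n_lang, d) sentence encodings for one entity pair}.
    relation_queries: {lang: (d,) query for the candidate relation}.
    Returns one pooled vector per (query_lang, sentence_lang) pair:
    mono-lingual attention when the languages match, cross-lingual
    attention otherwise."""
    reprs = {}
    for q_lang, query in relation_queries.items():
        for s_lang, vecs in bags.items():
            reprs[(q_lang, s_lang)] = attend(vecs, query)
    return reprs

rng = np.random.default_rng(0)
d = 4
bags = {"en": rng.normal(size=(3, d)), "zh": rng.normal(size=(2, d))}
queries = {"en": rng.normal(size=d), "zh": rng.normal(size=d)}
reprs = multilingual_bag_repr(bags, queries)
print(len(reprs))  # 2 languages x 2 languages -> 4 pooled vectors
```

In the full model these pooled vectors would be scored against relation embeddings and combined for classification; the sketch only shows the attention-pooling step described in the abstract.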

Citation (APA)

Lin, Y., Liu, Z., & Sun, M. (2017). Neural relation extraction with multi-lingual attention. In ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (Vol. 1, pp. 34–43). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/P17-1004
