Dual Attention Network for Cross-lingual Entity Alignment

Abstract

Cross-lingual entity alignment is an essential part of building a knowledge graph, as it helps integrate knowledge across knowledge graphs in different languages. In real-world KGs, the information at the same hierarchy of corresponding entities is often imbalanced, which results in heterogeneous neighborhood structures and makes this task challenging. To tackle this problem, we propose a dual attention network for cross-lingual entity alignment (DAEA). Specifically, our dual attention consists of relation-aware graph attention and hierarchical attention. The relation-aware graph attention selectively aggregates multi-hierarchy neighborhood information to alleviate the heterogeneity between counterpart entities. The hierarchical attention adaptively aggregates low-hierarchy and high-hierarchy information, which helps balance the neighborhood information of counterpart entities and distinguish non-counterpart entities with similar structures. Finally, we treat cross-lingual entity alignment as a link prediction process. Experimental results on three real-world cross-lingual entity alignment datasets demonstrate the effectiveness of DAEA.
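
For readers who want to picture the mechanism, the following is a minimal, hypothetical PyTorch sketch of how such a dual-attention layer might be organized. All module names, tensor shapes, and the sigmoid gate are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

# Hypothetical sketch (not the authors' code): relation-aware graph attention
# scores each neighbor together with its relation embedding; hierarchical
# attention then adaptively combines low- and high-hierarchy views.

class RelationAwareAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(3 * dim, 1)  # scores [entity; relation; neighbor]

    def forward(self, h, rel, neigh, mask):
        # h:     (N, d)    center-entity embeddings
        # rel:   (N, K, d) relation embedding of each incident edge
        # neigh: (N, K, d) neighbor-entity embeddings
        # mask:  (N, K)    1 for real neighbors, 0 for padding
        h_exp = h.unsqueeze(1).expand_as(neigh)
        logits = self.score(torch.cat([h_exp, rel, neigh], dim=-1)).squeeze(-1)
        logits = logits.masked_fill(mask == 0, float("-inf"))
        alpha = torch.softmax(logits, dim=-1)          # attention over neighbors
        return torch.bmm(alpha.unsqueeze(1), neigh).squeeze(1)

class HierarchicalAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, 1)

    def forward(self, low, high):
        # low, high: (N, d) aggregated low- and high-hierarchy representations
        g = torch.sigmoid(self.gate(torch.cat([low, high], dim=-1)))
        return g * low + (1 - g) * high                # adaptive combination

Under these assumptions, an alignment score between entities from two language-specific KGs could be a distance between their combined representations, trained so that counterpart pairs score closer than non-counterparts; this, too, is only one plausible design rather than a description of DAEA itself.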

Cite

APA

Sun, J., Zhou, Y., & Zong, C. (2020). Dual Attention Network for Cross-lingual Entity Alignment. In Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020) (pp. 3190–3201). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.284
