Knowledge Graph Embedding with Atrous Convolution and Residual Learning

Abstract

Knowledge graph embedding is an important task that benefits many downstream applications. Deep neural network based methods currently achieve state-of-the-art performance, but most of them are complex and require substantial time for training and inference. To address this issue, we propose a simple but effective knowledge graph embedding method based on atrous convolution. Compared with existing state-of-the-art methods, ours has the following main characteristics. First, it effectively increases feature interactions by using atrous convolutions. Second, it uses residual learning to address the forgetting of original input information and the vanishing/exploding gradient problem. Third, it has a simpler structure but much higher parameter efficiency. We evaluate our method on six benchmark datasets with different evaluation metrics. Extensive experiments show that our model is very effective: on these diverse datasets, it outperforms the compared state-of-the-art methods on most evaluation metrics. The source code of our model is available at https://github.com/neukg/AcrE.
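As a rough illustration of the two core ideas named in the abstract (this is a simplified sketch, not the authors' AcrE implementation, which operates on entity/relation embedding matrices), an atrous (dilated) convolution spaces its kernel taps apart to enlarge the receptive field without adding parameters, and a residual connection adds the block's input back to its output, preserving the original information and easing gradient flow:

```python
def atrous_conv1d(x, kernel, dilation=1):
    """1-D atrous (dilated) convolution with 'valid' padding.

    With dilation d, the taps of a size-k kernel are spaced d apart,
    so the kernel covers a receptive field of (k - 1) * d + 1 inputs.
    """
    k = len(kernel)
    span = (k - 1) * dilation + 1
    return [
        sum(kernel[j] * x[i + j * dilation] for j in range(k))
        for i in range(len(x) - span + 1)
    ]

def residual_block(x, kernel, dilation=1):
    """Atrous convolution followed by a residual (skip) connection.

    Symmetric zero padding keeps the output the same length as x,
    then the input is added back (y + x), which retains the original
    signal and mitigates vanishing/exploding gradients in deep stacks.
    """
    pad = (len(kernel) - 1) * dilation // 2
    padded = [0.0] * pad + list(x) + [0.0] * pad
    y = atrous_conv1d(padded, kernel, dilation)
    return [yi + xi for yi, xi in zip(y, x)]

# With an identity kernel the convolution returns x unchanged,
# so the residual block outputs 2 * x.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
print(residual_block(x, kernel=[0.0, 1.0, 0.0], dilation=2))
```

Note that dilation buys receptive field for free: a size-3 kernel with dilation 2 spans 5 positions while still using only 3 weights, which is the parameter-efficiency argument the abstract appeals to.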

Citation (APA)

Ren, F., Li, J., Zhang, H., Liu, S., Li, B., Ming, R., & Bai, Y. (2020). Knowledge Graph Embedding with Atrous Convolution and Residual Learning. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 1532–1543). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.134
