Representation learning of knowledge graphs with multi-scale capsule network


Abstract

Representation learning of knowledge graphs has gained wide attention in natural language processing. Most existing knowledge representation models embed the triples of a knowledge graph into a continuous low-dimensional vector space through a simple linear transformation. Despite their high computational efficiency, the fitting ability of these models is limited. In this paper, we propose a multi-scale capsule network that models the relations between embedding vectors from a deep perspective. In the convolutional layer of the capsule network, we apply convolution kernels with different window sizes to extract semantic features of the entities and relations in a triple. These features are then combined into a continuous vector by a routing algorithm in the capsule layer, and the modulus of this vector serves as the confidence score for the triple's correctness. Experiments on two benchmarks, WN18RR and FB15k-237, show that the proposed model outperforms state-of-the-art embedding models on the knowledge graph completion task.
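The scoring pipeline the abstract describes (multi-scale convolution over a stacked triple matrix, a capsule-style squash, and the output vector's modulus as the score) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the kernel widths, filter count, random kernels, and mean-pooling are illustrative assumptions, and the dynamic routing step is reduced to a single squash for brevity.

```python
import numpy as np

def squash(v, eps=1e-9):
    # Capsule "squash" nonlinearity: preserves direction, maps the norm into [0, 1).
    n2 = np.sum(v ** 2)
    return (n2 / (1.0 + n2)) * v / np.sqrt(n2 + eps)

def score_triple(h, r, t, kernel_widths=(1, 2, 3), n_filters=4, seed=0):
    """Toy multi-scale convolution + capsule-style score for a triple (h, r, t).

    h, r, t: embedding vectors of equal dimension d (hypothetical inputs).
    Returns the modulus of the squashed feature vector as a confidence score.
    """
    rng = np.random.default_rng(seed)          # stand-in for learned kernels
    x = np.stack([h, r, t])                    # (3, d) triple matrix
    feats = []
    for w in kernel_widths:                    # "different sizes of windows"
        K = rng.standard_normal((n_filters, 3, w))
        d = x.shape[1]
        for f in range(n_filters):
            # Valid 1-D convolution of a (3, w) kernel along the embedding axis.
            fm = [np.sum(x[:, j:j + w] * K[f]) for j in range(d - w + 1)]
            feats.append(np.maximum(np.mean(fm), 0.0))  # ReLU + mean pooling
    cap = squash(np.array(feats))              # one output capsule vector
    return float(np.linalg.norm(cap))          # modulus = plausibility score
```

Because of the squash nonlinearity, the score always lies in [0, 1), so it can be read directly as a confidence that the triple is correct; a trained model would learn the kernels instead of sampling them.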

Citation (APA)

Cheng, J., Yang, Z., Dang, J., Pan, C., & Zhang, F. (2019). Representation learning of knowledge graphs with multi-scale capsule network. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11871 LNCS, pp. 282–290). Springer. https://doi.org/10.1007/978-3-030-33607-3_31
