Embedding knowledge graphs (KGs) into continuous vector spaces is an active research area. Soft rules, despite their uncertainty, are highly beneficial to KG embedding, yet they have received little attention in recent methods. A major challenge is to devise a principled framework that integrates such soft logical information into embedding models both efficiently and effectively. This paper proposes a highly scalable and effective method for preserving soft logical regularities by imposing soft rule constraints on relation representations. Specifically, we first represent relations as bilinear forms and map entity representations into a non-negative, bounded space. We then derive a rule-based regularization that enforces only the relation representations to satisfy the constraints introduced by soft rules. The proposed method has two advantages: 1) it regularizes relations directly, so the complexity of rule learning is independent of the entity set size, which improves scalability; 2) it imposes prior logical information on the structure of the embedding space, which benefits knowledge reasoning. Evaluation on link prediction over Freebase and DBpedia shows the effectiveness of our approach over many competitive baselines. Code and datasets are available at https://github.com/StudyGroup-lab/SLRE.
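The following minimal sketch (NumPy) illustrates the general scheme the abstract describes, under stated assumptions: a diagonal bilinear (DistMult-style) score, a sigmoid mapping that keeps entity embeddings non-negative and bounded, and a confidence-weighted hinge penalty over relation vectors as the rule-based regularizer. These concrete choices are illustrative assumptions, not the paper's exact formulation; the point to notice is that the regularizer touches only relation parameters, which is why its cost is independent of the entity set size.

```python
import numpy as np

rng = np.random.default_rng(0)
num_entities, num_relations, dim = 1000, 50, 100

# Entity embeddings constrained to a non-negative, bounded space (0, 1),
# here via a sigmoid over free parameters (an illustrative choice).
entity_params = rng.normal(size=(num_entities, dim))
entities = 1.0 / (1.0 + np.exp(-entity_params))

# Relations as bilinear forms; one diagonal matrix per relation keeps the
# example simple -- the paper's bilinear form may be more general.
relations = rng.normal(size=(num_relations, dim))

def score(h_idx, r_idx, t_idx):
    """Bilinear score h^T diag(r) t for a triple (h, r, t)."""
    h, r, t = entities[h_idx], relations[r_idx], entities[t_idx]
    return float(np.sum(h * r * t))

# Soft rules r_p => r_q with confidence lam in (0, 1].
# Hypothetical regularizer: penalize coordinates where the premise relation
# exceeds the conclusion, scaled by rule confidence. Only relation vectors
# appear here, so the penalty's cost does not depend on the number of entities.
soft_rules = [(3, 7, 0.9), (12, 5, 0.75)]  # (premise, conclusion, confidence)

def rule_regularizer(relations, rules):
    penalty = 0.0
    for r_p, r_q, lam in rules:
        diff = relations[r_p] - relations[r_q]
        penalty += lam * np.sum(np.maximum(diff, 0.0) ** 2)
    return penalty

print("example score:", score(0, 3, 1))
print("rule penalty :", rule_regularizer(relations, soft_rules))
```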
Citation
Guo, S., Li, L., Hui, Z., Meng, L., Ma, B., Liu, W., … Zhang, H. (2020). Knowledge Graph Embedding Preserving Soft Logical Regularity. In International Conference on Information and Knowledge Management, Proceedings (pp. 425–434). Association for Computing Machinery. https://doi.org/10.1145/3340531.3412055