In this paper, we propose the concept of the potential probability of triples. Typically, a knowledge graph contains only positive triples. Most knowledge representation methods treat replaced triples, obtained by randomly substituting the head/tail entity or the relation with another one, as negative triples. In fact, not every triple produced by such substitution is truly negative: it may essentially be a positive triple that has simply not been observed yet. To address this problem, we propose the potential probability. First, we use the co-occurrence of relations and paths in the knowledge graph to estimate the probability that some negative triples are actually correct. Then we add these triples, weighted by their potential probabilities, to the training model. Finally, we conduct experiments on two translation-based models, TransE and TransH, over four public datasets. Experimental results show that our method substantially improves the performance of the target embedding models.
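To make the problem concrete, here is a minimal sketch of the standard corruption-based negative sampling the abstract refers to; the toy entities, relations, and the `corrupt_triple` helper are illustrative assumptions, not the paper's implementation.

```python
import random

def corrupt_triple(triple, entities, relations, rng=random):
    """Generate a candidate negative triple by randomly replacing the
    head, tail, or relation -- the corruption scheme used in
    TransE-style training that the paper argues can yield false
    negatives (i.e. unobserved positives)."""
    h, r, t = triple
    slot = rng.choice(["head", "relation", "tail"])
    if slot == "head":
        h = rng.choice([e for e in entities if e != h])
    elif slot == "relation":
        r = rng.choice([rel for rel in relations if rel != r])
    else:
        t = rng.choice([e for e in entities if e != t])
    return (h, r, t)

# Hypothetical toy knowledge graph for illustration.
entities = ["Paris", "France", "Berlin", "Germany"]
relations = ["capital_of", "located_in"]
positive = ("Paris", "capital_of", "France")

neg = corrupt_triple(positive, entities, relations)
# The corrupted triple differs from the positive one in exactly one
# slot, but it is not guaranteed to be false: for example,
# ("Paris", "located_in", "France") is actually true. Assigning such
# triples a "potential probability" instead of treating them as
# strictly negative is the idea the paper develops.
```

This illustrates why treating every corrupted triple as strictly negative is unsound, which motivates weighting them by a potential probability during training.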
Luo, S., & Fang, W. (2018). Potential probability of negative triples in knowledge graph embedding. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11303 LNCS, pp. 48–58). Springer Verlag. https://doi.org/10.1007/978-3-030-04182-3_5