Inductive Relation Prediction with Logical Reasoning Using Contrastive Representations

Metrics: 17 citations · 30 Mendeley readers

Abstract

Relation prediction in knowledge graphs (KGs) aims to predict the missing relations in incomplete triples, but the dominant embedding paradigm cannot handle entities unseen during testing. In real-world scenarios, the inductive setting is more common because the entities available during training are finite. Previous methods acquire inductive ability from the implicit logic in KGs. However, it is challenging to precisely acquire the entity-independent relational semantics of compositional logic rules and to deal with the deficient supervision of logic caused by the scarcity of relational semantics. To this end, we propose LogCo, a novel graph convolutional network (GCN)-based model that performs logical reasoning with contrastive representations. LogCo first extracts enclosing subgraphs and relational paths between two entities to provide entity independence. A contrastive strategy between relational path instances and the subgraph then addresses the issue of deficient supervision, and the contrastive representations are learned in a joint training regime. Finally, prediction results and logic rules for reasoning are obtained. Comprehensive experiments on twelve inductive datasets show that LogCo achieves outstanding performance compared with state-of-the-art inductive baselines.
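The contrastive strategy sketched in the abstract can be illustrated with a minimal InfoNCE-style objective: the subgraph representation is pulled toward the representation of its positive relational-path instance and pushed away from negative (corrupted) path instances. This is a hypothetical simplification under assumed names (`info_nce_loss`, `tau`), not the authors' implementation:

```python
import numpy as np

def info_nce_loss(subgraph_emb, pos_path_emb, neg_path_embs, tau=0.1):
    """InfoNCE-style contrastive loss between a subgraph embedding and
    relational-path embeddings.  Illustrative sketch only: the real LogCo
    objective and encoders are defined in the paper, not reproduced here."""
    def cos(a, b):
        # cosine similarity between two vectors
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # temperature-scaled similarities; positive path instance at index 0
    logits = [cos(subgraph_emb, pos_path_emb) / tau]
    logits += [cos(subgraph_emb, n) / tau for n in neg_path_embs]
    logits = np.array(logits)

    # softmax cross-entropy against the positive at index 0
    logits -= logits.max()  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])
```

The loss is small when the subgraph embedding aligns with the positive path instance and large when it aligns with a negative, which is the supervision signal the contrastive strategy supplies when explicit relational semantics are scarce.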



Citation (APA)

Pan, Y., Liu, J., Zhang, L., Zhao, T., Lin, Q., Hu, X., & Wang, Q. (2022). Inductive Relation Prediction with Logical Reasoning Using Contrastive Representations. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 (pp. 4261–4274). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-main.286


Readers' Seniority

PhD / Postgrad / Masters / Doc: 6 (67%)
Researcher: 2 (22%)
Lecturer / Post doc: 1 (11%)

Readers' Discipline

Computer Science: 9 (75%)
Medicine and Dentistry: 1 (8%)
Linguistics: 1 (8%)
Neuroscience: 1 (8%)
