Contrastive Pre-Training of GNNs on Heterogeneous Graphs


Abstract

While graph neural networks (GNNs) have emerged as the state-of-the-art representation learning method on graphs, they often require a large amount of labeled data to achieve satisfactory performance, and such labels are often expensive or unavailable. To alleviate this label scarcity, several pre-training strategies have been devised for GNNs to learn transferable knowledge from the universal structural properties of a graph. However, existing pre-training strategies are designed only for homogeneous graphs, in which all nodes and edges belong to a single type. In contrast, a heterogeneous graph embodies rich semantics, as multiple types of nodes interact with one another via different kinds of edges, which existing strategies neglect. In this paper, we propose a novel Contrastive Pre-Training strategy of GNNs on Heterogeneous Graphs (CPT-HG) to capture both semantic and structural properties in a self-supervised manner. Specifically, we design semantic-aware pre-training tasks at both the relation and subgraph levels, and further enhance their representativeness by employing contrastive learning. We conduct extensive experiments on three real-world heterogeneous graphs, and promising results demonstrate the superior ability of CPT-HG to transfer knowledge to various downstream tasks via pre-training.
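The abstract does not spell out the contrastive objective itself, so the following is only a minimal sketch of a generic InfoNCE-style contrastive loss of the kind a relation-level pre-training task could use, written in PyTorch. The function name, the temperature value, and the way positives and negatives are obtained are all assumptions for illustration, not details taken from the paper; in a relation-level setup, the positive would typically be a node connected to the anchor by the relation under consideration, while negatives are corrupted or unrelated nodes.

import torch
import torch.nn.functional as F

def info_nce_loss(anchor, positive, negatives, temperature=0.5):
    # Hypothetical sketch, not the paper's actual loss.
    # anchor:    (d,)   embedding of the query node
    # positive:  (d,)   embedding of a node related to the anchor,
    #                   e.g. linked by the relation being modeled
    # negatives: (k, d) embeddings of corrupted / unrelated nodes
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_score = (anchor * positive).sum(-1, keepdim=True) / temperature  # (1,)
    neg_score = (negatives @ anchor) / temperature                       # (k,)
    logits = torch.cat([pos_score, neg_score]).unsqueeze(0)              # (1, k+1)

    # The positive pair sits at index 0; cross-entropy pushes its
    # similarity above every negative.
    target = torch.zeros(1, dtype=torch.long)
    return F.cross_entropy(logits, target)

# Example: 64-dimensional embeddings, 5 negative samples
loss = info_nce_loss(torch.randn(64), torch.randn(64), torch.randn(5, 64))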

Citation (APA)

Jiang, X., Lu, Y., Fang, Y., & Shi, C. (2021). Contrastive Pre-Training of GNNs on Heterogeneous Graphs. In International Conference on Information and Knowledge Management, Proceedings (pp. 803–812). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482332
