Towards Self-supervised Learning on Graphs with Heterophily

Abstract

Recently emerged heterophilous graph neural networks have significantly reduced reliance on the assumption of graph homophily, under which linked nodes have similar features and labels. However, these methods target a supervised setting, depend heavily on label information, and are therefore limited on general downstream graph tasks. In this work, we propose a self-supervised representation learning paradigm for graphs with heterophily (namely HGRL) that improves the generalizability of node representations, optimizing them without any label guidance. Inspired by the designs of existing heterophilous graph neural networks, HGRL learns node representations by preserving the original node features and capturing informative distant neighbors. These two properties are obtained through carefully designed pretext tasks that are optimized via estimated high-order mutual information. A theoretical analysis interprets the connections between HGRL and existing advanced graph neural network designs, and extensive experiments on different downstream tasks demonstrate the effectiveness of the proposed framework.
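The abstract names two pretext objectives (preserving original node features and attending to informative distant neighbors) but does not specify the actual losses, which the paper optimizes via estimated high-order mutual information. The sketch below is purely illustrative, not the paper's method: it assumes a simple linear encoder, a reconstruction loss as a stand-in for feature preservation, and a cosine-similarity term over exactly-k-hop neighbors as a stand-in for the distant-neighbor objective. All function names and hyperparameters here are invented for illustration.

```python
import numpy as np

def distant_neighbor_mask(adj, k=2):
    """Boolean mask of node pairs reachable in exactly k hops but no fewer.

    `adj` is a dense (n, n) symmetric 0/1 adjacency matrix. On heterophilous
    graphs, such distant (e.g. 2-hop) neighbors are often more informative
    than immediate neighbors.
    """
    n = adj.shape[0]
    reach = np.eye(n, dtype=bool) | adj.astype(bool)   # within k-1 hops
    power = adj.astype(bool)                            # within current hop
    mask = np.zeros((n, n), dtype=bool)
    for _ in range(k - 1):
        power = (power.astype(int) @ adj) > 0
        mask = power & ~reach                           # new at this hop
        reach |= power
    return mask

def hgrl_style_loss(X, adj, W_enc, alpha=0.5, k=2):
    """Toy combination of the two pretext objectives (illustrative only).

    Term 1 preserves original features via linear reconstruction;
    term 2 pulls a node's representation toward its exactly-k-hop neighbors.
    """
    H = np.tanh(X @ W_enc)                  # node representations
    X_rec = H @ W_enc.T                     # tied-weight linear decoder
    recon = np.mean((X_rec - X) ** 2)       # feature-preservation term

    mask = distant_neighbor_mask(adj, k)
    Hn = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-8)
    sim = Hn @ Hn.T                         # pairwise cosine similarity
    distant = -sim[mask].mean() if mask.any() else 0.0
    return recon + alpha * distant
```

For a path graph 0-1-2-3, `distant_neighbor_mask(adj, 2)` marks the pairs (0, 2) and (1, 3) but not direct edges such as (0, 1), matching the intuition of skipping the immediate (possibly heterophilous) neighborhood.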

Citation (APA)
Chen, J., Zhu, G., Qi, Y., Yuan, C., & Huang, Y. (2022). Towards Self-supervised Learning on Graphs with Heterophily. In International Conference on Information and Knowledge Management, Proceedings (pp. 201–211). Association for Computing Machinery. https://doi.org/10.1145/3511808.3557478
