ClusterSCL: Cluster-Aware Supervised Contrastive Learning on Graphs

Abstract

We study the problem of supervised contrastive (SupCon) learning on graphs. The SupCon loss was recently proposed for classification tasks: it pulls data points of the same class closer together than those of different classes. However, SupCon can struggle on datasets with large intra-class variance and high inter-class similarity, and this issue becomes even more challenging when coupled with graph structures. To address this, we present the cluster-aware supervised contrastive learning loss (ClusterSCL) for graph learning tasks. The main idea of ClusterSCL is to retain the structural and attribute properties of a graph, in the form of the nodes' cluster distributions, during supervised contrastive learning. Specifically, ClusterSCL introduces a cluster-aware data augmentation strategy and integrates it with the SupCon loss. Extensive experiments on several widely adopted graph benchmarks demonstrate the superiority of ClusterSCL over cross-entropy, SupCon, and other graph contrastive objectives.
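The SupCon loss that the abstract builds on can be sketched in plain NumPy. This is an illustrative reimplementation of the standard supervised contrastive objective, not the authors' ClusterSCL code; the function name `supcon_loss` and the temperature `tau` are assumptions for the sketch. For each anchor, the loss averages the log-probability of its same-class positives under a softmax over all other samples, then negates it:

```python
import numpy as np

def supcon_loss(z, labels, tau=0.1):
    """Supervised contrastive (SupCon) loss.

    z: (n, d) array of embeddings; labels: (n,) integer class labels.
    Same-class pairs act as positives, all other samples as the
    softmax denominator.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize embeddings
    sim = (z @ z.T) / tau                              # scaled pairwise cosine similarities
    n = len(labels)
    mask_self = np.eye(n, dtype=bool)
    # Exclude each anchor from its own denominator via -inf logits.
    logits = np.where(mask_self, -np.inf, sim)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives: same label, excluding the anchor itself.
    pos = (labels[:, None] == labels[None, :]) & ~mask_self
    # Average log-probability over each anchor's positives, then negate.
    per_anchor = -np.where(pos, log_prob, 0.0).sum(axis=1) / np.maximum(pos.sum(axis=1), 1)
    return per_anchor.mean()
```

With well-separated classes (e.g. class-0 points on one axis, class-1 points on another) the loss is small, and it grows when labels are shuffled so that positives lie far apart, which is the behavior the abstract's "pulling same-class points closer" description refers to.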

Citation (APA)

Wang, Y., Zhang, J., Li, H., Dong, Y., Yin, H., Li, C., & Chen, H. (2022). ClusterSCL: Cluster-Aware Supervised Contrastive Learning on Graphs. In WWW 2022 - Proceedings of the ACM Web Conference 2022 (pp. 1611–1621). Association for Computing Machinery, Inc. https://doi.org/10.1145/3485447.3512207
