Feature Overcorrelation in Deep Graph Neural Networks: A New Perspective


Abstract

Recent years have witnessed remarkable success achieved by graph neural networks (GNNs) in many real-world applications such as recommendation and drug discovery. Despite this success, oversmoothing has been identified as one of the key issues that limit the performance of deep GNNs: the learned node representations become highly indistinguishable as aggregation layers are stacked. In this paper, we propose a new perspective on the performance degradation of deep GNNs, i.e., feature overcorrelation. Through empirical and theoretical study of this matter, we demonstrate the existence of feature overcorrelation in deeper GNNs and reveal potential reasons leading to this issue. To reduce the feature correlation, we propose a general framework, DeCorr, which encourages GNNs to encode less redundant information. Extensive experiments demonstrate that DeCorr helps enable deeper GNNs and is complementary to existing techniques tackling the oversmoothing issue.
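For illustration, the feature overcorrelation the abstract describes can be quantified as the mean absolute Pearson correlation between pairs of feature dimensions in a node-representation matrix. The sketch below is an assumption-laden stand-in for the paper's measurement, not its exact metric or the DeCorr method itself:

```python
import numpy as np

def mean_feature_correlation(H):
    """Mean absolute Pearson correlation over all pairs of feature
    dimensions of a node-representation matrix H of shape (n, d).
    Values near 1 indicate highly redundant (overcorrelated) features.
    Illustrative metric only; not the paper's exact formulation."""
    d = H.shape[1]
    # np.corrcoef treats rows as variables, so transpose to get the
    # d x d correlation matrix between feature dimensions.
    C = np.corrcoef(H.T)
    # Average absolute correlation over off-diagonal pairs only.
    return np.abs(C[~np.eye(d, dtype=bool)]).mean()

rng = np.random.default_rng(0)
# Hypothetical example: four nearly duplicated feature columns,
# mimicking the redundancy that stacked GNN aggregators can induce.
base = rng.normal(size=(100, 1))
H_redundant = np.hstack(
    [base + 0.01 * rng.normal(size=(100, 1)) for _ in range(4)]
)
print(mean_feature_correlation(H_redundant))  # close to 1 (redundant)

# Independent random features, by contrast, correlate weakly.
H_independent = rng.normal(size=(100, 4))
print(mean_feature_correlation(H_independent))  # much smaller
```

A framework like DeCorr would add a training-time penalty that pushes this kind of correlation measure down, so that deeper GNNs retain more distinct, less redundant feature dimensions.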

Citation (APA)

Jin, W., Liu, X., Ma, Y., Aggarwal, C., & Tang, J. (2022). Feature Overcorrelation in Deep Graph Neural Networks: A New Perspective. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 709–719). Association for Computing Machinery. https://doi.org/10.1145/3534678.3539445
