On the Trade-off between Over-smoothing and Over-squashing in Deep Graph Neural Networks


Abstract

Graph Neural Networks (GNNs) have succeeded in various computer science applications, yet deep GNNs underperform their shallow counterparts despite deep learning's success in other domains. Over-smoothing and over-squashing are key challenges when stacking graph convolutional layers, hindering deep representation learning and information propagation from distant nodes. Our work reveals that over-smoothing and over-squashing are intrinsically related to the spectral gap of the graph Laplacian, resulting in an inevitable trade-off between these two issues, as they cannot be alleviated simultaneously. To achieve a suitable compromise, we propose adding and removing edges as a viable approach. We introduce the Stochastic Jost and Liu Curvature Rewiring (SJLR) algorithm, which is computationally efficient and preserves fundamental properties compared to previous curvature-based methods. Unlike existing approaches, SJLR performs edge addition and removal during GNN training while maintaining the graph unchanged during testing. Comprehensive comparisons demonstrate SJLR's competitive performance in addressing over-smoothing and over-squashing.
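To make the idea concrete, below is a minimal, illustrative Python sketch of curvature-guided stochastic rewiring applied only to the training graph, with the original graph kept unchanged for testing. The `jost_liu_curvature` function implements one common closed form of the Jost and Liu lower bound on Ollivier-Ricci curvature (degrees and triangle counts); the add/remove heuristic, the `p_add`/`p_remove` parameters, and the neighbourhood-shortcut rule are assumptions made for illustration, not the authors' exact SJLR procedure.

```python
# Illustrative sketch only: curvature-guided stochastic rewiring for training.
# The rewiring heuristic and p_add/p_remove are assumptions, not the published
# SJLR algorithm; consult the paper for the exact procedure.
import random
import networkx as nx


def jost_liu_curvature(G, u, v):
    """Jost-Liu lower bound on the Ollivier-Ricci curvature of edge (u, v)."""
    du, dv = G.degree(u), G.degree(v)
    tri = len(list(nx.common_neighbors(G, u, v)))  # triangles through (u, v)
    pos = lambda x: max(x, 0.0)
    return (
        -pos(1 - 1 / du - 1 / dv - tri / min(du, dv))
        - pos(1 - 1 / du - 1 / dv - tri / max(du, dv))
        + tri / max(du, dv)
    )


def rewire_for_training(G, p_add=0.3, p_remove=0.3, seed=0):
    """Return a rewired copy used only during training; the original graph
    (used at test time) is left untouched."""
    rng = random.Random(seed)
    H = G.copy()
    curvatures = {e: jost_liu_curvature(G, *e) for e in G.edges()}
    ranked = sorted(curvatures, key=curvatures.get)  # most negative first

    # Around the most negatively curved edges (bottlenecks that cause
    # over-squashing), stochastically add a shortcut between the endpoints'
    # neighbourhoods.
    for u, v in ranked[: max(1, len(ranked) // 10)]:
        if rng.random() < p_add:
            candidates = [(a, b) for a in G[u] for b in G[v]
                          if a != b and not H.has_edge(a, b)]
            if candidates:
                H.add_edge(*rng.choice(candidates))

    # Stochastically drop some of the most positively curved edges
    # (densely clustered regions more prone to over-smoothing).
    for u, v in ranked[-max(1, len(ranked) // 10):]:
        if (rng.random() < p_remove and H.has_edge(u, v)
                and H.degree(u) > 1 and H.degree(v) > 1):
            H.remove_edge(u, v)
    return H


if __name__ == "__main__":
    G = nx.barbell_graph(5, 1)  # two cliques joined by a bridge: a bottleneck
    H = rewire_for_training(G)
    print(G.number_of_edges(), "->", H.number_of_edges())
```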

Citation (APA)

Giraldo, J. H., Skianis, K., Bouwmans, T., & Malliaros, F. D. (2023). On the Trade-off between Over-smoothing and Over-squashing in Deep Graph Neural Networks. In International Conference on Information and Knowledge Management, Proceedings (pp. 566–576). Association for Computing Machinery. https://doi.org/10.1145/3583780.3614997
