Resource-Efficient Training for Large Graph Convolutional Networks with Label-Centric Cumulative Sampling

Abstract

Graph Convolutional Networks (GCNs) are popular for learning representations of graph data and have a wide range of applications in social networks, recommendation systems, and other domains. However, training GCN models on large networks is resource-intensive and time-consuming, which hinders their real-world deployment. Existing GCN training methods optimize the sampling of mini-batches for stochastic gradient descent to accelerate the training process, but they do not reduce the problem size and achieve only a limited reduction in computational complexity. In this paper, we argue that a GCN can be trained on a sampled subgraph to produce approximate node representations, which suggests a novel perspective for accelerating GCN training via network sampling. To this end, we propose a label-centric cumulative sampling (LCS) framework for training GCNs on large graphs. The proposed method constructs a subgraph cumulatively based on probabilistic sampling and trains the GCN model iteratively to generate approximate node representations. The optimality of LCS is theoretically guaranteed to minimize the bias of the node aggregation procedure during GCN training. Extensive experiments on four real-world network datasets show that the LCS framework accelerates the training of state-of-the-art GCN models by up to 17x without a noteworthy drop in model accuracy.
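To illustrate the cumulative-sampling idea described above, the following is a minimal, hypothetical sketch: starting from labeled seed nodes, a subgraph is grown round by round, with earlier samples retained (cumulative). The function name, parameters, and the uniform sampling probability are illustrative assumptions; the paper derives a bias-minimizing sampling distribution that this sketch does not reproduce.

```python
import random

def cumulative_sample(adj, labeled_seeds, rounds=3, per_round=2, seed=0):
    """Hypothetical sketch of label-centric cumulative subgraph sampling.

    adj: dict mapping each node to a list of neighbor nodes.
    labeled_seeds: nodes with known labels, used as the starting set.
    Each round samples up to `per_round` frontier neighbors of the
    current subgraph (uniformly here, as a stand-in for the paper's
    probabilistic scheme) and adds them cumulatively.
    """
    rng = random.Random(seed)
    sub = set(labeled_seeds)
    for _ in range(rounds):
        # Frontier: neighbors of the current subgraph not yet included.
        frontier = {v for u in sub for v in adj.get(u, []) if v not in sub}
        if not frontier:
            break
        picked = rng.sample(sorted(frontier), min(per_round, len(frontier)))
        sub.update(picked)  # cumulative: previously sampled nodes are kept
    return sub

# Usage on a small path graph 0-1-2-3-4, seeded at the labeled node 0:
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
subgraph = cumulative_sample(adj, [0])
```

A GCN would then be trained iteratively on the nodes in `subgraph` rather than on the full graph, which is what reduces the problem size.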

Citation (APA)

Lin, M., Li, W., Li, D., Chen, Y., & Lu, S. (2022). Resource-Efficient Training for Large Graph Convolutional Networks with Label-Centric Cumulative Sampling. In WWW 2022 - Proceedings of the ACM Web Conference 2022 (pp. 1170–1180). Association for Computing Machinery, Inc. https://doi.org/10.1145/3485447.3512165
