Abstract
Graph Neural Networks (GNNs) have recently received significant research attention due to their superior performance on a variety of graph-related learning tasks. Most current works focus on either static or dynamic graph settings and address a single task, e.g., node/graph classification or link prediction. In this work, we investigate the question: can GNNs continually learn a sequence of tasks? To that end, we explore the Continual Graph Learning (CGL) paradigm and present ER-GNN, an experience-replay-based framework for CGL that alleviates the catastrophic forgetting problem in existing GNNs. ER-GNN stores knowledge from previous tasks as experience nodes and replays them when learning new tasks. We propose three strategies for selecting experience nodes: mean of feature, coverage maximization, and influence maximization. Extensive experiments on three benchmark datasets demonstrate the effectiveness of ER-GNN and shed light on incremental learning over graph (non-Euclidean) structures.
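The abstract only names the replay mechanism and the selection strategies, so the following is a minimal illustrative sketch of what an experience buffer with a mean-of-feature selection rule could look like. It is one plausible reading of the idea (per class, keep the nodes whose features lie closest to the class centroid), not the authors' implementation; the names (ExperienceBuffer, select_mean_of_feature, X_t, y_t, budget_per_class) are hypothetical.

```python
import numpy as np

def select_mean_of_feature(features, labels, budget_per_class):
    """Hypothetical 'mean of feature' rule: per class, keep the nodes whose
    feature vectors are closest to the class centroid."""
    selected = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        centroid = features[idx].mean(axis=0)
        dists = np.linalg.norm(features[idx] - centroid, axis=1)
        selected.extend(idx[np.argsort(dists)[:budget_per_class]].tolist())
    return selected

class ExperienceBuffer:
    """Illustrative buffer storing (feature, label) pairs of experience nodes."""
    def __init__(self):
        self.features, self.labels = [], []

    def add(self, features, labels, node_ids):
        self.features.append(features[node_ids])
        self.labels.append(labels[node_ids])

    def sample(self):
        if not self.features:
            return None, None
        return np.concatenate(self.features), np.concatenate(self.labels)

# Usage after finishing a task t (X_t: node features, y_t: node labels; toy data here).
X_t = np.random.randn(100, 16)
y_t = np.random.randint(0, 3, size=100)
buffer = ExperienceBuffer()
buffer.add(X_t, y_t, select_mean_of_feature(X_t, y_t, budget_per_class=5))
X_replay, y_replay = buffer.sample()
# While training on task t+1, the replayed nodes would be fed through the GNN
# alongside the new task's nodes, contributing an auxiliary replay loss term.
```

The sketch omits the graph structure and the GNN itself; in the paper's setting the selected nodes are replayed through the model's loss when later tasks are learned, which is what mitigates forgetting.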
Citation
Zhou, F., & Cao, C. (2021). Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 5B, pp. 4714–4722). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i5.16602