Learning Continuous Graph Structure with Bilevel Programming for Graph Neural Networks


Abstract

Learning graph structure for graph neural networks (GNNs) is crucial to facilitating GNN-based downstream learning tasks. It is challenging due to the non-differentiable discrete graph structure and the lack of ground truth. In this paper, we address these problems and propose a novel graph structure learning framework for GNNs. First, we directly model the continuous graph structure with dual normalization, which implicitly imposes a sparsity constraint and reduces the influence of noisy edges. Second, we formulate the whole learning process as a bilevel programming problem, where the inner objective is to optimize the GNN given the learned graph, while the outer objective is to optimize the graph structure to minimize the generalization error on the downstream task. Moreover, for the bilevel optimization, we propose an improved Neumann-IFT algorithm to obtain an approximate solution, which is more stable and accurate than existing optimization methods. It also makes the bilevel optimization process memory-efficient and scalable to large graphs. Experiments on node classification and scene graph generation show that our method outperforms related methods, especially on noisy graphs.

Cite (APA)

Hu, M., Chang, H., Ma, B., & Shan, S. (2022). Learning Continuous Graph Structure with Bilevel Programming for Graph Neural Networks. In IJCAI International Joint Conference on Artificial Intelligence (pp. 3057–3063). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2022/424
