Multi-Channel Pooling Graph Neural Networks


Abstract

Graph pooling is a critical operation for downsampling a graph in graph neural networks. Existing coarsening pooling methods (e.g., DiffPool) mostly focus on capturing the global topology by assigning nodes to several coarse clusters, while dropping pooling methods (e.g., SAGPool) try to preserve the local topology by selecting the top-k most representative nodes. However, an effective way to integrate the two, so that both the local and the global topology of a graph are well captured, has been lacking. To address this issue, we propose a Multi-channel Graph Pooling method named MuchPool, which captures the local structure, the global structure, and node features simultaneously during pooling. Specifically, two channels conduct dropping pooling based on the local topology and on node features, respectively, and one channel conducts coarsening pooling. A cross-channel convolution operation is then designed to refine the graph representations of the different channels. Finally, the pooling results are aggregated into the final pooled graph. Extensive experiments on six benchmark datasets demonstrate the superior performance of MuchPool. The code of this work is publicly available on GitHub.
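The three-channel design described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the scoring functions (degree and feature norm), the random projection standing in for a learned assignment matrix, and the simple sum standing in for the paper's cross-channel convolution are all placeholder assumptions chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

def topk_drop_pool(A, X, scores, k):
    """Dropping pooling: keep the top-k scored nodes (SAGPool-style)."""
    idx = np.argsort(scores)[-k:]          # indices of the k highest-scoring nodes
    X_pool = X[idx] * scores[idx, None]    # gate kept features by their scores
    A_pool = A[np.ix_(idx, idx)]           # induced subgraph adjacency
    return A_pool, X_pool

def coarsen_pool(A, X, W, c):
    """Coarsening pooling: soft-assign nodes to c clusters (DiffPool-style)."""
    logits = X @ W                               # (n, c) assignment logits
    S = np.exp(logits - logits.max(1, keepdims=True))
    S /= S.sum(1, keepdims=True)                 # row-softmax assignment matrix
    return S.T @ A @ S, S.T @ X                  # coarse adjacency and features

n, d, k, c = 8, 4, 3, 3
A = rng.integers(0, 2, (n, n)); A = np.triu(A, 1); A = A + A.T  # symmetric adjacency
X = rng.standard_normal((n, d))                                  # node features

# Channel 1: local-topology dropping pooling, scored here by node degree.
A1, X1 = topk_drop_pool(A, X, A.sum(1).astype(float), k)
# Channel 2: feature-based dropping pooling, scored here by feature norm.
A2, X2 = topk_drop_pool(A, X, np.linalg.norm(X, axis=1), k)
# Channel 3: coarsening pooling with a random projection (placeholder for a learned W).
A3, X3 = coarsen_pool(A, X, rng.standard_normal((d, c)), c)

# Naive fusion (placeholder for the paper's cross-channel convolution):
# the pooled feature matrices are summed, which works because k == c here.
X_out = X1 + X2 + X3
print(X_out.shape)  # (3, 4)
```

In the actual method the assignment matrix is learned and the channels are refined by cross-channel convolution before aggregation; the sketch only shows how dropping and coarsening pooling produce same-sized outputs that can be combined.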

Citation (APA)
Du, J., Wang, S., Miao, H., & Zhang, J. (2021). Multi-Channel Pooling Graph Neural Networks. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1442–1448). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/199
