Spectral Graph Attention Network with Fast Eigen-approximation

20 citations · 15 Mendeley readers

Abstract

Variants of Graph Neural Networks (GNNs) for representation learning have been proposed recently and have achieved fruitful results in various fields. Among them, Graph Attention Network (GAT) was the first to employ a self-attention strategy to learn attention weights for each edge in the spatial domain. However, learning attention weights over edges focuses only on the local information of graphs and greatly increases the computational cost. In this paper, we first introduce the attention mechanism in the spectral domain of graphs and present Spectral Graph Attention Network (SpGAT), which learns representations for different frequency components with respect to weighted filters and graph wavelet bases. In this way, SpGAT can better capture global patterns of graphs in an efficient manner, with far fewer learned parameters than GAT. Further, to reduce the computational cost that eigen-decomposition brings to SpGAT, we propose a fast approximation variant, SpGAT-Cheby. We thoroughly evaluate the performance of SpGAT and SpGAT-Cheby on semi-supervised node classification tasks and verify the effectiveness of the learned attentions in the spectral domain.
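To make the core idea concrete, the following is a minimal NumPy sketch of spectral attention over frequency components. It is an illustrative assumption, not the paper's implementation: it uses the plain graph Fourier basis (eigenvectors of the normalized Laplacian) rather than the graph wavelet bases described in the abstract, splits the spectrum into a "low" and a "high" band, filters each band with its own weight matrix, and combines the bands with learned attention weights. All names (`spectral_attention_layer`, `k_low`, `alpha`) are hypothetical.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def spectral_attention_layer(A, X, W_low, W_high, alpha, k_low):
    """Hypothetical sketch of one spectral-attention step (not the
    authors' code): project node features X onto low- and high-frequency
    eigenvectors of the Laplacian, transform each band with its own
    weight matrix, and mix the two bands with softmaxed attention
    weights alpha = (a_low, a_high)."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)              # spectral (Fourier) basis
    U_low, U_high = U[:, :k_low], U[:, k_low:]
    # filter each frequency band, then map back to the node domain
    H_low = U_low @ (U_low.T @ X) @ W_low
    H_high = U_high @ (U_high.T @ X) @ W_high
    a = np.exp(alpha) / np.exp(alpha).sum() # attention over the two bands
    return a[0] * H_low + a[1] * H_high
```

Note that the two bands' projectors sum to the identity, so with identity filters and equal attention the layer simply halves the input; the SpGAT-Cheby variant would further replace the explicit `eigh` call with a Chebyshev polynomial approximation of the band filters to avoid the O(n³) eigen-decomposition.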

Citation (APA)

Chang, H., Rong, Y., Xu, T., Huang, W., Sojoudi, S., Huang, J., & Zhu, W. (2021). Spectral Graph Attention Network with Fast Eigen-approximation. In International Conference on Information and Knowledge Management, Proceedings (pp. 2905–2909). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482187
