Abstract
Transformer-based graph neural networks have achieved notable success across various domains by using the self-attention mechanism for message passing. However, conventional methods focus solely on inter-node interactions and overlook the varying significance of intra-node representations. To overcome this limitation, we propose DAG (Dual Attention Graph), a novel approach that integrates both intra-node and inter-node dynamics for node classification. By modeling the information exchange between nodes through dual branches, DAG provides a holistic view of information propagation within graphs and enhances the interpretability of graph-based machine learning applications. Experimental evaluations demonstrate that DAG excels at node classification, outperforming current benchmark models across ten datasets.
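The abstract only names the two branches, so the following is a minimal, hypothetical sketch of how a dual-attention layer of this kind could be organized: an inter-node branch (adjacency-masked self-attention over nodes) combined with an intra-node branch (a gate over each node's own feature channels). The class name `DualAttentionLayer` and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualAttentionLayer(nn.Module):
    """Hypothetical dual-attention block: inter-node self-attention
    plus intra-node channel gating (illustrative sketch only)."""

    def __init__(self, dim):
        super().__init__()
        # Inter-node branch: standard scaled dot-product attention.
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Intra-node branch: per-node gate over feature channels.
        self.channel_gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())
        self.out = nn.Linear(2 * dim, dim)

    def forward(self, x, adj):
        # x: (N, dim) node features; adj: (N, N) adjacency with self-loops.
        scores = self.q(x) @ self.k(x).T / x.size(-1) ** 0.5
        scores = scores.masked_fill(adj == 0, float("-inf"))
        inter = F.softmax(scores, dim=-1) @ self.v(x)  # inter-node messages
        intra = self.channel_gate(x) * x               # intra-node reweighting
        return self.out(torch.cat([inter, intra], dim=-1))

# Usage: 5 nodes with 8-dim features and a self-looped random adjacency.
x = torch.randn(5, 8)
adj = ((torch.eye(5) + torch.bernoulli(torch.full((5, 5), 0.3))) > 0).float()
print(DualAttentionLayer(8)(x, adj).shape)  # torch.Size([5, 8])
```

Concatenating the two branch outputs before the final projection is just one plausible fusion choice; the paper itself should be consulted for how the branches are actually combined.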
Citation
Lin, S., Hong, J., Lang, B., & Huang, L. (2023). DAG: Dual Attention Graph Representation Learning for Node Classification. Mathematics, 11(17). https://doi.org/10.3390/math11173691