Graph Relational Topic Model with Higher-order Graph Attention Auto-encoders

9 Citations · 55 Readers (Mendeley)

Abstract

Learning low-dimensional representations is a crucial task for documents linked in network structures. Relational Topic Models (RTMs) have shown their strength in modeling both document contents and relations to discover latent topic representations. However, these methods largely ignore higher-order correlation structure among documents. We therefore propose a novel Graph Relational Topic Model (GRTM) for document networks that fully explores and mixes neighborhood information of documents at each order, based on a Higher-order Graph Attention Network (HGAT) with a log-normal prior in the graph attention. The HGAT probabilistic encoder addresses the aforementioned issue by propagating information between documents, learning efficient networked-document representations in the latent topic space that reflect both document contents and document connections. Experiments on several real-world document network datasets show that, by fully exploiting information in documents and their network, our model achieves better performance on unsupervised representation learning and outperforms existing competitive methods on various downstream tasks.
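The core idea in the abstract — mixing neighborhood information at each order via attention — can be illustrated with a minimal sketch. The code below is a generic multi-hop, dot-product attention aggregation over powers of the adjacency matrix; it is an assumption-laden simplification, not the authors' HGAT encoder, and it omits the paper's log-normal attention prior and variational machinery entirely.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def higher_order_attention(X, A, orders=2):
    """Generic sketch of higher-order neighborhood mixing.

    X : (n, d) node/document features
    A : (n, n) adjacency matrix
    For each order k = 1..orders, attend over k-hop neighbors
    (reachable via A^k) and average the per-order mixtures.
    NOT the paper's HGAT: attention here is plain dot-product
    similarity, with no learned weights or log-normal prior.
    """
    n, _ = X.shape
    outputs = []
    Ak = np.eye(n)
    for _ in range(orders):
        Ak = Ak @ A                      # k-hop reachability counts
        Z = np.zeros_like(X)
        for i in range(n):
            nbrs = np.nonzero(Ak[i])[0]
            if len(nbrs) == 0:           # isolated at this order: keep self
                Z[i] = X[i]
                continue
            scores = X[nbrs] @ X[i]      # similarity to each neighbor
            alpha = softmax(scores)      # attention weights over neighbors
            Z[i] = alpha @ X[nbrs]       # attention-weighted mixture
        outputs.append(Z)
    return np.mean(outputs, axis=0)      # combine information from all orders
```

In the paper, such per-order mixtures feed a probabilistic encoder that outputs latent topic representations; this sketch only shows the structural point that each order contributes its own attention-weighted neighborhood summary.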

Citation (APA)

Xie, Q., Huang, J., Du, P., & Peng, M. (2021). Graph Relational Topic Model with Higher-order Graph Attention Auto-encoders. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 2604–2613). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.230
