Sequential recommendation through graph neural networks and transformer encoder with degree encoding

Abstract

Predicting users’ next behavior by learning their preferences from their historical behaviors is known as sequential recommendation. In this task, it is crucial to learn a sequence representation by modeling the pairwise relationships between items in the sequence, thereby capturing their long-range dependencies. In this paper, we propose a novel deep neural network named graph convolutional network transformer recommender (GCNTRec). GCNTRec learns effective item representations from a user’s historical behavior sequence: it extracts correlations between the target node and multi-layer neighbor nodes on graphs constructed from heterogeneous information networks, in an end-to-end fashion, through a graph convolutional network (GCN) with degree encoding, while capturing long-range dependencies among items in the sequence through a transformer encoder. Using this multi-dimensional vector representation, items related to a user’s historical behavior sequence can be readily predicted. We empirically evaluated GCNTRec on multiple public datasets. The experimental results show that our approach effectively predicts subsequent relevant items and outperforms previous techniques.
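
To make the described architecture concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' code) of the two components the abstract names: item embeddings augmented with a learned degree embedding and propagated through a single GCN layer, followed by a standard transformer encoder over the behavior sequence. All names, dimensions, and the single-layer GCN are illustrative assumptions rather than details from the paper.

# Hypothetical sketch of a GCN-with-degree-encoding + transformer-encoder recommender.
# Assumed details (not from the paper): embedding size, a single GCN layer, a dense
# normalized adjacency matrix, and next-item prediction from the last position.
import torch
import torch.nn as nn


class GCNTRecSketch(nn.Module):
    def __init__(self, num_items, dim=64, max_degree=50, num_heads=4, num_layers=2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim)
        # Degree encoding: each node's degree indexes a learned embedding that is
        # added to its feature vector before graph convolution.
        self.degree_emb = nn.Embedding(max_degree + 1, dim)
        self.gcn_weight = nn.Linear(dim, dim)  # single illustrative GCN layer
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.out = nn.Linear(dim, num_items)

    def forward(self, seq, adj):
        # seq: (batch, seq_len) item ids; adj: (num_items, num_items) normalized adjacency
        x = self.item_emb.weight                                  # (num_items, dim)
        deg = adj.sum(dim=1).clamp(max=self.degree_emb.num_embeddings - 1).long()
        x = x + self.degree_emb(deg)                              # inject degree information
        x = torch.relu(self.gcn_weight(adj @ x))                  # one round of neighborhood aggregation
        h = x[seq]                                                # per-position item representations
        h = self.encoder(h)                                       # transformer over the behavior sequence
        return self.out(h[:, -1])                                 # scores for the next item

In this sketch the graph convolution refines item representations with structural (degree-aware) information, and the transformer encoder then models dependencies across the whole interaction sequence, mirroring the division of labor described in the abstract.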

Cite

APA: Wang, S., Li, X., Kou, X., Zhang, J., Zheng, S., Wang, J., & Gong, J. (2021). Sequential recommendation through graph neural networks and transformer encoder with degree encoding. Algorithms, 14(9). https://doi.org/10.3390/a14090263
