A transformer–convolution model for enhanced session-based recommendation

Session-based recommendation aims to predict a user's next action from an anonymous sequence of interactions and plays an essential role in various online applications, such as e-commerce and music services. Recently, transformer-based models have obtained results that are competitive with or even surpass those of recurrent neural networks, owing to the strength of transformers in capturing long-distance dependencies. However, a transformer has a limited ability to mine local contextual information, which can be regarded as collective features. Researchers have sought to address this limitation by augmenting contextual transitions to boost session representation learning. Accordingly, in this paper, we enhance the capabilities of a transformer in the session-based recommendation task by introducing convolutional neural networks (CNNs) at the stage of aggregating item features with long- and short-distance dependencies. We first borrow the self-attention module from the classic transformer model to explore long-distance dependencies. We then propose horizontal and vertical convolutions to enhance local collective information, and obtain a session representation by integrating the two types of features. Extensive experiments on real-world datasets show that our method outperforms those that rely on a transformer or a CNN alone.
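The architecture sketched in the abstract can be illustrated with a minimal numpy toy: single-head self-attention over a session's item embeddings captures long-distance dependencies, while Caser-style horizontal convolutions (sliding over windows of consecutive items) and a vertical convolution (a learned weighted sum over items per embedding dimension) capture local collective features; the two feature types are then concatenated into a session representation. All shapes, weights, and the fusion-by-concatenation step below are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

rng = np.random.default_rng(0)
L, d = 5, 8                      # session length, embedding size (toy values)
E = rng.normal(size=(L, d))      # item embeddings for one anonymous session

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# --- Self-attention: long-distance dependencies across all items ---
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
Q, K, V = E @ Wq, E @ Wk, E @ Wv
attn_out = softmax(Q @ K.T / np.sqrt(d)) @ V          # (L, d)

# --- Horizontal convolution: window of h consecutive items, full width d ---
h, n_filters = 2, 4
Wh = rng.normal(size=(n_filters, h, d)) * 0.1
h_feats = np.array([
    max(np.maximum(0.0, (E[i:i + h] * Wh[f]).sum())   # ReLU activation
        for i in range(L - h + 1))                    # max-pool over positions
    for f in range(n_filters)
])                                                    # (n_filters,)

# --- Vertical convolution: weighted sum over the L items, per dimension ---
wv = rng.normal(size=(L,)) * 0.1
v_feats = wv @ E                                      # (d,)

# --- Session representation: fuse long- and short-distance features ---
session_repr = np.concatenate([attn_out[-1], h_feats, v_feats])
print(session_repr.shape)                             # (d + n_filters + d,)
```

In this sketch the last attention position stands in for the session query; a real model would stack layers, add positional encodings, and score candidate items against `session_repr`.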




Wang, J., Xie, H., Wang, F. L., & Lee, L. K. (2023). A transformer–convolution model for enhanced session-based recommendation. Neurocomputing, 531, 21–33. https://doi.org/10.1016/j.neucom.2023.01.083
