Dual Contrastive Network for Sequential Recommendation

Abstract

Widely applied in today's recommender systems, sequential recommendation predicts the next item a given user will interact with from his/her historical item sequence. However, like most recommenders, sequential recommendation suffers from the data sparsity issue. To extract auxiliary signals from the data, some recent works exploit self-supervised learning to generate augmented data via a dropout strategy, which, however, leads to even sparser sequential data and obscured signals. In this paper, we propose the Dual Contrastive Network (DCN) to boost sequential recommendation from a new perspective: integrating the auxiliary user sequence for items. Specifically, we propose two kinds of contrastive learning. The first is dual representation contrastive learning, which minimizes the distances between the embeddings and the sequence-representations of users/items. The second is dual interest contrastive learning, which self-supervises the static interest with the dynamic interest of next-item prediction via auxiliary training. We also incorporate the auxiliary task of predicting the next user for a given item's historical user sequence, which can capture the trends of items preferred by certain types of users. Experiments on benchmark datasets verify the effectiveness of our proposed method. A further ablation study also illustrates the boosting effect of the proposed components upon different sequential models.
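
As an illustration of the first component, the sketch below (PyTorch) shows one plausible way to realize the dual representation contrastive learning described above: an in-batch InfoNCE-style loss that pulls a user's/item's ID embedding toward its own sequence representation. All names, dimensions, the temperature, and the loss weights are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (not the authors' code): contrastive alignment between ID
    # embeddings and sequence-derived representations, for users or items.
    import torch
    import torch.nn.functional as F

    def representation_contrastive_loss(id_emb: torch.Tensor,
                                        seq_repr: torch.Tensor,
                                        temperature: float = 0.1) -> torch.Tensor:
        """Pull each ID embedding toward its own sequence representation and
        push it away from the other in-batch representations.

        id_emb:   [batch, dim] embeddings looked up from the ID embedding table
        seq_repr: [batch, dim] representations encoded from interaction sequences
        """
        id_emb = F.normalize(id_emb, dim=-1)
        seq_repr = F.normalize(seq_repr, dim=-1)
        logits = id_emb @ seq_repr.t() / temperature        # [batch, batch]
        labels = torch.arange(id_emb.size(0), device=id_emb.device)
        # Symmetric InfoNCE: each row's positive is its own diagonal entry.
        return 0.5 * (F.cross_entropy(logits, labels) +
                      F.cross_entropy(logits.t(), labels))

    # Usage (weights are assumptions): add the user-side and item-side losses to
    # the main next-item and auxiliary next-user prediction objectives, e.g.
    # loss = rec_loss + 0.1 * representation_contrastive_loss(user_emb, user_seq_repr) \
    #                 + 0.1 * representation_contrastive_loss(item_emb, item_seq_repr)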

Citation (APA)
Lin, G., Gao, C., Li, Y., Zheng, Y., Li, Z., Jin, D., & Li, Y. (2022). Dual Contrastive Network for Sequential Recommendation. In SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 2686–2691). Association for Computing Machinery, Inc. https://doi.org/10.1145/3477495.3531918
