Improving Conversational Recommender Systems via Transformer-based Sequential Modelling

Abstract

In Conversational Recommender Systems (CRSs), conversations usually involve a set of related items and entities, e.g., attributes of items. These items and entities are mentioned in an order that follows the development of the dialogue; in other words, potential sequential dependencies exist in conversations. However, most existing CRSs neglect these sequential dependencies. In this paper, we propose a Transformer-based sequential conversational recommendation method, named TSCR, which models the sequential dependencies in conversations to improve CRS. We represent conversations by their items and entities, and construct user sequences to discover user preferences by considering both the mentioned items and the mentioned entities. Based on the constructed sequences, we deploy a Cloze task to predict the recommended items along a sequence. Experimental results demonstrate that our TSCR model significantly outperforms state-of-the-art baselines.
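To make the Cloze-style idea concrete, the sketch below shows one plausible way to mask positions in an item/entity sequence and let a bidirectional Transformer encoder predict the masked item, in the spirit of BERT4Rec-style training. This is a minimal illustration, not the authors' implementation: the class name, hyperparameters, and the way conversation turns are mapped to ID sequences are all assumptions for the example.

```python
import torch
import torch.nn as nn


class ClozeSeqRecommender(nn.Module):
    """Hypothetical Cloze-style sequential recommender sketch.

    A conversation is assumed to be flattened into a sequence of item/entity
    IDs; some positions are replaced by a [MASK] ID and the model is trained
    to recover the masked items via bidirectional self-attention.
    """

    def __init__(self, vocab_size, max_len=50, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.mask_id = vocab_size                      # reserve one extra ID for [MASK]
        self.tok_emb = nn.Embedding(vocab_size + 1, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.out = nn.Linear(d_model, vocab_size)      # scores over catalogue items

    def forward(self, seq):
        # seq: (batch, seq_len) of item/entity IDs, possibly containing mask_id
        pos = torch.arange(seq.size(1), device=seq.device).unsqueeze(0)
        h = self.tok_emb(seq) + self.pos_emb(pos)
        h = self.encoder(h)                            # bidirectional context
        return self.out(h)                             # (batch, seq_len, vocab_size)


# Toy usage: mask the final slot of two fake conversation sequences and rank
# candidate items for that position.
model = ClozeSeqRecommender(vocab_size=1000)
seq = torch.randint(0, 1000, (2, 10))                  # fabricated ID sequences
seq[:, -1] = model.mask_id                             # Cloze mask at the recommendation slot
logits = model(seq)                                    # (2, 10, 1000)
top_items = logits[:, -1].topk(5).indices              # top-5 recommendations per sequence
```

In practice, the paper additionally constructs the sequences from both items and entities mentioned in the dialogue; the sketch above only illustrates the masked-prediction mechanism over an already-built ID sequence.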

Citation (APA)
Zou, J., Kanoulas, E., Ren, P., Ren, Z., Sun, A., & Long, C. (2022). Improving Conversational Recommender Systems via Transformer-based Sequential Modelling. In SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 2319–2324). Association for Computing Machinery, Inc. https://doi.org/10.1145/3477495.3531852
