Recurrent Translation-Based Network for Top-N Sparse Sequential Recommendation

7 citations · 7 Mendeley readers

This article is free to access.

Abstract

Fulfilling users' needs and increasing the retention rate of recommendation systems are challenging, because in most systems the majority of users have consumed only a few items. Translation-based models perform well on sparse datasets, but they consider only the user and the single most recent item when suggesting the next item. Recurrent neural networks, in contrast, exploit sequential dependencies but perform poorly on sparse datasets. We unify both approaches and propose the Recurrent Translation-based Network (RTN). RTN exploits the full sequence of a user's consumed items without limiting item interactions to the most recent one. Experiments on real-world datasets show that RTN outperforms other state-of-the-art approaches on sparse datasets.
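To illustrate the translation-based component the abstract refers to, here is a minimal sketch of TransRec-style next-item scoring: the next item is expected to lie near the previous item's embedding plus a per-user translation vector. All names, dimensions, and the random embeddings are hypothetical placeholders, not the paper's actual model, which additionally adds a recurrent component over the whole consumption sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

n_users, n_items, dim = 5, 10, 8
user_t = rng.normal(size=(n_users, dim))  # hypothetical per-user translation vectors
item_e = rng.normal(size=(n_items, dim))  # hypothetical item embeddings

def score_next_items(user: int, prev_item: int) -> np.ndarray:
    """Translation-based scoring: a candidate next item scores higher
    the closer its embedding is to (prev_item embedding + user translation)."""
    target = item_e[prev_item] + user_t[user]
    dists = np.linalg.norm(item_e - target, axis=1)
    return -dists  # negate so that higher score = closer

def top_n(user: int, prev_item: int, n: int = 3) -> list[int]:
    """Return the n best-scoring candidate next items for this user."""
    return np.argsort(-score_next_items(user, prev_item))[:n].tolist()

recs = top_n(user=0, prev_item=2, n=3)
```

The limitation the paper addresses is visible here: only `prev_item` influences the target point, so all earlier items in the user's history are ignored; RTN replaces that single-item dependence with a recurrent summary of the sequence.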

Citation (APA)

Chairatanakul, N., Murata, T., & Liu, X. (2019). Recurrent Translation-Based Network for Top-N Sparse Sequential Recommendation. IEEE Access, 7, 131567–131576. https://doi.org/10.1109/ACCESS.2019.2941083
