Fulfilling users' needs and increasing retention in recommender systems is challenging: in most systems, the majority of users have consumed only a few items. Translation-based models perform well on such sparse datasets, but they condition a user's next-item recommendation on only the single most recent item. Recurrent neural networks, by contrast, exploit full sequential dependencies but perform poorly on sparse datasets. We unify both approaches and propose the Recurrent Translation-based Network (RTN), which utilizes the full sequence of a user's consumed items rather than limiting item interactions to the most recent one. Experiments on real-world datasets show that RTN outperforms other state-of-the-art approaches on sparse datasets.
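To illustrate the translation-based idea the abstract builds on (scoring a candidate next item by how close it lies to the previous item's embedding plus a user-specific translation vector), here is a minimal sketch. This is not the authors' RTN model; the embedding dimensions and randomly initialized parameters are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_users, n_items, dim = 3, 5, 4
item_emb = rng.normal(size=(n_items, dim))    # one embedding per item (illustrative)
user_trans = rng.normal(size=(n_users, dim))  # per-user translation vector (illustrative)

def score_next_items(user, prev_item):
    """Translation-based scoring: a good next item lies near
    item_emb[prev_item] + user_trans[user]; higher score = closer."""
    target = item_emb[prev_item] + user_trans[user]
    return -np.linalg.norm(item_emb - target, axis=1)

def top_n(user, prev_item, n=3):
    """Rank all items by score and return the top-n item indices."""
    return np.argsort(-score_next_items(user, prev_item))[:n].tolist()
```

Note that the ranking here depends only on the single previous item, which is exactly the limitation the abstract attributes to translation-based models; RTN is described as replacing that single-item dependence with the user's full consumption sequence.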
CITATION STYLE
Chairatanakul, N., Murata, T., & Liu, X. (2019). Recurrent Translation-Based Network for Top-N Sparse Sequential Recommendation. IEEE Access, 7, 131567–131576. https://doi.org/10.1109/ACCESS.2019.2941083