Hierarchical context enabled recurrent neural network for recommendation

Citations: 26
Mendeley readers: 47

Abstract

A long user history inevitably reflects how personal interests shift over time. Analyzing such a history requires a robust sequential model that can anticipate both the transitions and the decays of user interests. User histories are often modeled with RNNs, but RNN structures in recommendation systems still suffer from long-term dependency problems and interest drift. To address these challenges, we propose HCRNN, a recurrent model with three hierarchical contexts capturing global, local, and temporary interests. The structure is designed to retain the global long-term interests of users, to reflect local sub-sequence interests, and to attend to the temporary interest of each transition. In addition, we propose a hierarchical context-based gate structure that incorporates our interest drift assumption. Because HCRNN is a new RNN structure, we complement it with a bi-channel attention mechanism that exploits the hierarchical context. We evaluated the proposed structure on sequential recommendation tasks with CiteULike, MovieLens, and LastFM, and our model achieved the best performance on the sequential recommendations.
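The abstract only sketches the architecture. As a rough illustration (not the authors' actual equations or parameterization), the PyTorch-style snippet below shows one way a recurrent cell could maintain separate global, local, and temporary contexts and blend them with a context-based gate; all layer names, update rules, and sizes are assumptions made for this sketch.

# Illustrative sketch only, not the HCRNN equations from the paper:
# a recurrent cell keeping a slowly-updated global context, a local
# (sub-sequence) context, and a temporary context from the current item,
# mixed by a context-based gate. Names and sizes are assumptions.
import torch
import torch.nn as nn

class HierarchicalContextCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.temp_proj = nn.Linear(input_size, hidden_size)         # temporary interest from current item
        self.local_gru = nn.GRUCell(input_size, hidden_size)        # local sub-sequence interest
        self.global_gate = nn.Linear(2 * hidden_size, hidden_size)  # how much global context to update
        self.mix_gate = nn.Linear(3 * hidden_size, hidden_size)     # context-based gate over the three contexts
        self.out = nn.Linear(hidden_size, hidden_size)

    def forward(self, x, local_h, global_c):
        # Temporary context: a direct, per-step view of the current item.
        temp_c = torch.tanh(self.temp_proj(x))
        # Local context: recurrent update over the recent sub-sequence.
        local_h = self.local_gru(x, local_h)
        # Global context: gated, slow-moving summary of long-term interest.
        g = torch.sigmoid(self.global_gate(torch.cat([local_h, global_c], dim=-1)))
        global_c = g * global_c + (1.0 - g) * local_h
        # Context-based gate: blend the three interest levels into one state.
        m = torch.sigmoid(self.mix_gate(torch.cat([temp_c, local_h, global_c], dim=-1)))
        h = torch.tanh(self.out(m * local_h + (1.0 - m) * global_c + temp_c))
        return h, local_h, global_c

# Minimal usage: one step over a batch of item embeddings.
cell = HierarchicalContextCell(input_size=64, hidden_size=128)
x = torch.randn(32, 64)
local_h = torch.zeros(32, 128)
global_c = torch.zeros(32, 128)
h, local_h, global_c = cell(x, local_h, global_c)

The output state h would then feed the next-item prediction layer; the paper's bi-channel attention over the hierarchical contexts is not reproduced here.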

Cite

APA

Song, K., Ji, M., Park, S., & Moon, I. C. (2019). Hierarchical context enabled recurrent neural network for recommendation. In 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019 (pp. 4983–4991). AAAI Press. https://doi.org/10.1609/aaai.v33i01.33014983
