A Session-Based Customer Preference Learning Method by Using the Gated Recurrent Units with Attention Function

This article is free to access.

Abstract

In this paper, we investigate an attention function combined with gated recurrent units (GRUs), named GRUA, to improve the accuracy of customer preference prediction. The attention function extracts important product features by using a time-bias parameter and a term frequency-inverse document frequency (TF-IDF) parameter to recommend products to a customer in the ongoing session. We show that the attention function with the GRUs learns the customer's intention in the ongoing session more precisely than existing session-based recommendation (SBR) methods. The experimental results show that GRUA outperforms two SBR methods, stacked denoising autoencoders with collaborative filtering (SDAE/CF) and GRUs with collaborative filtering (GRU/CF), on the precision and recall evaluation metrics. Three publicly available datasets, the Amazon Product Review dataset, the Xing dataset, and the YooChoose Click dataset, are used to compare the performance of GRUA against SDAE/CF and GRU/CF. This paper shows that adopting the attention function into the GRUs can dramatically increase the accuracy of product recommendation in SBR.
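The abstract's core idea — encode the session clicks with a GRU, then weight the hidden states by an attention function that mixes a learned score with the item's TF-IDF weight and a time bias favouring recent clicks — can be sketched as below. This is a minimal NumPy illustration, not the paper's implementation: the exponential time-bias form, the log-TF-IDF term, and all names (`gru_step`, `session_repr`, `v`, `beta`) are assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gru_step(x, h, W, U):
    """One standard GRU step; W and U hold the z/r/h weight matrices (biases omitted)."""
    z = sigmoid(W["z"] @ x + U["z"] @ h)              # update gate
    r = sigmoid(W["r"] @ x + U["r"] @ h)              # reset gate
    h_tilde = np.tanh(W["h"] @ x + U["h"] @ (r * h))  # candidate state
    return (1.0 - z) * h + z * h_tilde

def session_repr(items, tfidf, W, U, v, beta=0.5):
    """Encode a session with a GRU, then attend over its hidden states.

    Attention logits combine a learned score (v . h_t) with the item's
    TF-IDF weight and a linear-in-log time bias that is 0 for the latest
    click (both functional forms are illustrative assumptions).
    """
    d = U["z"].shape[0]
    h = np.zeros(d)
    states = []
    for x in items:
        h = gru_step(x, h, W, U)
        states.append(h)
    H = np.stack(states)                               # (T, d) hidden states
    T = len(items)
    time_bias = beta * (np.arange(T) - (T - 1))        # recent clicks favoured
    logits = H @ v + np.log(np.asarray(tfidf) + 1e-9) + time_bias
    alpha = softmax(logits)                            # attention weights
    return alpha, alpha @ H                            # weights, session vector

# toy session: 4 item embeddings with dimension 8 = hidden size
d = 8
items = [rng.normal(size=d) for _ in range(4)]
tfidf = [0.2, 0.9, 0.4, 0.7]                           # per-item TF-IDF weights
W = {k: rng.normal(scale=0.3, size=(d, d)) for k in "zrh"}
U = {k: rng.normal(scale=0.3, size=(d, d)) for k in "zrh"}
v = rng.normal(size=d)                                 # learned attention vector
alpha, s = session_repr(items, tfidf, W, U, v)
print(alpha.shape, s.shape)
```

The resulting session vector `s` would then be scored against candidate item embeddings to rank recommendations; the paper compares the ranked lists by precision and recall.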

Citation (APA)
Chen, J., & Abdul, A. (2019). A Session-Based Customer Preference Learning Method by Using the Gated Recurrent Units with Attention Function. IEEE Access, 7, 17750–17759. https://doi.org/10.1109/ACCESS.2019.2895647
