Neural networks have recently become popular in recommender systems for extracting user and item representations. Most previous works follow a two-branch setting, where a user network and an item network learn user and item representations in the first and second branches, respectively. In the item cold-start problem, where no usage patterns exist for the new items, the user network takes the ID/interaction vector as input and the item network takes the item side information (content) as input. In this paper, we show that with this structure, two representations are learned for each item in the training set: one is the output of the item network, and the other is hidden inside the user network and is used to learn user representations. Learning two representations makes training slower and optimization more difficult. We propose to unify the two representations and use only the one generated by the item network. We also show how attention mechanisms fit into our setting and how they can improve the quality of the representations. Our results on public and real-world datasets show that, compared to previous works, our approach converges faster, achieves higher recall in fewer iterations, and is more robust to changes in the number of training samples.
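The core idea above, building the user representation from the item network's outputs for the user's interacted items rather than from a separate user-side item embedding table, can be sketched as follows. This is a minimal illustration under assumed details: the single linear item network, the mean-pooled attention query, and all dimensions are hypothetical choices for demonstration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

n_items, content_dim, rep_dim = 5, 8, 4

# Item side information (content) for each item in the catalog.
item_content = rng.normal(size=(n_items, content_dim))

# Hypothetical item network: a single linear map from content to representation.
W_item = rng.normal(size=(content_dim, rep_dim))

def item_rep(content):
    """Item network output: the single, shared item representation."""
    return content @ W_item

def user_rep(interacted_ids):
    """User representation built by aggregating item-network outputs.

    Instead of learning a second, hidden item representation inside the
    user network, the user vector is an attention-weighted sum of the
    shared item representations of the interacted items.
    """
    reps = item_rep(item_content[interacted_ids])
    # Toy attention: dot-product scores against a mean-pooled query
    # (the actual attention parameterization would be learned).
    q = reps.mean(axis=0)
    scores = reps @ q
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ reps

u = user_rep([0, 2, 3])
print(u.shape)  # (4,)
```

Because every gradient that shapes the user representation also flows through `W_item`, there is only one set of item parameters to optimize, which is the source of the faster convergence the abstract claims.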
Raziperchikolaei, R., Liang, G., & Chung, Y. J. (2021). Shared neural item representations for completely cold start problem. In RecSys 2021 - 15th ACM Conference on Recommender Systems (pp. 422–431). Association for Computing Machinery, Inc. https://doi.org/10.1145/3460231.3474228