Sequential recommender systems aim to model users' evolving interests from their historical behaviors, and hence make customized, time-relevant recommendations. Compared with traditional models, deep learning approaches such as CNNs and RNNs have achieved remarkable advancements in recommendation tasks. Recently, the BERT framework has also emerged as a promising method, benefiting from its self-attention mechanism for processing sequential data. However, one limitation of the original BERT framework is that it considers only a single input source, the natural language tokens. It remains an open question how to leverage various types of information under the BERT framework. Nonetheless, it is intuitively appealing to utilize other side information, such as item category or tag, for more comprehensive depictions and better recommendations. In our pilot experiments, we found that naive approaches, which directly fuse various types of side information into the item embeddings, usually bring little or even negative benefit. Therefore, in this paper, we propose the NOninVasive self-Attention mechanism (NOVA) to leverage side information effectively under the BERT framework. NOVA uses side information to generate a better attention distribution, rather than directly altering the item embeddings, which may overwhelm the original item information. We validate the NOVA-BERT model on both public and commercial datasets, and our method consistently outperforms state-of-the-art models with negligible computational overhead.
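The core idea can be illustrated in code: side information influences only the attention weights (queries and keys), while the values, and hence the updated item representations, are computed from the pure item embeddings. Below is a minimal sketch in PyTorch, assuming a simple additive fusion of item and side-information embeddings; the class and parameter names are illustrative, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class NonInvasiveSelfAttention(nn.Module):
    """Sketch of non-invasive attention: side info steers attention,
    but never enters the value stream (assumed single-head, additive fusion)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.scale = d_model ** 0.5

    def forward(self, item_emb: torch.Tensor, side_emb: torch.Tensor) -> torch.Tensor:
        # item_emb, side_emb: (batch, seq_len, d_model)
        fused = item_emb + side_emb                 # fused view feeds Q and K only
        q = self.w_q(fused)
        k = self.w_k(fused)
        v = self.w_v(item_emb)                      # values stay "non-invaded"
        attn = F.softmax(q @ k.transpose(-2, -1) / self.scale, dim=-1)
        return attn @ v                             # item representations reweighted by
                                                    # side-information-aware attention

By contrast, the "invasive" baseline would feed the fused embedding into all three projections, which is the naive fusion the abstract reports as bringing little or even negative benefit.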
Citation:
Liu, C., Li, X., Cai, G., Dong, Z., Zhu, H., & Shang, L. (2021). Non-invasive Self-attention for Side Information Fusion in Sequential Recommendation. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 5B, pp. 4249–4256). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i5.16549