Personalized recommender systems play an increasingly important role in online services. Graph Neural Network (GNN) based recommender models have demonstrated a superior capability to model users' interests thanks to the rich relational information encoded in graphs. However, with the ever-growing volume of online information and the high computational complexity of training GNNs, it is difficult to perform frequent model updates to provide the most up-to-date recommendations. There have been several attempts to train GNN models incrementally, enabling faster training and thus more frequent model updates using the latest training data. The main technique is knowledge distillation, which aims to allow model updates while preserving key aspects of the model that were learned from the historical data. In this work, we develop a novel Graph Structure Aware Contrastive Knowledge Distillation framework for incremental learning in recommender systems, tailored to focus on the rich relational information in the recommendation context. We combine the contrastive distillation formulation with intermediate layer distillation to inject layer-level supervision. We demonstrate the effectiveness of the proposed distillation framework for GNN-based recommender systems on four commonly used datasets, showing consistent improvement over state-of-the-art alternatives.
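The abstract does not spell out the loss formulation, so the following is only a minimal PyTorch sketch of the general idea it describes: an InfoNCE-style contrastive distillation term between matched teacher and student node embeddings, combined with an MSE term on intermediate GNN layers for layer-level supervision, added to the usual recommendation loss on new data. The function names, the temperature, and the weights lambda_cd and lambda_ld are illustrative assumptions, not the authors' formulation.

import torch
import torch.nn.functional as F

def contrastive_distillation_loss(student_emb, teacher_emb, temperature=0.1):
    # Assumed InfoNCE-style contrastive distillation (not the paper's exact loss):
    # for each node, the teacher embedding of the same node is the positive;
    # teacher embeddings of other nodes in the batch act as negatives.
    s = F.normalize(student_emb, dim=1)  # (N, d)
    t = F.normalize(teacher_emb, dim=1)  # (N, d)
    logits = s @ t.T / temperature       # (N, N) pairwise similarities
    targets = torch.arange(s.size(0), device=s.device)  # diagonal = positives
    return F.cross_entropy(logits, targets)

def layer_distillation_loss(student_layers, teacher_layers):
    # Intermediate layer distillation: match the embeddings produced at each
    # GNN layer of the frozen teacher with the student's corresponding layer.
    return sum(F.mse_loss(s, t.detach())
               for s, t in zip(student_layers, teacher_layers))

def total_loss(rec_loss, student_layers, teacher_layers,
               lambda_cd=0.5, lambda_ld=0.1):
    # Combine the recommendation loss on the new data with the two
    # distillation terms; the lambda weights are hypothetical.
    cd = contrastive_distillation_loss(student_layers[-1],
                                       teacher_layers[-1].detach())
    ld = layer_distillation_loss(student_layers[:-1], teacher_layers[:-1])
    return rec_loss + lambda_cd * cd + lambda_ld * ld

The teacher outputs are detached throughout, so gradients flow only into the student model being updated incrementally on the latest interactions while the teacher serves as a frozen snapshot of the historical knowledge.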
Citation:
Wang, Y., Zhang, Y., & Coates, M. (2021). Graph Structure Aware Contrastive Knowledge Distillation for Incremental Learning in Recommender Systems. In International Conference on Information and Knowledge Management, Proceedings (pp. 3518–3522). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482117