Data-free Knowledge Distillation for Reusing Recommendation Models

Abstract

A common practice for keeping an offline Recommender System (RS) fresh is to train models that fit users' most recent behaviour and directly replace the outdated historical model. However, considerable feature engineering and computing resources are spent training these historical models, yet they are underutilized in downstream RS model training. In this paper, to turn these historical models into treasures, we introduce a model-inversed data synthesis framework that recovers training data information from the historical model and uses it for knowledge transfer. The framework synthesizes a new form of data from the historical model. Specifically, we 'invert' an off-the-shelf pretrained model to synthesize binary-class user-item pairs starting from random noise, without requiring any additional information from the training dataset. To synthesize informative data from a pretrained model, we propose a new continuous data type instead of the original one-hot or multi-hot vectors. An additional statistical regularization further improves the quality of the synthetic data inverted from deep models with batch normalization. Experimental results show that our framework generalizes across different types of models: we can efficiently train various classical Click-Through-Rate (CTR) prediction models from scratch with significantly less inverted synthetic data (about two orders of magnitude fewer samples). Moreover, our framework also works well in knowledge transfer scenarios such as model retraining and data-free knowledge distillation.
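The abstract describes the core mechanism only at a high level: continuous synthetic inputs are optimized from random noise so that a frozen pretrained model assigns them target click labels, with batch-normalization statistics used as a regularizer. The sketch below illustrates that general idea under our own assumptions; the model, loss weighting, and helper names (SimpleCTRModel, bn_statistics_loss, invert) are hypothetical and are not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class SimpleCTRModel(nn.Module):
    """Stand-in for an off-the-shelf pretrained CTR model: a small MLP with a
    BatchNorm layer acting directly on the continuous input features."""
    def __init__(self, input_dim: int = 32):
        super().__init__()
        self.bn = nn.BatchNorm1d(input_dim)
        self.net = nn.Sequential(nn.Linear(input_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 1))

    def forward(self, x):
        return self.net(self.bn(x)).squeeze(-1)  # logits, shape (N,)


def bn_statistics_loss(bn: nn.BatchNorm1d, x: torch.Tensor) -> torch.Tensor:
    """Match the synthetic batch's feature statistics to the running statistics
    stored in a BatchNorm layer that acts on the input (a full implementation
    would typically hook every BN layer's activations, not just the first)."""
    return (x.mean(0) - bn.running_mean).pow(2).mean() + \
           (x.var(0, unbiased=False) - bn.running_var).pow(2).mean()


def invert(model: SimpleCTRModel, n_samples: int = 256, input_dim: int = 32,
           steps: int = 500, lam: float = 0.1):
    """Synthesize continuous 'user-item' feature vectors from random noise by
    optimizing the inputs against a frozen pretrained model."""
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)

    x = torch.randn(n_samples, input_dim, requires_grad=True)  # start from noise
    y = torch.randint(0, 2, (n_samples,)).float()              # target binary labels
    opt = torch.optim.Adam([x], lr=0.05)
    bce = nn.BCEWithLogitsLoss()

    for _ in range(steps):
        opt.zero_grad()
        loss = bce(model(x), y) + lam * bn_statistics_loss(model.bn, x)
        loss.backward()
        opt.step()
    return x.detach(), y
```

The resulting (x, y) pairs would then stand in for real training data when retraining a downstream CTR model or distilling knowledge from the historical model, which is the data-free setting the paper targets.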

Citation (APA)

Wang, C., Sun, J., Dong, Z., Zhu, J., Li, Z., Li, R., & Zhang, R. (2023). Data-free Knowledge Distillation for Reusing Recommendation Models. In Proceedings of the 17th ACM Conference on Recommender Systems, RecSys 2023 (pp. 386–395). Association for Computing Machinery, Inc. https://doi.org/10.1145/3604915.3608789
