Device-Cloud Collaborative Learning for Recommendation

Abstract

With the rapid growth of storage and computing power on mobile devices, it has become both feasible and popular to deploy models on devices to avoid costly communication latency and to capture real-time features. While many works have explored on-device learning and inference, most focus on response delay or privacy protection; little has been done to model the collaboration between device-side and cloud-side modeling so that both sides benefit jointly. To bridge this gap, we present one of the first studies of the Device-Cloud Collaborative Learning (DCCL) framework. Specifically, we propose a novel MetaPatch learning approach on the device side to efficiently achieve "thousands of people with thousands of models" given a centralized cloud model. Then, with billions of updated personalized device models, we propose a "model-over-models" distillation algorithm, MoMoDistill, to update the centralized cloud model. Extensive experiments on a range of datasets under different settings demonstrate the effectiveness of such collaboration for both the cloud and the devices, especially its superiority in modeling long-tailed users.
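The abstract's two components can be sketched concretely. The idea below is a minimal, illustrative reading (not the paper's implementation): in MetaPatch, each device does not train a full per-device patch over the frozen cloud model but only a small meta-parameter vector that a shared decoder expands into the patch; in MoMoDistill, the cloud treats the resulting personalized models as teachers and folds their patches back into the shared model (here simplified to averaging). All names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "cloud" model: a shared linear scorer (stand-in for a real recommender).
D_IN, D_OUT = 8, 4
W_cloud = rng.normal(size=(D_IN, D_OUT))

# MetaPatch sketch: each device trains only META_DIM meta-parameters;
# a shared decoder expands them into a full patch over the frozen weights,
# cutting per-device trainable parameters from D_IN*D_OUT down to META_DIM.
META_DIM = 3
decoder = rng.normal(size=(META_DIM, D_IN * D_OUT)) * 0.01

def patched_forward(x, meta_params):
    """Device-side forward pass: frozen cloud weights plus a generated patch."""
    delta = (meta_params @ decoder).reshape(D_IN, D_OUT)
    return x @ (W_cloud + delta)

# A personalized device model is fully described by its tiny meta vector.
meta_u = rng.normal(size=META_DIM)
x = rng.normal(size=(2, D_IN))
y_personal = patched_forward(x, meta_u)

# MoMoDistill sketch ("model-over-models"): the cloud aggregates knowledge
# from many personalized device models back into the centralized model;
# real distillation would match teacher outputs, here reduced to averaging
# the decoded patches for illustration.
device_metas = [rng.normal(size=META_DIM) for _ in range(5)]
patches = [(m @ decoder).reshape(D_IN, D_OUT) for m in device_metas]
W_cloud_new = W_cloud + np.mean(patches, axis=0)
print(W_cloud_new.shape)
```

The key efficiency point this illustrates: personalization cost per device scales with the meta-parameter size, not the model size, which is what makes "thousands of people with thousands of models" tractable on-device.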

Citation (APA)

Yao, J., Wang, F., Jia, K., Han, B., Zhou, J., & Yang, H. (2021). Device-Cloud Collaborative Learning for Recommendation. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 3865–3874). Association for Computing Machinery. https://doi.org/10.1145/3447548.3467097
