A Contrastive Sharing Model for Multi-Task Recommendation

Abstract

Multi-Task Learning (MTL) has attracted increasing attention in recommender systems. A crucial challenge in MTL is to learn suitable shared parameters among tasks and to avoid negative transfer of information. The most recent sparse sharing models choose a useful subnet for each task via independent parameter masks, which activate only the parameters useful to that task. However, because all subnets are optimized in parallel and independently for each task, gradient updates to shared parameters may conflict (i.e., the parameter conflict problem). To address this challenge, we propose a novel Contrastive Sharing Recommendation model for MTL (CSRec). Each task in CSRec learns from a subnet selected by an independent parameter mask, as in sparse sharing models, but a carefully designed contrastive mask evaluates the contribution of each parameter to a specific task. A conflicting parameter is then optimized relying more on the task it impacts more strongly. In addition, we adopt an alternating training strategy in CSRec, which allows conflicting parameters to be updated self-adaptively through fair competition. We conduct extensive experiments on three real-world large-scale datasets, i.e., Tencent Kandian, Ali-CCP and Census-income, showing that our model outperforms state-of-the-art methods in both offline and online MTL recommendation scenarios.
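
As an illustration of the sparse-sharing mechanism summarized above, the following minimal Python (PyTorch) sketch gives two tasks independent binary masks over a shared parameter tensor and resolves conflicting gradient updates by weighting each task's gradient with a per-parameter impact score. The tensor shapes, toy losses, impact scores (gradient magnitudes), and the weighting rule are assumptions made for illustration only; they are not the exact CSRec formulation or its contrastive mask.

# Illustrative sketch of sparse sharing with per-task masks and a
# conflict-aware update. The impact scores and weighting below are
# assumptions, not the published CSRec algorithm.
import torch

torch.manual_seed(0)

# Shared parameters and two task-specific binary masks (the per-task subnets).
shared = torch.randn(8, requires_grad=True)
masks = {
    "ctr": (torch.rand(8) > 0.3).float(),
    "cvr": (torch.rand(8) > 0.3).float(),
}

# Toy per-task losses computed on the masked (subnet) parameters.
def task_loss(task):
    subnet = shared * masks[task]
    target = torch.ones_like(subnet) if task == "ctr" else -torch.ones_like(subnet)
    return ((subnet - target) ** 2).mean()

# Collect each task's gradient w.r.t. the shared parameters,
# restricted to that task's active (masked-in) coordinates.
grads = {}
for task in masks:
    shared.grad = None
    task_loss(task).backward()
    grads[task] = shared.grad.clone() * masks[task]

# Hypothetical per-parameter impact scores (gradient magnitude): coordinates
# active in both subnets (conflicts) are updated leaning toward the task with
# the larger impact, mimicking "rely more on the more impacted task".
impact = {t: grads[t].abs() + 1e-12 for t in grads}
total = sum(impact.values())
combined = sum((impact[t] / total) * grads[t] for t in grads)

# One SGD-style step on the shared parameters.
with torch.no_grad():
    shared -= 0.1 * combined
print(shared)

In the full model, the abstract indicates that the contrastive mask and an alternating training schedule decide which task a conflicting parameter should follow; the impact-weighted averaging above is only a stand-in for that fair competition.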

Cite

APA

Bai, T., Xiao, Y., Wu, B., Yang, G., Yu, H., & Nie, J. Y. (2022). A Contrastive Sharing Model for Multi-Task Recommendation. In WWW 2022 - Proceedings of the ACM Web Conference 2022 (pp. 3239–3247). Association for Computing Machinery, Inc. https://doi.org/10.1145/3485447.3512043
