Efficient Multi-Task Learning via Generalist Recommender


Abstract

Multi-task learning (MTL) is a common machine learning technique that allows a model to share information across different tasks and improve recommendation accuracy for all of them. Many existing MTL implementations suffer from scalability issues: training and inference performance can degrade as the number of tasks grows, which limits production use cases for MTL-based recommender systems. Inspired by recent advances in large language models, we developed an end-to-end, efficient, and scalable Generalist Recommender (GRec). GRec ingests comprehensive data signals through NLP heads, parallel Transformers, and a wide-and-deep structure to process multimodal inputs. These inputs are then combined and fed through a newly proposed task-sentence level routing mechanism that scales the model's capabilities across multiple tasks without compromising performance. Offline evaluations and online experiments show that GRec significantly outperforms our previous recommender solutions. GRec has been successfully deployed on one of the largest telecom websites and apps, effectively serving high volumes of online traffic every day.
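The abstract does not detail the task-sentence level routing mechanism; as one way to picture the general idea, the following is a minimal, illustrative sketch of task-conditioned routing over a pool of experts, in the style of mixture-of-experts layers. All names, shapes, and the gating scheme here are assumptions for illustration, not GRec's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class TaskRoutedLayer:
    """Toy task-conditioned routing: a gate computed from a task embedding
    mixes the outputs of several linear 'experts'. Purely illustrative."""

    def __init__(self, d_in, d_out, n_experts, d_task):
        # Each expert is a simple linear map; real systems would use MLPs.
        self.experts = [rng.standard_normal((d_in, d_out)) * 0.1
                        for _ in range(n_experts)]
        # Gate weights map a task embedding to per-expert logits.
        self.w_gate = rng.standard_normal((d_task, n_experts)) * 0.1

    def forward(self, x, task_emb):
        gate = softmax(task_emb @ self.w_gate)            # (n_experts,)
        outs = np.stack([x @ w for w in self.experts])    # (n_experts, batch, d_out)
        # Weighted sum of expert outputs; only the gate depends on the task,
        # so adding a task changes routing, not the shared expert pool.
        return np.tensordot(gate, outs, axes=1)           # (batch, d_out)

layer = TaskRoutedLayer(d_in=16, d_out=8, n_experts=4, d_task=6)
x = rng.standard_normal((2, 16))          # a batch of 2 input vectors
task_emb = rng.standard_normal(6)         # embedding for one task
y = layer.forward(x, task_emb)
print(y.shape)  # (2, 8)
```

Because new tasks only add a task embedding rather than a full per-task tower, this style of routing is one plausible way a model could grow its task count without a matching growth in inference cost.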

Citation (APA)

Wang, L., Ruan, J., Tang, C., Huang, K., Zhang, C., & Dai, J. (2023). Efficient Multi-Task Learning via Generalist Recommender. In International Conference on Information and Knowledge Management, Proceedings (pp. 4335–4339). Association for Computing Machinery. https://doi.org/10.1145/3583780.3615229
