Alleviating Cold-start Problem in CTR Prediction with A Variational Embedding Learning Framework

Abstract

We propose a general Variational Embedding Learning Framework (VELF) for alleviating the severe cold-start problem in CTR prediction. VELF addresses cold start by mitigating the over-fitting caused by data sparsity in two ways: learning probabilistic embeddings, and incorporating trainable, regularized priors that exploit the rich side information of cold-start users and advertisements (Ads). The two techniques are naturally integrated into a variational inference framework, forming an end-to-end training process. Extensive experiments on benchmark datasets demonstrate the advantages of the proposed VELF. Further experiments confirm that our parameterized and regularized priors provide better generalization than traditional fixed priors.
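The core mechanism described above (an ID-specific Gaussian posterior embedding regularized toward a prior that is itself parameterized by side information) can be sketched as follows. This is a minimal, hypothetical PyTorch illustration of the general idea, not the authors' exact VELF formulation; the class name VariationalEmbedding and the linear prior networks are assumptions made for illustration.

```python
# Hypothetical sketch: variational embeddings with a trainable,
# side-information-parameterized prior (assumes PyTorch).
import torch
import torch.nn as nn


class VariationalEmbedding(nn.Module):
    """Each ID gets a Gaussian posterior q(z | id) = N(mu_id, sigma_id^2).
    The prior p(z | side) = N(mu_prior(side), sigma_prior(side)^2) is produced
    by small networks over side-information features, so rarely seen
    (cold-start) IDs fall back to an informative, learned prior."""

    def __init__(self, num_ids: int, emb_dim: int, side_dim: int):
        super().__init__()
        self.mu = nn.Embedding(num_ids, emb_dim)
        self.log_var = nn.Embedding(num_ids, emb_dim)
        nn.init.constant_(self.log_var.weight, -4.0)  # small initial variance
        # Prior parameters are functions of side information (trainable prior).
        self.prior_mu = nn.Linear(side_dim, emb_dim)
        self.prior_log_var = nn.Linear(side_dim, emb_dim)

    def forward(self, ids: torch.Tensor, side: torch.Tensor):
        mu, log_var = self.mu(ids), self.log_var(ids)
        # Reparameterization trick keeps sampling differentiable.
        eps = torch.randn_like(mu)
        z = mu + torch.exp(0.5 * log_var) * eps
        p_mu, p_log_var = self.prior_mu(side), self.prior_log_var(side)
        # KL( N(mu, sigma^2) || N(p_mu, p_sigma^2) ), summed over embedding
        # dimensions and averaged over the batch.
        kl = 0.5 * (
            p_log_var - log_var
            + (log_var.exp() + (mu - p_mu) ** 2) / p_log_var.exp()
            - 1.0
        ).sum(dim=-1).mean()
        return z, kl
```

In training, the sampled embedding z would feed the CTR model while the KL term is added to the click loss (e.g., binary cross-entropy) with a weighting coefficient, so frequently observed IDs can deviate from the prior while cold-start IDs stay close to the side-information-based prior.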

Citation (APA)

Xu, X., Yang, C., Yu, Q., Fang, Z., Wang, J., Fan, C., … Shao, J. (2022). Alleviating Cold-start Problem in CTR Prediction with A Variational Embedding Learning Framework. In WWW 2022 - Proceedings of the ACM Web Conference 2022 (pp. 27–35). Association for Computing Machinery, Inc. https://doi.org/10.1145/3485447.3512048
