Cross-Batch Negative Sampling for Training Two-Tower Recommenders


Abstract

The two-tower architecture has been widely applied for learning user and item representations, a core component of large-scale recommender systems. Many two-tower models are trained with in-batch negative sampling strategies, whose effectiveness inherently depends on the mini-batch size. However, training two-tower models with a large batch size is inefficient: it demands a large amount of memory for user and item features and spends considerable time on feature encoding. Interestingly, we find that neural encoders output relatively stable features for the same input after warming up in the training process. Based on this observation, we propose a simple yet effective sampling strategy called Cross-Batch Negative Sampling (CBNS), which reuses the encoded item embeddings from recent mini-batches as additional negatives to boost model training. Both theoretical analysis and empirical evaluation demonstrate the effectiveness and efficiency of CBNS.
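The core mechanism described above — caching item embeddings from recent mini-batches in a FIFO memory and treating them as extra negatives for the in-batch softmax loss — can be sketched as follows. This is a minimal illustration in NumPy, not the authors' implementation; the class name, the batch-level FIFO memory, and the assumption that positives lie on the diagonal of the user-item similarity matrix are all simplifications for exposition.

```python
import numpy as np
from collections import deque


class CrossBatchNegativeSampler:
    """Sketch of CBNS: a FIFO memory holds item embeddings from the
    last few mini-batches; they serve as additional negatives."""

    def __init__(self, memory_batches):
        # Memory measured in whole batches, dropped oldest-first.
        self.memory = deque(maxlen=memory_batches)

    def loss(self, user_emb, item_emb):
        """Sampled-softmax loss where user i's positive is item i of the
        current batch; negatives are all other current items plus every
        cached item from earlier batches."""
        cached = list(self.memory)
        all_items = (np.concatenate([item_emb] + cached, axis=0)
                     if cached else item_emb)          # (B + M, D)
        logits = user_emb @ all_items.T                # (B, B + M)
        logits -= logits.max(axis=1, keepdims=True)    # numerical stability
        log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        idx = np.arange(len(user_emb))                 # positives on the diagonal
        nll = -log_probs[idx, idx].mean()
        # Cache a detached copy: no gradient flows into past embeddings,
        # which is why stable post-warm-up encoders matter.
        self.memory.append(item_emb.copy())
        return nll
```

Note the efficiency argument: the cached embeddings are reused without re-encoding their features, so the effective number of negatives grows well beyond the batch size at almost no extra memory or compute cost.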

Citation (APA)

Wang, J., Zhu, J., & He, X. (2021). Cross-Batch Negative Sampling for Training Two-Tower Recommenders. In SIGIR 2021 - Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 1632–1636). Association for Computing Machinery, Inc. https://doi.org/10.1145/3404835.3463032
