Deep learning techniques have been applied widely in industrial recommendation systems. However, far less attention has been paid to the overfitting problem of models in recommendation systems, which, in contrast, is recognized as a critical issue for deep neural networks. In the context of Click-Through Rate (CTR) prediction, we observe an interesting one-epoch overfitting problem: model performance degrades dramatically at the beginning of the second epoch. This phenomenon has been widely observed in real-world applications of CTR models; consequently, the best performance is usually achieved by training for only one epoch. To understand the factors underlying the one-epoch phenomenon, we conduct extensive experiments on a production dataset collected from the display advertising system of Alibaba. The results show that the model structure, the fast convergence rate of the optimization algorithm, and the feature sparsity are closely related to the one-epoch phenomenon. We also propose a likely hypothesis to explain the phenomenon and conduct a set of proof-of-concept experiments. We hope this work sheds light on future research into training for more epochs to achieve better performance.
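The sketch below is not the authors' code; it is a minimal illustration, under assumed names (SparseCTRModel, make_synthetic_data) and synthetic data, of the setup in which the one-epoch phenomenon is typically reported: an embedding-based CTR model over sparse ID features, trained with a fast-converging optimizer (Adam) and evaluated on held-out data after each epoch, so that any degradation at the start of the second epoch becomes visible.

```python
# Minimal sketch of one-epoch-style CTR training; all names and data are
# hypothetical stand-ins, not the paper's production setup.
import torch
import torch.nn as nn

VOCAB_SIZE, NUM_FIELDS, EMB_DIM = 10_000, 8, 16

class SparseCTRModel(nn.Module):
    """Embedding layer over sparse ID fields followed by an MLP head."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB_SIZE, EMB_DIM)
        self.mlp = nn.Sequential(
            nn.Linear(NUM_FIELDS * EMB_DIM, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, ids):                 # ids: (batch, NUM_FIELDS)
        x = self.emb(ids).flatten(1)        # (batch, NUM_FIELDS * EMB_DIM)
        return self.mlp(x).squeeze(-1)      # logits

def make_synthetic_data(n):
    """Synthetic stand-in for production CTR logs: sparse IDs + binary labels."""
    ids = torch.randint(0, VOCAB_SIZE, (n, NUM_FIELDS))
    labels = (ids.sum(1) % 2).float()       # arbitrary synthetic labels
    return ids, labels

def evaluate(model, ids, labels):
    """Held-out logloss; in production one would track AUC as well."""
    model.eval()
    with torch.no_grad():
        loss = nn.functional.binary_cross_entropy_with_logits(model(ids), labels)
    return loss.item()

train_ids, train_y = make_synthetic_data(8192)
val_ids, val_y = make_synthetic_data(2048)
model = SparseCTRModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # fast-converging optimizer

# Train for two epochs and log held-out loss after each; on real CTR data the
# held-out metric typically degrades sharply once epoch 2 begins, which is why
# practitioners usually keep the checkpoint from epoch 1 only.
for epoch in range(2):
    model.train()
    for i in range(0, len(train_ids), 256):
        batch_ids, batch_y = train_ids[i:i + 256], train_y[i:i + 256]
        loss = nn.functional.binary_cross_entropy_with_logits(model(batch_ids), batch_y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"epoch {epoch + 1}: held-out logloss = {evaluate(model, val_ids, val_y):.4f}")
```

Note that the synthetic labels here will not reproduce the phenomenon itself; the paper attributes it to properties of real production data (e.g. feature sparsity) combined with the model structure and optimizer.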
Zhang, Z. Y., Sheng, X. R., Zhang, Y., Jiang, B., Han, S., Deng, H., & Zheng, B. (2022). Towards Understanding the Overfitting Phenomenon of Deep Click-Through Rate Models. In International Conference on Information and Knowledge Management, Proceedings (pp. 2671–2680). Association for Computing Machinery. https://doi.org/10.1145/3511808.3557479