Due to the prevalence of group activities on social networks, group recommendation has received increasing attention. Most group recommendation methods have concentrated on persistent groups, while little attention has been paid to ephemeral groups. Ephemeral groups are formed ad hoc for one-time activities and therefore suffer severely from data sparsity and cold-start problems. To deal with these problems, we propose GBERT, a pre-training and fine-tuning method for improved group recommendation that employs BERT to enhance expressivity and capture the group-specific preferences of members. In the pre-training stage, GBERT employs three pre-training tasks to alleviate the data sparsity and cold-start problems and to learn better user representations. In the fine-tuning stage, an influence-based regulation objective is designed to regulate user and group representations by allocating weights according to each member's influence. Extensive experiments on three public datasets demonstrate the superiority of GBERT over state-of-the-art methods for ephemeral group recommendation.
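The abstract does not spell out how member influence enters the group representation, but the idea of weighting members by influence can be illustrated with a minimal sketch. The following is not the authors' implementation; the influence scores, embedding dimensionality, and softmax-weighted aggregation are all assumptions made for illustration.

```python
# Minimal sketch (assumed, not the authors' code) of influence-weighted
# aggregation: member embeddings are combined into a group representation
# using weights derived from each member's influence score.
import torch
import torch.nn.functional as F

def aggregate_group(member_embeddings: torch.Tensor,
                    influence_scores: torch.Tensor) -> torch.Tensor:
    """Combine member embeddings (n_members x dim) into one group vector,
    weighting each member by a softmax over its influence score."""
    weights = F.softmax(influence_scores, dim=0)                    # (n_members,)
    return (weights.unsqueeze(-1) * member_embeddings).sum(dim=0)   # (dim,)

# Toy usage: three members with BERT-style 768-dimensional representations.
members = torch.randn(3, 768)
influence = torch.tensor([0.2, 1.5, 0.7])   # hypothetical influence estimates
group_repr = aggregate_group(members, influence)
print(group_repr.shape)  # torch.Size([768])
```

A softmax over influence scores keeps the weights positive and normalized, so more influential members contribute more to the group vector without any single score dominating arbitrarily; the paper's actual regulation objective may differ.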
CITATION STYLE
Zhang, S., Zheng, N., & Wang, D. (2022). GBERT: Pre-training User Representations for Ephemeral Group Recommendation. In Proceedings of the International Conference on Information and Knowledge Management (pp. 2631–2639). Association for Computing Machinery. https://doi.org/10.1145/3511808.3557330