GBERT: Pre-training User representations for Ephemeral Group Recommendation


Abstract

Due to the prevalence of group activities on social networks, group recommendation has received increasing attention. Most group recommendation methods concentrate on persistent groups, while little attention has been paid to ephemeral groups. Ephemeral groups are formed ad hoc for one-time activities and therefore suffer severely from data sparsity and cold-start problems. To address these problems, we propose a pre-training and fine-tuning method called GBERT for improved group recommendation, which employs BERT to enhance expressivity and capture the group-specific preferences of members. In the pre-training stage, GBERT employs three pre-training tasks to alleviate the data sparsity and cold-start problems and to learn better user representations. In the fine-tuning stage, an influence-based regulation objective is designed to regulate user and group representations by allocating weights according to each member's influence. Extensive experiments on three public datasets demonstrate its superiority over state-of-the-art methods for ephemeral group recommendation.
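The influence-based weighting described above can be illustrated with a minimal sketch: a group representation is formed as a weighted aggregation of member embeddings, with weights derived from per-member influence scores. The function name, the use of a softmax, and the toy inputs below are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def group_representation(member_embs, influence_scores):
    """Aggregate member embeddings into one group representation.

    Each member is weighted by a softmax over influence scores, so
    more influential members contribute more to the group vector.
    (Illustrative sketch only; the softmax choice is an assumption.)
    """
    scores = np.asarray(influence_scores, dtype=float)
    # Numerically stable softmax over influence scores.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Weighted sum of member embedding vectors.
    return weights @ np.asarray(member_embs, dtype=float)

# Two members with equal influence: the group vector is their mean.
group_vec = group_representation([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0])
```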

Citation (APA)

Zhang, S., Zheng, N., & Wang, D. (2022). GBERT: Pre-training User representations for Ephemeral Group Recommendation. In International Conference on Information and Knowledge Management, Proceedings (pp. 2631–2639). Association for Computing Machinery. https://doi.org/10.1145/3511808.3557330
