Trending topics in social media evolve over time, so it is crucial to understand social media users and their interpersonal communications dynamically. In this research we study dynamic online conversation recommendation, which helps users engage in conversations that satisfy their evolving interests. Unlike prior work on conversation recommendation that assumes static user interests, our model captures the temporal dynamics of user interests. Moreover, it handles the cold-start problem, where conversations are new and unseen in training. We propose a neural architecture that analyzes how user interactions and interests change over time and uses the result to predict which discussions users are likely to enter. We conduct experiments on large-scale collections of Reddit conversations. Results on three subreddits show that our model significantly outperforms state-of-the-art models built on the static-interest assumption. We further evaluate performance in the cold-start setting and observe consistently better results from our model across varying degrees of sparsity in users' chatting histories and conversation contexts. Lastly, our analysis confirms that user interests do change over time, which further justifies the advantage and efficacy of our model.
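To make the idea of tracking evolving user interests concrete, below is a minimal sketch, not the authors' actual architecture: it assumes a GRU run over a user's time-ordered turn embeddings to produce a dynamic interest vector, which then scores embedded candidate conversation contexts by dot product. All module and variable names here are illustrative, and the paper's model may differ substantially.

# Hedged sketch of dynamic conversation recommendation (assumed design,
# not the architecture from Zeng et al., 2020).
import torch
import torch.nn as nn


class DynamicInterestRecommender(nn.Module):
    def __init__(self, embed_dim: int = 128, hidden_dim: int = 128):
        super().__init__()
        # A GRU summarizes the user's chatting history in chronological order,
        # so later turns can shift the interest representation over time.
        self.history_encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Project conversation-context embeddings into the interest space,
        # so new (cold-start) conversations can still be scored from their text.
        self.conv_proj = nn.Linear(embed_dim, hidden_dim)

    def forward(self, history: torch.Tensor, candidates: torch.Tensor) -> torch.Tensor:
        # history:    (batch, num_turns, embed_dim), time-ordered user turns
        # candidates: (batch, num_convs, embed_dim), candidate conversation contexts
        _, last_state = self.history_encoder(history)   # (1, batch, hidden_dim)
        interest = last_state.squeeze(0)                 # (batch, hidden_dim)
        conv_repr = self.conv_proj(candidates)           # (batch, num_convs, hidden_dim)
        # Higher score means the user is predicted as more likely to enter
        # that conversation.
        return torch.einsum("bd,bcd->bc", interest, conv_repr)


if __name__ == "__main__":
    model = DynamicInterestRecommender()
    user_history = torch.randn(2, 10, 128)    # 2 users, 10 past turns each
    candidate_convs = torch.randn(2, 5, 128)  # 5 candidate conversations each
    scores = model(user_history, candidate_convs)
    print(scores.shape)  # torch.Size([2, 5])

In this sketch the dynamic aspect comes entirely from the recurrent encoding of time-ordered history; in practice the turn and conversation embeddings would come from a text encoder rather than random tensors.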
Zeng, X., Li, J., Wang, L., Mao, Z., & Wong, K. F. (2020). Dynamic online conversation recommendation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 3331–3341). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.305