CoArgue: Fostering Lurkers' Contribution to Collective Arguments in Community-based QA Platforms

Abstract

In Community-Based Question Answering (CQA) platforms, people can participate in discussions about non-factoid topics by marking their stances, providing premises, or arguing for the opinions they support, which forms "collective arguments". The sustainable development of collective arguments relies on a large contributor base, yet most frequent CQA users are lurkers who seldom speak out. Through a formative study, we identified concrete obstacles that prevent lurkers from contributing to collective arguments. We then designed a processing pipeline that extracts and summarizes argumentative elements from question threads. Building on this pipeline, we developed CoArgue, a tool with navigation and chatbot features that supports CQA lurkers' motivation and ability to contribute. In a within-subjects study (N=24), participants perceived CoArgue as significantly more useful than a Quora-like baseline in enhancing their motivation and ability to join collective arguments, and they found the experience more engaging and productive.
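
The abstract does not describe how the processing pipeline is implemented. Purely as an illustration (not the authors' method), the sketch below shows one plausible shape for such a pipeline: splitting answer posts into candidate premises, assigning each a stance, and grouping premises by stance into a scannable summary. All names, cue lists, and heuristics here are hypothetical stand-ins for the trained stance and premise models a real system would use.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class ArgumentElement:
    stance: str   # "support" / "oppose" / "neutral"
    premise: str  # sentence offered as a reason

# Toy keyword cues; a real pipeline would use learned classifiers (hypothetical).
SUPPORT_CUES = ("agree", "should", "benefit")
OPPOSE_CUES = ("disagree", "should not", "harm")

def classify_stance(sentence: str) -> str:
    """Rough keyword heuristic standing in for a stance classifier."""
    text = sentence.lower()
    if any(cue in text for cue in OPPOSE_CUES):
        return "oppose"
    if any(cue in text for cue in SUPPORT_CUES):
        return "support"
    return "neutral"

def extract_elements(answers: list[str]) -> list[ArgumentElement]:
    """Split each answer into sentences and keep those that look like premises."""
    elements = []
    for answer in answers:
        for sentence in answer.split("."):
            sentence = sentence.strip()
            if len(sentence.split()) >= 4:  # crude premise filter
                elements.append(ArgumentElement(classify_stance(sentence), sentence))
    return elements

def summarize(elements: list[ArgumentElement]) -> dict[str, list[str]]:
    """Group premises by stance so readers can scan the collective argument."""
    summary: dict[str, list[str]] = defaultdict(list)
    for element in elements:
        summary[element.stance].append(element.premise)
    return dict(summary)

if __name__ == "__main__":
    thread = [
        "I agree that remote work should stay. It benefits focus and saves commuting time.",
        "I disagree because junior staff lose mentorship and team cohesion can suffer.",
    ]
    for stance, premises in summarize(extract_elements(thread)).items():
        print(stance, "->", premises)
```

In practice the keyword heuristics above would be replaced by trained stance-detection and premise-extraction models, and the grouped premises would feed the tool's navigation and chatbot features.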

Citation (APA)

Liu, C., Zhou, S., Liu, D., Li, J., Huang, Z., & Ma, X. (2023). CoArgue: Fostering lurkers' contribution to collective arguments in community-based QA platforms. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3544548.3580932
