An annotation protocol for collecting user-generated counter-arguments using crowdsourcing

Abstract

Constructive feedback is important for improving critical thinking skills. However, little work has been done to automatically generate such feedback for an argument. In this work, we experiment with an annotation protocol for collecting user-generated counter-arguments via crowdsourcing. We conduct two parallel crowdsourcing experiments, where workers are instructed to produce (i) a counter-argument, and (ii) a counter-argument after identifying a fallacy. Our analysis indicates that we can collect counter-arguments that are useful as constructive feedback, especially when workers are first asked to identify a fallacy type.
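To make the two parallel crowdsourcing conditions concrete, below is a minimal sketch in Python of how the tasks and worker submissions might be represented and checked. The fallacy inventory, class names, and validation rules here are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical fallacy inventory; the paper's actual label set may differ.
FALLACY_TYPES = ["hasty generalization", "false cause", "ad hominem", "slippery slope"]

@dataclass
class AnnotationTask:
    """One crowdsourcing item: an argument for which a counter-argument is requested."""
    argument: str
    require_fallacy_first: bool  # False -> condition (i), True -> condition (ii)

@dataclass
class WorkerResponse:
    counter_argument: str
    fallacy_type: Optional[str] = None  # only filled in condition (ii)

def validate(task: AnnotationTask, response: WorkerResponse) -> bool:
    """Basic sanity checks before accepting a worker submission."""
    if not response.counter_argument.strip():
        return False
    if task.require_fallacy_first and response.fallacy_type not in FALLACY_TYPES:
        return False
    return True

# The same source argument posted under both parallel conditions.
argument = "School uniforms should be mandatory because everyone I know likes them."
tasks = [
    AnnotationTask(argument, require_fallacy_first=False),  # condition (i)
    AnnotationTask(argument, require_fallacy_first=True),   # condition (ii)
]
```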

Citation (APA)

Reisert, P., Vallejo, G., Inoue, N., Gurevych, I., & Inui, K. (2019). An annotation protocol for collecting user-generated counter-arguments using crowdsourcing. In Lecture Notes in Computer Science (Vol. 11626 LNAI, pp. 232–236). Springer. https://doi.org/10.1007/978-3-030-23207-8_43