Rubric reliability and annotation of content and argument in source-based argument essays

Citations: 8
Mendeley readers: 84

Abstract

We present a unique dataset of student source-based argument essays to facilitate research on the relations between content, argumentation skills, and assessment. Two classroom writing assignments were given to college students in a STEM major, accompanied by a carefully designed rubric. The paper presents a reliability study of the rubric, showing it to be highly reliable, and an initial annotation of content and argumentation in the essays.

Citation (APA)

Gao, Y., Driban, A., McManus, B. X., Musi, E., Davies, P. M., Muresan, S., & Passonneau, R. J. (2019). Rubric reliability and annotation of content and argument in source-based argument essays. In Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2019) (pp. 507–518). Association for Computational Linguistics. https://doi.org/10.18653/v1/w19-4452
