Modeling prompt adherence in student essays


Abstract

Recently, researchers have begun exploring methods of scoring student essays with respect to particular dimensions of quality such as coherence, technical errors, and prompt adherence. The work on modeling prompt adherence, however, has focused mainly on whether individual sentences adhere to the prompt. We present a new annotated corpus of essay-level prompt adherence scores and propose a feature-rich approach to scoring essays along the prompt adherence dimension. Our approach significantly outperforms a knowledge-lean baseline prompt adherence scoring system, yielding improvements of up to 16.6%. © 2014 Association for Computational Linguistics.
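The abstract describes its baseline only as "knowledge-lean"; as a rough illustration of what such a baseline could look like, the sketch below scores an essay by the cosine similarity between bag-of-words vectors of the essay and its prompt, mapped onto a numeric score range. The tokenizer, the 1-4 score scale, and the similarity-to-score mapping are all illustrative assumptions, not details taken from the paper or its corpus.

```python
# A minimal sketch (NOT the authors' system) of a knowledge-lean
# prompt-adherence baseline: score an essay by the cosine similarity
# between bag-of-words vectors of the essay and its prompt.
import math
import re
from collections import Counter


def tokenize(text: str) -> list[str]:
    """Lowercase and split on letter runs; a deliberately simple tokenizer."""
    return re.findall(r"[a-z]+", text.lower())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def adherence_score(essay: str, prompt: str,
                    lo: float = 1.0, hi: float = 4.0) -> float:
    """Map essay-prompt similarity linearly onto a [lo, hi] score scale.

    The 1-4 range is an illustrative assumption, not necessarily the
    scale used in the paper's annotated corpus.
    """
    sim = cosine(Counter(tokenize(essay)), Counter(tokenize(prompt)))
    return lo + sim * (hi - lo)


if __name__ == "__main__":
    prompt = "Should university education be free for all students?"
    essay = "Free university education benefits both students and society."
    print(f"prompt adherence: {adherence_score(essay, prompt):.2f}")
```

A feature-rich system of the kind the abstract describes would replace this single similarity value with many features (e.g., n-gram overlap or topic signatures) feeding a learned regressor, but the essay-level scoring interface would look much the same.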

Citation (APA)

Persing, I., & Ng, V. (2014). Modeling prompt adherence in student essays. In 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference (Vol. 1, pp. 1534–1543). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p14-1144
