This paper reviews annotation schemes used for labeling discourse coherence in well-formed and noisy (essay) data, and it describes a system that we have developed for automated holistic scoring of essay coherence. We review previous related work on unsupervised computational approaches to evaluating discourse coherence, focusing on a taxonomy of discourse coherence schemes classified by their goals and types of data. We illustrate how a holistic approach can be used successfully to build systems for noisy essay data across domains and populations. We discuss model features related to human scoring guide criteria for essay scoring, and the importance of using features relevant to these criteria for generating meaningful scores and feedback for students and test-takers. To demonstrate the effectiveness of a holistic annotation scheme, we present results of system evaluations.
CITATION STYLE
Burstein, J., Tetreault, J., & Chodorow, M. (2013). Holistic Discourse Coherence Annotation for Noisy Essay Writing. Dialogue & Discourse, 4(2), 34–52. https://doi.org/10.5087/dad.2013.202