An experimental annotation method is described that shows promise for a subjective labeling task: rating the discourse coherence quality of essays. Annotators developed personal annotation protocols, reducing the front-end resources normally devoted to protocol development and annotator training. Substantial inter-annotator agreement was achieved on a 4-point scale. Correlational analyses revealed how individual annotators attended to distinct linguistic phenomena when assigning ratings. Systems trained on the annotator data demonstrated the utility of the annotations.
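For readers unfamiliar with agreement statistics on ordinal scales, the sketch below shows one common way agreement on a 4-point rating scale might be computed. The choice of statistic (quadratically weighted Cohen's kappa), the ratings, and the library are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumed setup, not from the paper): measuring inter-annotator
# agreement on a 4-point ordinal scale with quadratically weighted Cohen's kappa.
from sklearn.metrics import cohen_kappa_score

# Hypothetical coherence ratings (1 = low ... 4 = high) from two annotators.
annotator_a = [3, 4, 2, 1, 3, 4, 2, 3, 1, 4]
annotator_b = [3, 4, 2, 2, 3, 3, 2, 3, 1, 4]

# Quadratic weighting penalizes larger disagreements more heavily,
# which suits ordinal scales such as a 4-point coherence rating.
kappa = cohen_kappa_score(annotator_a, annotator_b, weights="quadratic")
print(f"Quadratically weighted kappa: {kappa:.2f}")
```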
Burstein, J., Somasundaran, S., & Chodorow, M. (2020). Finding your “inner-annotator”: An experiment in annotator independence for rating discourse coherence quality in essays. In LAW 2014 - 8th Linguistic Annotation Workshop, in conjunction with COLING 2014 - Proceedings of the Workshop (pp. 48–53). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/w14-4906