The availability of annotated legal corpora is crucial for a number of tasks, such as legal search, legal information retrieval, and predictive justice. Annotation is mostly assumed to be a straightforward task: as long as the annotation scheme is well defined and the guidelines are clear, annotators are expected to agree on the labels. This is not always the case, especially in legal annotation, which can be extremely difficult even for expert annotators. We propose a legal annotation procedure that takes into account annotator certainty and improves it through negotiation. We also collect annotator feedback and show that our approach contributes to a positive annotation environment. Our work invites reflection on often neglected ethical concerns regarding legal annotation.
Citation:
Zanoli, E., Barbini, M., Riva, D., Picascia, S., Furiosi, E., D'Ancona, S., & Chesi, C. (2023). Annotators-in-the-loop: Testing a Novel Annotation Procedure on Italian Case Law. In Proceedings of the 17th Linguistic Annotation Workshop (LAW-XVII) (pp. 118–128). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.law-1.12