Automated Metrics for Medical Multi-Document Summarization Disagree with Human Evaluations


Abstract

Evaluating multi-document summarization (MDS) quality is difficult. This is especially true for MDS over biomedical literature reviews, where models must synthesize contradicting evidence reported across different documents. Prior work has shown that rather than performing the task, models may exploit shortcuts that are difficult to detect using standard n-gram similarity metrics such as ROUGE. Better automated evaluation metrics are needed, but few resources exist to assess metrics when they are proposed. We therefore introduce a dataset of human-assessed summary quality facets and pairwise preferences to encourage and support the development of better automated evaluation methods for literature review MDS. We take advantage of community submissions to the Multi-document Summarization for Literature Review (MSLR) shared task to compile a diverse and representative sample of generated summaries. We analyze how automated summarization evaluation metrics correlate with lexical features of generated summaries, with other automated metrics (including several we propose in this work), and with aspects of human-assessed summary quality. We find that not only do automated metrics fail to capture aspects of quality as assessed by humans, but in many cases the system rankings produced by these metrics are anti-correlated with rankings according to human annotators.
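The abstract's central comparison, between system rankings induced by automated metrics and rankings induced by human judgments, reduces to a rank-correlation check. The sketch below illustrates the general idea, not the paper's actual evaluation code or data: the system outputs, the reference summary, and the human quality scores are all hypothetical, and the sketch uses the rouge-score and scipy packages to compute ROUGE-L F1 and Kendall's tau.

```python
# Minimal sketch (hypothetical data, not the paper's evaluation pipeline):
# rank a few systems by ROUGE-L F1, then measure how that ranking
# correlates with a human-assigned ranking using Kendall's tau.
# Requires: pip install rouge-score scipy
from rouge_score import rouge_scorer
from scipy.stats import kendalltau

# One shared reference summary and hypothetical outputs from three systems.
reference = (
    "Evidence across trials suggests the intervention modestly reduces symptoms."
)
system_outputs = {
    "system_a": "The intervention modestly reduces symptoms across trials.",
    "system_b": "Trials show mixed evidence; overall a modest reduction in symptoms.",
    "system_c": "Symptoms reduce the intervention evidence trials modestly.",
}

# Score each system's output against the reference with ROUGE-L F1.
scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)
rouge_f1 = {
    name: scorer.score(reference, output)["rougeL"].fmeasure
    for name, output in system_outputs.items()
}

# Hypothetical human quality scores for the same systems (higher is better).
human_scores = {"system_a": 0.9, "system_b": 0.8, "system_c": 0.2}

# Kendall's tau compares the two rankings; values near -1 indicate the
# anti-correlation the paper reports for some metrics.
systems = sorted(system_outputs)
tau, p_value = kendalltau(
    [rouge_f1[s] for s in systems],
    [human_scores[s] for s in systems],
)
print({s: round(rouge_f1[s], 3) for s in systems})
print(f"Kendall's tau between ROUGE-L and human ranking: {tau:.2f} (p={p_value:.2f})")
```

In the paper's setting the same comparison is run over many review instances and shared-task submissions; with only three hypothetical systems, as here, the tau estimate and p-value are illustrative only.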

Cite (APA)

Wang, L. L., Otmakhova, Y., DeYoung, J., Truong, T. H., Kuehl, B. E., Bransom, E., & Wallace, B. C. (2023). Automated Metrics for Medical Multi-Document Summarization Disagree with Human Evaluations. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 9871–9889). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.549
