Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels

Abstract

The soft Dice loss (SDL) has taken a pivotal role in numerous automated segmentation pipelines in the medical imaging community. In recent years, some of the reasons behind its superior performance have been uncovered and further optimizations have been explored. However, there is currently no implementation that supports its direct use in scenarios involving soft labels. Hence, a synergy between SDL and research leveraging soft labels, also in the context of model calibration, is still missing. In this work, we introduce Dice semimetric losses (DMLs), which (i) are by design identical to SDL in a standard setting with hard labels, but (ii) can be employed in settings with soft labels. Our experiments on the public QUBIQ, LiTS and KiTS benchmarks confirm the potential synergy of DMLs with soft labels (e.g. averaging, label smoothing, and knowledge distillation) over hard labels (e.g. majority voting and random selection). As a result, we obtain superior Dice scores and model calibration, which supports the wider adoption of DMLs in practice. The code is available at https://github.com/zifuwanggg/JDTLosses.
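For context, the standard soft Dice loss the abstract refers to is commonly computed as sketched below. This is a minimal NumPy illustration of plain SDL, not the paper's DML variant (see the linked repository for the actual implementation); the function and variable names are our own.

```python
import numpy as np

def soft_dice_loss(probs, labels, eps=1e-6):
    """Standard soft Dice loss: 1 - 2 * sum(p * y) / (sum(p) + sum(y)).

    probs:  predicted foreground probabilities, flattened, shape (N,)
    labels: ground-truth labels in [0, 1], flattened, shape (N,)
    eps:    small constant to avoid division by zero
    """
    intersection = np.sum(probs * labels)
    denom = np.sum(probs) + np.sum(labels)
    return 1.0 - 2.0 * intersection / (denom + eps)

# With hard labels, a perfect prediction drives the loss to ~0.
y_hard = np.array([1.0, 0.0, 1.0, 1.0])
print(soft_dice_loss(y_hard, y_hard))  # close to 0

# With soft labels, even a perfect prediction yields a nonzero loss:
# for y = p = 0.5 everywhere, the loss is 0.5, not 0.
y_soft = np.array([0.5, 0.5, 0.5, 0.5])
print(soft_dice_loss(y_soft, y_soft))
```

The second example illustrates the limitation the abstract describes: SDL's minimum is not attained at the soft label itself, which is precisely the mismatch that DMLs are designed to remove while remaining identical to SDL on hard labels.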

Citation (APA)

Wang, Z., Popordanoska, T., Bertels, J., Lemmens, R., & Blaschko, M. B. (2023). Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 14222 LNCS, pp. 475–485). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-43898-1_46
