Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging

  • Laves M.-H.
  • Ihler S.
  • Fast J. F.
  • Kahrs L. A.
  • Ortmaier T.
Citations: N/A
Readers: 21 (Mendeley users who have this article in their library)

Abstract

The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We estimate both aleatoric and epistemic uncertainty for regression tasks by variational Bayesian inference with Monte Carlo dropout and show that predictive uncertainty is systematically underestimated. We apply sigma scaling with a single scalar value, a simple yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, sigma scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at: https://github.com/mlaves/well-calibrated-regression-uncertainty
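The abstract describes sigma scaling: multiplying the predicted standard deviation by a single scalar chosen on held-out data. A minimal NumPy sketch of that idea is shown below, assuming a factorized Gaussian predictive distribution; the function name and closed-form estimate are illustrative, not taken verbatim from the paper's code.

```python
import numpy as np

def sigma_scaling(mu, sigma, y):
    """Recalibrate a predictive standard deviation with one scalar s.

    Sketch of sigma scaling: choose s to minimize the Gaussian negative
    log-likelihood of held-out targets y given predicted means mu and
    uncalibrated standard deviations sigma. Setting the derivative of
    the NLL with respect to s to zero gives the closed form
        s^2 = (1/N) * sum((y - mu)^2 / sigma^2).
    The calibrated uncertainty is then s * sigma.
    """
    mu, sigma, y = map(np.asarray, (mu, sigma, y))
    s = np.sqrt(np.mean((y - mu) ** 2 / sigma ** 2))
    return s

# Illustrative usage: a model reports sigma = 1 everywhere, but the
# residuals actually have standard deviation 2, so s should be near 2.
rng = np.random.default_rng(0)
mu = np.zeros(10_000)
sigma = np.ones(10_000)
y = rng.normal(loc=0.0, scale=2.0, size=10_000)
s = sigma_scaling(mu, sigma, y)  # s is approximately 2.0
```

Because the optimum is available in closed form for a Gaussian likelihood, no gradient-based optimization is needed; a single pass over a validation set suffices.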

Citation (APA)
Laves, M.-H., Ihler, S., Fast, J. F., Kahrs, L. A., & Ortmaier, T. (2021). Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging. Machine Learning for Biomedical Imaging, 1(MIDL 2020), 1–26. https://doi.org/10.59275/j.melba.2021-a6fd
