Characterising the area under the curve loss function landscape


Abstract

One of the most common metrics for evaluating neural network classifiers is the area under the receiver operating characteristic curve (AUC). However, optimising the AUC directly as the loss function during network training is not standard practice. Here we compare minimising the cross-entropy (CE) loss with optimising the AUC directly. In particular, we analyse the loss function landscape (LFL) of approximate AUC (appAUC) loss functions to discover the organisation of this solution space. We discuss several surrogates for approximating the AUC and show how they differ. We find that the characteristics of the appAUC landscape differ significantly from those of the CE landscape. The approximate AUC loss function improves test AUC, and the appAUC landscape has substantially more minima, but these minima are less robust, with larger average Hessian eigenvalues. We provide a theoretical foundation to explain these results. Finally, to generalise our results, we provide an overview of how the LFL can help to guide loss function analysis and selection.
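The surrogate losses analysed in the paper replace the non-differentiable pairwise ranking indicator underlying the AUC with a smooth approximation that can be minimised by gradient descent. As an illustration only, the sketch below shows one common choice of such a surrogate, a pairwise sigmoid approximation; the function names and the temperature parameter tau are assumptions for this example and are not taken from the paper, which compares several surrogates.

import torch
import torch.nn.functional as F

def cross_entropy_loss(scores, labels):
    # Standard binary cross-entropy on raw scores (logits), for contrast.
    return F.binary_cross_entropy_with_logits(scores, labels.float())

def approx_auc_loss(scores, labels, tau=1.0):
    # Pairwise sigmoid surrogate for 1 - AUC (illustrative sketch).
    # The exact AUC is the fraction of positive/negative pairs ranked
    # correctly, an indicator function that has zero gradient almost
    # everywhere. Replacing the indicator with a sigmoid of the score
    # difference gives a smooth, differentiable approximation.
    pos = scores[labels == 1]                    # scores of positive examples
    neg = scores[labels == 0]                    # scores of negative examples
    diff = pos.unsqueeze(1) - neg.unsqueeze(0)   # all positive-negative score differences
    return 1.0 - torch.sigmoid(diff / tau).mean()

# Toy usage with random scores and labels
scores = torch.randn(8, requires_grad=True)
labels = torch.tensor([1, 0, 1, 1, 0, 0, 1, 0])
loss = approx_auc_loss(scores, labels)
loss.backward()   # gradients flow through the smooth surrogate

Unlike the CE loss, which is a sum over individual examples, the surrogate couples every positive-negative pair of scores, so the two objectives define rather different optimisation problems over the same network parameters.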

Citation (APA)

Niroomand, M. P., Cafolla, C. T., Morgan, J. W. R., & Wales, D. J. (2022). Characterising the area under the curve loss function landscape. Machine Learning: Science and Technology, 3(1). https://doi.org/10.1088/2632-2153/ac49a9
