Extrapolating from neural network models: A cautionary tale

Abstract

We present three different methods to estimate error bars on the predictions made using a neural network (NN). All of them represent lower bounds for the extrapolation errors. First, we illustrate the methods through a simple toy model; then we apply them to realistic cases related to nuclear masses. Using theoretical data simulated either with a liquid-drop model or a Skyrme energy density functional, we benchmark the extrapolation performance of the NN in regions of the Segrè chart far away from those used for training and validation. Finally, we discuss how error bars can help identify when the extrapolation becomes too uncertain and thus no longer reliable.
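The abstract does not spell out the three error-bar methods, so the following is a minimal sketch of the general idea using one generic stand-in: taking the spread of an ensemble of independently initialised networks as an (optimistic) uncertainty estimate, applied to a hypothetical liquid-drop-style binding-energy curve. All coefficients, the tolerance threshold, and the function names are made up for illustration; scikit-learn is assumed to be available.

```python
# Purely illustrative: ensemble spread as a stand-in uncertainty estimate.
# Coefficients and the reliability threshold below are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def toy_binding_per_nucleon(A):
    """Hypothetical smooth target: volume minus surface term (MeV)."""
    return 15.6 - 17.2 * A ** (-1.0 / 3.0)

# Train on a limited mass region, then extrapolate well beyond it.
A_train = rng.uniform(20.0, 120.0, size=200)
y_train = toy_binding_per_nucleon(A_train)
A_test = np.linspace(20.0, 250.0, 100)

X_train = (A_train / 100.0).reshape(-1, 1)  # crude input scaling
X_test = (A_test / 100.0).reshape(-1, 1)

# Ensemble of identical architectures with different random initialisations.
preds = np.array([
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                 random_state=seed).fit(X_train, y_train).predict(X_test)
    for seed in range(20)
])

mean = preds.mean(axis=0)
sigma = preds.std(axis=0)  # spread grows away from the training region

# Flag points where the ensemble disagrees beyond a chosen tolerance,
# in the spirit of using error bars to mark unreliable extrapolation.
unreliable = A_test[sigma > 0.5]
if unreliable.size:
    print(f"extrapolation flagged as unreliable for A >= {unreliable.min():.0f}")
else:
    print("all test points within tolerance")
```

Consistent with the abstract's caveat, an ensemble spread of this kind is only a lower bound on the true extrapolation error: all members can agree with one another and still be collectively wrong far from the data.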

Citation (APA)

Pastore, A., & Carnini, M. (2021). Extrapolating from neural network models: A cautionary tale. Journal of Physics G: Nuclear and Particle Physics, 48(8). https://doi.org/10.1088/1361-6471/abf08a
