Explainable Offline-Online Training of Neural Networks for Parameterizations: A 1D Gravity Wave-QBO Testbed in the Small-Data Regime


Abstract

There are different strategies for training neural networks (NNs) as subgrid-scale parameterizations. Here, we use a 1D model of the quasi-biennial oscillation (QBO) and gravity wave (GW) parameterizations as testbeds. A 12-layer convolutional NN that predicts GW forcings for given wind profiles, when trained offline in a big-data regime (100 years of data), produces realistic QBOs once coupled to the 1D model. In contrast, offline training of this NN in a small-data regime (18 months of data) yields unrealistic QBOs. However, online re-training of just two layers of this NN using ensemble Kalman inversion and only time-averaged QBO statistics leads to parameterizations that yield realistic QBOs. Fourier analysis of these three NNs' kernels suggests why and how re-training works and reveals that these NNs primarily learn low-pass, high-pass, and a combination of band-pass filters, potentially related to the local and non-local dynamics in GW propagation and dissipation. These findings and strategies generally apply to data-driven parameterizations of other climate processes.
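The online re-training described above relies on ensemble Kalman inversion (EKI), a derivative-free method that nudges an ensemble of parameter vectors toward matching target statistics. The sketch below is a minimal, generic EKI update step, not the paper's implementation; the forward map `G` stands in for running the coupled 1D model and computing time-averaged QBO statistics, and all variable names are illustrative.

```python
import numpy as np

def eki_update(theta, G, y, gamma, rng):
    """One ensemble Kalman inversion step.

    theta: (J, p) ensemble of parameter vectors (e.g. weights of the
           two re-trained NN layers, flattened)
    G:     forward map taking one parameter vector to statistics of
           shape (d,) (e.g. time-averaged QBO period and amplitude)
    y:     (d,) target statistics from observations
    gamma: (d, d) observation-noise covariance
    """
    g = np.array([G(t) for t in theta])           # (J, d) model outputs
    t_mean, g_mean = theta.mean(0), g.mean(0)
    dt, dg = theta - t_mean, g - g_mean           # ensemble anomalies
    J = len(theta)
    C_tg = dt.T @ dg / (J - 1)                    # parameter-output cross-covariance
    C_gg = dg.T @ dg / (J - 1)                    # output covariance
    K = C_tg @ np.linalg.inv(C_gg + gamma)        # Kalman-type gain, (p, d)
    # Perturb the targets so the ensemble retains spread
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), gamma, size=J)
    return theta + (y_pert - g) @ K.T

# Toy usage: with an identity forward map, the ensemble mean is pulled
# toward the target statistic over repeated iterations.
rng = np.random.default_rng(0)
ensemble = rng.normal(0.0, 1.0, (20, 1))
for _ in range(30):
    ensemble = eki_update(ensemble, lambda t: t, np.array([2.0]),
                          np.array([[1e-4]]), rng)
print(float(ensemble.mean()))  # close to 2.0
```

Because EKI needs only evaluations of `G`, not its gradients, it can calibrate NN layers against bulk statistics of a chaotic model where backpropagation through long simulations would be impractical.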

Citation (APA)

Pahlavan, H. A., Hassanzadeh, P., & Alexander, M. J. (2024). Explainable Offline-Online Training of Neural Networks for Parameterizations: A 1D Gravity Wave-QBO Testbed in the Small-Data Regime. Geophysical Research Letters, 51(2). https://doi.org/10.1029/2023GL106324
