Data required to calibrate uncertain general circulation model (GCM) parameterizations are often only available in limited regions or time periods, for example, observational data from field campaigns or data generated in local high‐resolution simulations. This raises the question of where and when to acquire additional data to be maximally informative about parameterizations in a GCM. Here we construct a new ensemble‐based parallel algorithm to automatically target data acquisition to the regions and times that maximize the uncertainty reduction, or information gain, about GCM parameters. The algorithm uses a Bayesian framework that exploits a quantified distribution of GCM parameters as a measure of uncertainty. This distribution is informed by time‐averaged climate statistics restricted to local regions and times. The algorithm is embedded in the recently developed calibrate‐emulate‐sample framework, which performs efficient model calibration and uncertainty quantification with far fewer model evaluations than are typically needed for traditional approaches to Bayesian calibration. We demonstrate the algorithm with an idealized GCM, with which we generate surrogates of local data. In this perfect‐model setting, we calibrate parameters and quantify uncertainties in a quasi‐equilibrium convection scheme in the GCM. We consider targeted data that are (a) localized in space for statistically stationary simulations and (b) localized in space and time for seasonally varying simulations. In these proof‐of‐concept applications, the calculated information gain reflects the reduction in parametric uncertainty obtained from Bayesian inference when harnessing a targeted sample of data. The largest information gain typically, but not always, results from regions near the intertropical convergence zone.

Climate models depend on dynamics across many spatial and temporal scales. It is infeasible to resolve all of these scales.
Instead, the physics at the smallest scales is represented by parameterization schemes that link what is unresolvable to variables resolved on the grid scale. A dominant source of uncertainty in climate predictions comes from uncertainty in calibrating empirical parameters in such parameterization schemes, and these uncertainties are generally not quantified. The uncertainties can be reduced and quantified with data that may have limited availability in space and time, for example, data from field campaigns or from targeted high‐resolution simulations in limited areas. But the sensitivity of simulated climate statistics, such as precipitation rates, to parameterizations varies in space and time, raising the question of where and when to acquire additional data so as to optimize the information gain from the data. Here we construct an automated algorithm that finds optimal regions and time periods for such data acquisition, to maximize the information the data provide about uncertain parameters. In proof‐of‐concept simulations with an idealized global atmosphere model, we show that our algorithm successfully identifies the informative regions and times, even in cases where physics‐based intuition may lead to sub‐optimal choices.

- Climate models can be calibrated with targeted data, for example, from limited‐area high‐resolution simulations
- We propose an algorithm for choosing target sites for data acquisition that are maximally informative about climate model parameters
- The algorithm is benchmarked in an idealized aquaplanet general circulation model
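To make the selection criterion concrete, the sketch below ranks candidate data sites by an ensemble‐based estimate of information gain, computed as the Kullback–Leibler divergence between Gaussian approximations of the prior and posterior parameter distributions. This is a minimal illustration, not the paper's implementation: the site names, ensemble sizes, and synthetic "posterior" ensembles are assumptions made for demonstration only.

```python
import numpy as np

def gaussian_kl(mean_post, cov_post, mean_prior, cov_prior):
    """KL divergence KL(posterior || prior) between multivariate Gaussians."""
    d = len(mean_prior)
    cov_prior_inv = np.linalg.inv(cov_prior)
    diff = mean_prior - mean_post
    return 0.5 * (
        np.trace(cov_prior_inv @ cov_post)
        + diff @ cov_prior_inv @ diff
        - d
        + np.log(np.linalg.det(cov_prior) / np.linalg.det(cov_post))
    )

def information_gain(prior_samples, posterior_samples):
    """Estimate information gain from parameter ensembles via Gaussian moments."""
    m_prior, c_prior = prior_samples.mean(0), np.cov(prior_samples.T)
    m_post, c_post = posterior_samples.mean(0), np.cov(posterior_samples.T)
    return gaussian_kl(m_post, c_post, m_prior, c_prior)

# Hypothetical example: rank two candidate sites by how strongly the data
# acquired there would constrain a 2-parameter ensemble.
rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, size=(500, 2))               # broad prior ensemble
posteriors = {
    "tropics":     rng.normal(0.2, 0.3, size=(500, 2)),   # strong constraint
    "midlatitude": rng.normal(0.1, 0.8, size=(500, 2)),   # weak constraint
}
gains = {site: information_gain(prior, post) for site, post in posteriors.items()}
best_site = max(gains, key=gains.get)                     # site with largest gain
```

Here the more sharply constrained "tropics" ensemble yields the larger information gain, mirroring the paper's finding that regions near the intertropical convergence zone are typically, though not always, the most informative.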
CITATION STYLE
Dunbar, O. R. A., Howland, M. F., Schneider, T., & Stuart, A. M. (2022). Ensemble‐Based Experimental Design for Targeting Data Acquisition to Inform Climate Models. Journal of Advances in Modeling Earth Systems, 14(9). https://doi.org/10.1029/2022ms002997