Sensitivity, Calibration, Validation, Verification

Abstract

Formulating the model equations and getting them to run is only the beginning of the modeling process. First, the model output must be made to represent the data as well as possible. This is achieved in part by tweaking the parameters of the model, a process known as model calibration. Next, we need to check that the model really does what it was designed to do. This model testing may involve various procedures and stages, some of which are called validation and verification. For example, we may want to double-check that the model is based on correct assumptions, that the code has no bugs, and that the output is properly presented and interpreted; this is the model verification stage. Alternatively, we may want to run the model on an independent set of input data and see how it performs, which in some cases is called validation. There is still some confusion in terminology, and the words validation and verification are sometimes used interchangeably. In any case, these are extremely important stages of model analysis, required to establish the quality of the model; however, no formal method of model analysis should be overrated in determining the model's usability. After all, a model is good as long as it helps achieve the goals of the project. The overall model performance matters more than how well the model did on individual tests and comparisons.
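The calibration-then-validation workflow described above can be sketched in a few lines of code. The example below is purely illustrative, not from the article: it assumes a hypothetical one-parameter decay model y(t) = exp(-k t), calibrates k against one data set by minimizing the sum of squared errors, and then checks the calibrated model against an independent data set.

```python
# Minimal sketch of calibration and validation for a hypothetical
# one-parameter model y(t) = exp(-k * t). Data values are illustrative.
import math

def model(k, t):
    return math.exp(-k * t)

def sse(k, data):
    # Sum of squared errors between model output and observations.
    return sum((model(k, t) - y) ** 2 for t, y in data)

# Calibration data (t, y): tweak k so the model reproduces these points.
calib = [(0.0, 1.00), (1.0, 0.61), (2.0, 0.37), (3.0, 0.22)]

# Calibration: simple grid search over candidate parameter values.
best_k = min((k / 1000 for k in range(1, 2001)), key=lambda k: sse(k, calib))

# Validation: run the calibrated model on an independent data set
# and see how large the error is there.
valid = [(0.5, 0.78), (1.5, 0.47), (2.5, 0.29)]
valid_error = sse(best_k, valid)
print(best_k, valid_error)
```

In practice the grid search would be replaced by a proper optimizer and the error measure chosen to suit the data, but the two-stage logic — fit on one data set, test on another — is the same.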

Cite

Voinov, A. A. (2008). Sensitivity, Calibration, Validation, Verification. In Encyclopedia of Ecology, Five-Volume Set (Vol. 1–5, pp. 3221–3227). Elsevier. https://doi.org/10.1016/B978-008045405-4.00238-X
