
13.2 Model calibration, validation, and uncertainty analysis


Model calibration is crucial for improving hydrologic simulations. By adjusting parameters and using optimization algorithms, we can minimize differences between simulated and observed values. This process involves selecting objective functions, defining parameter ranges, and evaluating model performance.

Validation and uncertainty analysis are essential for assessing model reliability. Validation tests the calibrated model on independent data, while uncertainty analysis quantifies the impact of input and parameter variations. These steps help communicate model limitations and guide future improvements.

Model Calibration

Process of model calibration

  • Adjust model parameters to improve agreement between simulated and observed hydrologic variables (streamflow, groundwater levels)
    • Objective functions quantify difference between simulated and observed values
      • Root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), percent bias (PBIAS)
    • Optimization algorithms search parameter space to find best set of parameters that minimize objective function
      • Gradient-based methods (Levenberg-Marquardt), global search methods (genetic algorithms, particle swarm optimization)
    • Performance metrics assess goodness-of-fit between simulated and observed values
      • Coefficient of determination ($R^2$), NSE, RMSE, PBIAS
  • The calibration process involves the following steps (a minimal sketch follows the list):
    1. Select appropriate objective functions and performance metrics based on study objectives and available data
    2. Define reasonable ranges for model parameters based on prior knowledge or literature values
    3. Run optimization algorithm to find best set of parameters that minimize objective function
    4. Evaluate calibrated model's performance using performance metrics and visual inspection of simulated and observed hydrographs
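
The four steps above can be sketched in a few lines of Python. The linear-reservoir model, the parameter names `k` and `frac`, and the synthetic rainfall and flow series below are hypothetical placeholders rather than a prescribed method; the point is the pattern of choosing an objective function, bounding the parameters, running an optimizer, and checking the resulting fit.

```python
import numpy as np
from scipy.optimize import differential_evolution

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, values below 0 are worse than the mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(sim, obs):
    """Root mean square error, in the units of the observations."""
    return np.sqrt(np.mean((sim - obs) ** 2))

def pbias(sim, obs):
    """Percent bias (sign convention varies across references)."""
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def simulate(params, rain):
    """Hypothetical two-parameter linear-reservoir runoff model (illustrative only)."""
    k, frac = params              # storage coefficient, runoff fraction
    storage, flow = 0.0, []
    for p in rain:
        storage += frac * p
        q = storage / k
        storage -= q
        flow.append(q)
    return np.array(flow)

def objective(params, rain, obs):
    """Minimize 1 - NSE so the optimizer effectively maximizes NSE."""
    return 1.0 - nse(simulate(params, rain), obs)

# Synthetic forcing and "observed" flow stand in for real data here.
rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 3.0, size=365)
obs = simulate([5.0, 0.4], rain) + rng.normal(0, 0.1, size=365)

# Steps 2-3: define parameter ranges and run the optimizer.
bounds = [(1.0, 20.0), (0.05, 0.95)]
result = differential_evolution(objective, bounds, args=(rain, obs), seed=1)

# Step 4: evaluate the calibrated model with the performance metrics.
sim = simulate(result.x, rain)
print(f"best parameters: {result.x}")
print(f"NSE={nse(sim, obs):.3f}  RMSE={rmse(sim, obs):.3f}  PBIAS={pbias(sim, obs):.2f}%")
```

Differential evolution is used here simply as one readily available global optimizer; a gradient-based or other heuristic method plugs into the same pattern.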

Model Validation and Uncertainty Analysis

Model validation with independent datasets

  • Assess performance of calibrated model on independent dataset not used in calibration process
    • Split-sample test: divide available data into calibration and validation periods (see the sketch after this list)
    • Proxy-basin test: apply calibrated model to similar watershed with comparable characteristics (area, land use, climate)
  • Use validation metrics similar to calibration process ($R^2$, NSE, RMSE, PBIAS)
  • Well-performing model should have similar performance metrics for both calibration and validation periods
  • Poor performance during validation may indicate model overfitting, inadequate model structure, or differences in watershed characteristics between calibration and validation periods
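
A split-sample check can be as simple as computing the same metric over two non-overlapping periods. The synthetic observed and simulated series below are placeholders for real gauge data and calibrated model output.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Stand-ins for a real observed record and the calibrated model's output;
# in practice these come from gauge data and the model run.
rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 5.0, size=730)           # two years of daily flow
sim = obs + rng.normal(0, 1.0, size=730)      # "simulated" flow with some error

# Split-sample test: first 70% of the record for calibration, the rest for validation.
split = int(0.7 * len(obs))
nse_cal = nse(sim[:split], obs[:split])
nse_val = nse(sim[split:], obs[split:])

print(f"NSE (calibration period): {nse_cal:.3f}")
print(f"NSE (validation period):  {nse_val:.3f}")
# A large drop from calibration to validation suggests overfitting or a
# structural problem rather than a parameter set that generalizes.
```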

Uncertainty quantification in models

  • Sensitivity analysis assesses the impact of changes in model inputs or parameters on model outputs
    • Local (one-at-a-time) sensitivity analysis: vary one parameter at a time while keeping others constant
    • Global sensitivity analysis: vary multiple parameters simultaneously to assess their relative importance
  • Monte Carlo simulation propagates uncertainties in model inputs and parameters through the model to obtain a distribution of model outputs (see the sketch after this list)
    • Randomly sample input and parameter values from respective probability distributions
    • Run model multiple times using sampled values to generate distribution of model outputs
  • Bayesian inference combines prior knowledge about model parameters with observed data to estimate posterior parameter distributions
    • Markov chain Monte Carlo (MCMC) methods sample from the posterior distribution to characterize parameter uncertainties
  • Assess uncertainty in model structure by comparing performance of different model formulations or using multi-model ensemble approaches
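
As a rough illustration of Monte Carlo propagation, the sketch below samples two parameters of a hypothetical linear-reservoir model from assumed distributions and records the spread of a summary output; the model, the parameter ranges, and the choice of output are illustrative only.

```python
import numpy as np

def simulate(k, frac, rain):
    """Same hypothetical linear-reservoir model sketched above (illustrative only)."""
    storage, flow = 0.0, []
    for p in rain:
        storage += frac * p
        q = storage / k
        storage -= q
        flow.append(q)
    return np.array(flow)

rng = np.random.default_rng(42)
rain = rng.gamma(2.0, 3.0, size=365)

# Assumed parameter distributions; in practice these come from calibration
# results or literature ranges, not from thin air.
n_samples = 1000
k_samples = rng.uniform(3.0, 8.0, n_samples)        # storage coefficient
frac_samples = rng.normal(0.4, 0.05, n_samples)     # runoff fraction

# Propagate the sampled parameters through the model; here the mean flow over
# the simulated year is the output of interest.
outputs = np.array([simulate(k, f, rain).mean()
                    for k, f in zip(k_samples, frac_samples)])

print(f"mean output: {outputs.mean():.2f}")
print(f"5th-95th percentile range: {np.percentile(outputs, [5, 95])}")
```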

Interpretation of uncertainty analysis

  • Uncertainty analysis provides range of plausible model outcomes rather than single deterministic prediction
  • Consider measures of central tendency (mean, median) and spread (standard deviation, interquartile range) of the model output distribution for decision-making
  • Construct confidence intervals or prediction bands to communicate the uncertainty associated with model predictions (see the sketch after this list)
  • Reliability of model predictions depends on quality and quantity of input data, appropriateness of model assumptions, and robustness of calibration and validation process
  • Clearly communicate limitations of model (inability to capture certain processes, applicable spatial and temporal scales, potential sources of uncertainty not accounted for)
  • Use uncertainty analysis to guide future data collection efforts and model improvements by identifying most influential sources of uncertainty
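
A median trace plus percentile-based bands is one simple way to report such an ensemble; the array below is random placeholder data standing in for the output of a Monte Carlo or MCMC run.

```python
import numpy as np

# Hypothetical (n_samples, n_timesteps) array of simulated hydrographs,
# standing in for the ensemble produced by the sketch above.
rng = np.random.default_rng(7)
ensemble = rng.lognormal(mean=1.0, sigma=0.3, size=(1000, 30))

central = np.median(ensemble, axis=0)                     # central tendency
lower, upper = np.percentile(ensemble, [5, 95], axis=0)   # 90% prediction band

for t in range(0, 30, 10):
    print(f"day {t:2d}: median={central[t]:.2f}, 90% band=[{lower[t]:.2f}, {upper[t]:.2f}]")
```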