Edward Wheatcroft, London School of Economics
The Central Role of Data Assimilation in Probabilistic Forecasting
Abstract: The quality of a data assimilation scheme (including ensemble formation) is reflected directly in the quality of the resulting probabilistic forecast.
Data assimilation schemes which yield ensemble members consistent with the long-term dynamics of the model allow a more effective use of resources and are expected to yield better probabilistic forecasts than those which do not.
In the perfect model setting, Ensemble Kalman Filter approaches are hampered by assumptions of linearity whilst variational approaches are likely to fail due to the existence of local minima.
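As a minimal illustration of the linear, Gaussian update that underlies the assumptions referred to above, a stochastic Ensemble Kalman Filter analysis step might be sketched as follows (the function names and toy dimensions are illustrative assumptions, not taken from the abstract):

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One stochastic EnKF analysis step (a hypothetical sketch).

    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observation;
    H: linear observation operator; R: observation error covariance.
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)              # ensemble anomalies
    Pf = A @ A.T / (n_ens - 1)                         # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)     # Kalman gain
    # perturbed observations, one draw per ensemble member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
    return X + K @ (Y - H @ X)                         # linear analysis update
```

The update is linear in the innovation `Y - H @ X`; it is exactly this linearity, together with the Gaussian sampling of `Pf`, that limits such schemes in strongly nonlinear settings.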
These constraints remain when the model is imperfect, whilst neither approach provides a consistent treatment of model error.
Pseudo-orbit data assimilation (PDA) provides an attractive, alternative approach to data assimilation by allowing an enhanced balance between the information contained in the dynamic equations and the observations.
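A stylised sketch of the PDA idea, with a one-dimensional logistic map standing in for the model (the map, the learning rate, and all names here are illustrative assumptions): the pseudo-orbit is initialised at the observations and gradient descent is run on the total one-step model mismatch, trading off fidelity to the observations against consistency with the dynamics.

```python
import numpy as np

def f(x):            # toy model dynamics: the logistic map (an illustrative stand-in)
    return 4.0 * x * (1.0 - x)

def f_prime(x):      # its derivative, needed for the gradient
    return 4.0 - 8.0 * x

def mismatch_cost(u):
    """Total squared one-step model mismatch along a candidate pseudo-orbit u."""
    return float(np.sum((u[1:] - f(u[:-1])) ** 2))

def pda(obs, n_iter=5000, lr=0.01):
    """Gradient descent on the mismatch cost, initialised at the observations."""
    u = obs.astype(float).copy()
    for _ in range(n_iter):
        r = u[1:] - f(u[:-1])                  # one-step model mismatches
        g = np.zeros_like(u)
        g[1:] += 2.0 * r                       # d cost / d u_{i+1}
        g[:-1] -= 2.0 * f_prime(u[:-1]) * r    # d cost / d u_i
        u -= lr * g
    return u
```

Stopping the descent early leaves a pseudo-orbit that is close to, but not exactly on, a model trajectory; this is one way the balance between dynamics and observations can be controlled.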
It is shown that PDA can yield more skilful probabilistic forecasts than some traditional approaches in both the perfect and imperfect model scenarios.
Operational forecasts are often based only on the most recently launched ensemble, without consideration of ensembles launched previously; the question of whether sequentially launched forecasts of the same target time can be combined to yield more skilful forecasts is considered.
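Under the Bayesian view of this combination question, and assuming purely for illustration that the two sequential forecasts of the same target are independent Gaussian densities, the combined forecast is their normalised product, in which precisions add:

```python
def combine_gaussians(m1, v1, m2, v2):
    """Bayesian product of two independent Gaussian forecast densities
    N(m1, v1) and N(m2, v2) for the same target time: the precisions add,
    and the combined mean is the precision-weighted average of the means."""
    p1, p2 = 1.0 / v1, 1.0 / v2        # precisions
    v = 1.0 / (p1 + p2)                # combined variance (always smaller)
    return v * (p1 * m1 + p2 * m2), v
```

The combined density is always sharper than either input; this sharpening is only justified when the forecast densities really are probabilities of the system, which foreshadows why the Bayesian approach can misfire under model imperfection.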
When the forecasts represent actual probabilities of the system, a Bayesian approach would be optimal. It is shown that, whilst the Bayesian approach is indeed effective in the perfect model scenario, it can be counterproductive when the model contains structural imperfection. This is demonstrated in the context of the Lorenz '63 system.
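The Lorenz '63 system used as the test bed above, with its standard parameters, can be integrated with a simple fixed-step fourth-order Runge-Kutta scheme (the step size and function names are illustrative choices, not taken from the abstract):

```python
import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz '63 equations (standard parameters)."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = lorenz63(s)
    k2 = lorenz63(s + 0.5 * dt * k1)
    k3 = lorenz63(s + 0.5 * dt * k2)
    k4 = lorenz63(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def trajectory(s0, n_steps, dt=0.01):
    """Integrate n_steps forward from s0, returning the full orbit."""
    out = np.empty((n_steps + 1, 3))
    out[0] = s0
    for i in range(n_steps):
        out[i + 1] = rk4_step(out[i], dt)
    return out
```

A structurally imperfect model of this system can be obtained, for example, by perturbing the parameters or truncating a term, which is the kind of mismatch under which the Bayesian combination above can become counterproductive.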
Here again the data assimilation scheme plays a key role, in both the perfect and imperfect model scenarios. Alternatives to a Bayesian approach can be shown to yield increased skill under pseudo-orbit data assimilation.