Diagnosis and Optimization of Ensemble Forecasts

2008 ◽  
Vol 136 (3) ◽  
pp. 1054-1074 ◽  
Author(s):  
Tomislava Vukicevic ◽  
Isidora Jankov ◽  
John McGinley

Abstract In the current study, a technique that offers a way to evaluate ensemble forecast uncertainties produced either by initial conditions or different model versions, or both, is presented. The technique consists of first diagnosing the performance of the forecast ensemble and then optimizing the ensemble forecast using results of the diagnosis. The technique is based on the explicit evaluation of probabilities that are associated with the Gaussian stochastic representation of the weather analysis and forecast. It combines an ensemble technique for evaluating the analysis error covariance and the standard Monte Carlo approach for computing samples from a known Gaussian distribution. The technique was demonstrated in a tutorial manner on two relatively simple examples to illustrate the impact of ensemble characteristics including ensemble size, various observation strategies, and configurations including different model versions and varying initial conditions. In addition, the authors assessed improvements in the consensus forecasts gained by optimal weighting of the ensemble members based on time-varying, prior-probabilistic skill measures. The results with different observation configurations indicate that, as observations become denser, there is a need for larger-sized ensembles and/or more accuracy among individual members for the ensemble forecast to exhibit prediction skill. The main conclusions relative to ensembles built up with different physics configurations were, first, that almost all members typically exhibited some skill at some point in the model run, suggesting that all should be retained to acquire the best consensus forecast; and, second, that the normalized probability metric can be used to determine what sets of weights or physics configurations are performing best. 
A comparison of forecasts derived from a simple ensemble mean to forecasts from a mean developed from variably weighting the ensemble members based on prior performance by the probabilistic measure showed that the latter had substantially reduced mean absolute error. The study also indicates that a weighting scheme that utilized more prior cycles showed additional reduction in forecast error.
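
The weighting scheme summarized above lends itself to a compact sketch. Below is a minimal Python illustration of a skill-weighted consensus forecast, assuming simple inverse mean-absolute-error weights computed from prior cycles rather than the authors' normalized probability metric; all names and values are illustrative:

```python
import numpy as np

def skill_weighted_consensus(forecasts, prior_errors):
    """Combine ensemble member forecasts using weights derived from
    prior-cycle performance (here: inverse mean absolute error).

    forecasts    : (n_members,) current-cycle forecasts
    prior_errors : (n_members, n_cycles) absolute errors from past cycles
    """
    mae = prior_errors.mean(axis=1)          # per-member skill over prior cycles
    weights = 1.0 / np.maximum(mae, 1e-12)   # better members get larger weights
    weights /= weights.sum()                 # normalize to sum to one
    return np.dot(weights, forecasts)

# Example: three members; member 0 has been most accurate in prior cycles
forecasts = np.array([20.0, 22.0, 25.0])
prior_errors = np.array([[0.5, 0.4], [1.0, 1.2], [2.0, 2.5]])
consensus = skill_weighted_consensus(forecasts, prior_errors)
# consensus is pulled toward member 0, away from the simple mean
```

With more prior cycles available, the weight estimates stabilize, consistent with the additional error reduction the study reports for weighting schemes that use longer training windows.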

2014 ◽  
Vol 142 (12) ◽  
pp. 4519-4541 ◽  
Author(s):  
Glen S. Romine ◽  
Craig S. Schwartz ◽  
Judith Berner ◽  
Kathryn R. Fossell ◽  
Chris Snyder ◽  
...  

Abstract Ensembles provide an opportunity to greatly improve short-term prediction of local weather hazards, yet generating reliable predictions remains a significant challenge. In particular, convection-permitting ensemble forecast systems (CPEFSs) have persistent problems with underdispersion. Representing initial and/or lateral boundary condition uncertainty along with forecast model error provides a foundation for building a more dependable CPEFS, but the best practice for ensemble system design is not well established. Several configurations of CPEFSs are examined where ensemble forecasts are nested within a larger domain, drawing initial conditions from a downscaled, continuously cycled, ensemble data assimilation system that provides state-dependent initial condition uncertainty. The control ensemble forecast, with initial condition uncertainty only, is skillful but underdispersive. To improve the reliability of the ensemble forecasts, the control ensemble is supplemented with 1) perturbed lateral boundary conditions, or with model error representation using either 2) stochastic kinetic energy backscatter or 3) stochastically perturbed parameterization tendencies. Forecasts are evaluated against stage IV accumulated precipitation analyses and radiosonde observations. Perturbed ensemble forecasts are also compared to the control forecast to assess the relative impact from adding forecast perturbations. For precipitation forecasts, all perturbation approaches improve ensemble reliability relative to the control CPEFS. Deterministic ensemble member forecast skill, verified against radiosonde observations, decreases when forecast perturbations are added, while ensemble mean forecasts remain similarly skillful to the control.
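
Of the model-error representations listed, the SPPT idea in particular is easy to sketch: each physics tendency is multiplied by one plus a bounded random pattern. The toy version below uses spatially white noise, whereas operational SPPT uses a spatially and temporally correlated pattern; the parameter values are assumptions for illustration:

```python
import numpy as np

def sppt_tendency(tendency, rng, sigma=0.3, clip=0.9):
    """Stochastically perturbed parameterization tendencies (SPPT), sketched:
    the physics tendency is scaled by (1 + r), where r is a bounded random
    pattern. Here r is white in space for brevity (assumed simplification);
    operational schemes use correlated patterns."""
    r = np.clip(rng.normal(0.0, sigma, size=tendency.shape), -clip, clip)
    return (1.0 + r) * tendency

rng = np.random.default_rng(3)
t = np.ones((100, 100))           # a uniform dummy tendency field
p = sppt_tendency(t, rng)
# the perturbed field keeps the mean tendency but adds member-to-member spread
```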


2007 ◽  
Vol 135 (4) ◽  
pp. 1424-1438 ◽  
Author(s):  
Andrew R. Lawrence ◽  
James A. Hansen

Abstract An ensemble-based data assimilation approach is used to transform old ensemble forecast perturbations with more recent observations for the purpose of inexpensively increasing ensemble size. The impact of the transformations is propagated forward in time over the ensemble’s forecast period without rerunning any models, and these transformed ensemble forecast perturbations can be combined with the most recent ensemble forecast to sensibly increase forecast ensemble sizes. Because the transform takes place in perturbation space, the transformed perturbations must be centered on the ensemble mean from the most recent forecasts. Thus, the benefit of the approach is in terms of improved ensemble statistics rather than improvements in the mean. Larger ensemble forecasts can be used for numerous purposes, including probabilistic forecasting, targeted observations, and to provide boundary conditions to limited-area models. This transformed lagged ensemble forecasting approach is explored and is shown to give positive results in the context of a simple chaotic model. By incorporating a suitable perturbation inflation factor, the technique was found to generate forecast ensembles whose skill was statistically comparable to those produced by adding nonlinear model integrations. Implications for ensemble forecasts generated by numerical weather prediction models are briefly discussed, including multimodel ensemble forecasting.
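
A stripped-down version of the recentering step can be sketched as follows. Note that this omits the observation-based transform that the paper applies to the lagged perturbations, keeping only the recentering and inflation; names and values are illustrative:

```python
import numpy as np

def augment_with_lagged_perturbations(current_ens, lagged_ens, inflation=1.1):
    """Enlarge an ensemble by recycling perturbations from an older (lagged)
    forecast ensemble valid at the same time, recentered on the current
    ensemble mean. A minimal sketch: the paper also transforms the lagged
    perturbations with recent observations before recentering.

    current_ens : (n_cur, n_state) current forecast ensemble
    lagged_ens  : (n_lag, n_state) older forecast ensemble
    """
    mean_cur = current_ens.mean(axis=0)
    lagged_pert = lagged_ens - lagged_ens.mean(axis=0)   # move to perturbation space
    recycled = mean_cur + inflation * lagged_pert        # recenter and inflate
    return np.vstack([current_ens, recycled])

rng = np.random.default_rng(0)
cur = rng.normal(0.0, 1.0, size=(10, 3))
lag = rng.normal(0.5, 1.5, size=(10, 3))
big = augment_with_lagged_perturbations(cur, lag)
# big has 20 members; the recycled half is centered on the current mean,
# so the augmentation improves spread statistics, not the mean itself
```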


2010 ◽  
Vol 138 (7) ◽  
pp. 2930-2952 ◽  
Author(s):  
Andrea Alessandri ◽  
Andrea Borrelli ◽  
Simona Masina ◽  
Annalisa Cherchi ◽  
Silvio Gualdi ◽  
...  

Abstract The development of the Istituto Nazionale di Geofisica e Vulcanologia (INGV)–Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC) Seasonal Prediction System (SPS) is documented. In this SPS, the estimation of the ocean initial conditions includes a reduced-order optimal interpolation procedure for the assimilation of temperature and salinity profiles at the global scale. Nine-member ensemble forecasts have been produced for the period 1991–2003 for two starting dates per year in order to assess the impact of the subsurface assimilation in the ocean for initialization. Comparing the results with control simulations (i.e., without assimilation of subsurface profiles during ocean initialization), it is shown that the improved ocean initialization increases the skill in the prediction of tropical Pacific sea surface temperatures of the system for boreal winter forecasts. Considering the forecast of the 1997/98 El Niño, the data assimilation in the ocean initial conditions leads to a considerable improvement in the representation of its onset and development. The results presented in this paper indicate a better prediction of global-scale surface climate anomalies for the forecasts started in November, probably because of the improvement in the tropical Pacific. For boreal winter, significant increases in the capability of the system to discriminate above-normal and below-normal temperature anomalies are shown in both the tropics and extratropics.


2009 ◽  
Vol 137 (10) ◽  
pp. 3388-3406 ◽  
Author(s):  
Ryan D. Torn ◽  
Gregory J. Hakim

Abstract An ensemble Kalman filter based on the Weather Research and Forecasting (WRF) model is used to generate ensemble analyses and forecasts for the extratropical transition (ET) events associated with Typhoons Tokage (2004) and Nabi (2005). Ensemble sensitivity analysis is then used to evaluate the relationship between forecast errors and initial condition errors at the onset of transition, and to objectively determine the observations having the largest impact on forecasts of these storms. Observations from rawinsondes, surface stations, aircraft, cloud winds, and cyclone best-track position are assimilated every 6 h for a period before, during, and after transition. Ensemble forecasts initialized at the onset of transition exhibit skill similar to the operational Global Forecast System (GFS) forecast and to a WRF forecast initialized from the GFS analysis. WRF ensemble forecasts of Tokage (Nabi) are characterized by relatively large (small) ensemble variance and greater (smaller) sensitivity to the initial conditions. In both cases, the 48-h forecast of cyclone minimum SLP and the RMS forecast error in SLP are most sensitive to the tropical cyclone position and to midlatitude troughs that interact with the tropical cyclone during ET. Diagnostic perturbations added to the initial conditions based on ensemble sensitivity reduce the error in the storm minimum SLP forecast by 50%. Observation impact calculations indicate that assimilating approximately 40 observations in regions of greatest initial condition sensitivity produces a large, statistically significant impact on the 48-h cyclone minimum SLP forecast. For the Tokage forecast, assimilating the single highest impact observation, an upper-tropospheric zonal wind observation from a Mongolian rawinsonde, yields 48-h forecast perturbations in excess of 10 hPa and 60 m in SLP and 500-hPa height, respectively.
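
The ensemble sensitivity used here is commonly computed as a univariate linear regression of a scalar forecast metric onto each initial-condition variable across the ensemble members. A minimal sketch with synthetic data; the variable names and values are illustrative, not the study's configuration:

```python
import numpy as np

def ensemble_sensitivity(J, X):
    """Ensemble sensitivity of a scalar forecast metric J to each
    initial-condition variable, estimated as the regression coefficient
    cov(J, x_i) / var(x_i) across ensemble members.

    J : (n_members,) forecast metric (e.g. 48-h cyclone minimum SLP)
    X : (n_members, n_state) initial-condition values per member
    """
    Jp = J - J.mean()
    Xp = X - X.mean(axis=0)
    cov = Xp.T @ Jp / (len(J) - 1)       # covariance of J with each variable
    var = Xp.var(axis=0, ddof=1)         # ensemble variance of each variable
    return cov / var

# Synthetic ensemble: the metric depends linearly on the first variable
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
J = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=50)
sens = ensemble_sensitivity(J, X)
# sens[0] recovers the imposed slope; the other entries are near zero
```

Perturbing the initial conditions along the most sensitive directions, as the study does diagnostically, then directly targets the largest contributors to forecast error.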


2009 ◽  
Vol 22 (10) ◽  
pp. 2526-2540 ◽  
Author(s):  
Li Shi ◽  
Oscar Alves ◽  
Harry H. Hendon ◽  
Guomin Wang ◽  
David Anderson

Abstract The impact of stochastic intraseasonal variability on the onset of the 1997/98 El Niño was examined using a large ensemble of forecasts starting on 1 December 1996, produced using the Australian Bureau of Meteorology Predictive Ocean Atmosphere Model for Australia (POAMA) seasonal forecast coupled model. This coupled model has a reasonable simulation of El Niño and the Madden–Julian oscillation, so it provides an ideal framework for investigating the interaction between the MJO and El Niño. The experiment was designed so that the ensemble spread was simply a result of internal stochastic variability that is generated during the forecast. For the initial conditions used here, all forecasts led to warm El Niño–type conditions with the amplitude of the warming varying from 0.5° to 2.7°C in the Niño-3.4 region. All forecasts developed an MJO event during the first 4 months, indicating that perhaps the background state favored MJO development. However, the details of the MJOs that developed during December 1996–March 1997 had a significant impact on the subsequent strength of the El Niño event. In particular, the forecasts with the initial MJOs that extended farther into the central Pacific, on average, led to a stronger El Niño, with the westerly winds in the western Pacific associated with the MJO leading the development of SST and thermocline anomalies in the central and eastern Pacific. These results imply a limit to the accuracy with which the strength of El Niño can be predicted because the details of individual MJO events matter. To represent realistic uncertainty, coupled models should be able to represent the MJO, including its propagation into the central Pacific so that forecasts produce sufficient ensemble spread.


2020 ◽  
Author(s):  
Sam Allen ◽  
Christopher Ferro ◽  
Frank Kwasniok

Abstract A number of realizations of one or more numerical weather prediction (NWP) models, initialised at a variety of initial conditions, compose an ensemble forecast. These forecasts exhibit systematic errors and biases that can be corrected by statistical post-processing. Post-processing yields calibrated forecasts by analysing the statistical relationship between historical forecasts and their corresponding observations. This article aims to extend post-processing methodology to incorporate atmospheric circulation. The circulation, or flow, is largely responsible for the weather that we experience and it is hypothesized here that relationships between the NWP model and the atmosphere depend upon the prevailing flow. Numerous studies have focussed on the tendency of this flow to reduce to a set of recognisable arrangements, known as regimes, which recur and persist at fixed geographical locations. This dynamical phenomenon allows the circulation to be categorized into a small number of regime states. In a highly idealized model of the atmosphere, the Lorenz ‘96 system, ensemble forecasts are subjected to well-known post-processing techniques conditional on the system's underlying regime. Two different variables, one of the state variables and one related to the energy of the system, are forecasted and considerable improvements in forecast skill upon standard post-processing are seen when the distribution of the predictand varies depending on the regime. Advantages of this approach and its inherent challenges are discussed, along with potential extensions for operational forecasters.
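
The regime-conditional idea can be sketched with the simplest possible post-processing, an additive bias correction fitted separately per regime. Real schemes, including the well-known techniques applied in the article, also calibrate the ensemble spread; the data and biases below are synthetic assumptions:

```python
import numpy as np

def regime_conditional_bias(train_fc, train_obs, train_regime, n_regimes):
    """Estimate an additive forecast bias separately for each circulation
    regime from historical forecast/observation pairs (a minimal sketch of
    regime-conditional post-processing)."""
    return np.array([
        (train_fc[train_regime == r] - train_obs[train_regime == r]).mean()
        for r in range(n_regimes)
    ])

def postprocess(forecast, regime, bias_per_regime):
    """Correct a new forecast using the bias of its prevailing regime."""
    return forecast - bias_per_regime[regime]

# Synthetic training data: regime-0 forecasts run 1.0 too warm,
# regime-1 forecasts 0.5 too cold (assumed values for illustration)
rng = np.random.default_rng(2)
regime = rng.integers(0, 2, size=200)
obs = rng.normal(size=200)
fc = obs + np.where(regime == 0, 1.0, -0.5) + rng.normal(scale=0.1, size=200)
bias = regime_conditional_bias(fc, obs, regime, 2)
corrected = postprocess(fc, regime, bias)
# a single unconditional bias correction would leave both regimes miscalibrated
```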


2019 ◽  
Vol 76 (9) ◽  
pp. 2653-2672 ◽  
Author(s):  
John R. Lawson

Abstract Thunderstorms are difficult to predict because of their small length scale and fast predictability destruction. A cell’s predictability is constrained by properties of the flow in which it is embedded (e.g., vertical wind shear), and associated instabilities (e.g., convective available potential energy). To assess how predictability of thunderstorms changes with environment, two groups of 780 idealized simulations (each using a different microphysics scheme) were performed over a range of buoyancy and shear profiles. Results were not sensitive to the scheme chosen. The gradient in diagnostics (updraft speed, storm speed, etc.) across shear–buoyancy phase space represents sensitivity to small changes in initial conditions: a proxy for inherent predictability. Storm evolution is split into two groups, separated by a U-shaped bifurcation in phase space, comprising 1) cells that continue strengthening after 1 h versus 2) those that weaken. Ensemble forecasts in regimes near this bifurcation are hence expected to have larger uncertainty, and adequate dispersion and reliability are essential. Predictability loss takes two forms: (i) chaotic error growth from the largest and most powerful storms, and (ii) tipping points at the U-shaped perimeter of the stronger storms. The former is associated with traditional forecast error between corresponding grid points, and is here counterintuitive; the latter is associated with object-based error, and matches the mental filtering performed by human forecasters for the convective scale.


2020 ◽  
Vol 148 (2) ◽  
pp. 849-855
Author(s):  
Fenwick C. Cooper ◽  
Peter D. Düben ◽  
Christophe Denis ◽  
Andrew Dawson ◽  
Peter Ashwin

Abstract We test the impact of changing numerical precision upon forecasts using the chaotic Lorenz’95 system. We find that, in comparison with discretization and numerical rounding errors, the dominant source of error is the initial condition error. These initial condition errors introduced into the Lorenz’95 system grow exponentially at a rate according to the leading Lyapunov exponent. Given this information we show that the number of bits necessary to represent the system state can be reduced linearly in time without significantly affecting forecast skill. This is in addition to any initial reduction in precision to that of the initial conditions and also implies the potential to reduce some storage costs. An approach to vary precision locally within simulations, guided by the direction of eigenvectors of the growth and decay of forecast error (the “singular vectors”), did not show a satisfying impact upon forecast skill in relation to cost savings that could be achieved with a uniform reduction of precision. The error in a selection of ECMWF forecasts as a function of the number of bits used to store them indicates that precision might also be reduced in operational systems.
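
The linear-in-time precision reduction follows from exponential error growth: once the initial-condition error has grown by a factor of 2^b, the last b significand bits are below the noise floor. A sketch with an assumed Lyapunov exponent; this is an illustration of the scaling argument, not the paper's exact formulation:

```python
import numpy as np

def bits_required(initial_bits, lyapunov, t):
    """Number of significand bits worth keeping at forecast time t.

    Initial-condition error grows as e(t) = e0 * exp(lyapunov * t), so a
    representation with b fewer bits (rounding error 2**b times larger)
    stays below the forecast error once lyapunov * t > b * ln(2). The
    required precision therefore decays linearly in time.
    """
    return np.maximum(initial_bits - lyapunov * t / np.log(2.0), 0.0)

# Leading Lyapunov exponent of a Lorenz '95-like system
# (assumed value for illustration)
lam = 1.7
t = np.linspace(0.0, 10.0, 6)
needed = bits_required(52, lam, t)   # starting from double-precision significand
# needed falls linearly with lead time until it reaches zero
```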


2017 ◽  
Vol 145 (7) ◽  
pp. 2479-2485 ◽  
Author(s):  
Thomas M. Hamill

A global reforecast dataset was recently created for the National Centers for Environmental Prediction’s Global Ensemble Forecast System (GEFS). This reforecast dataset consists of retrospective and real-time ensemble forecasts produced for the GEFS from 1985 to present day. An 11-member ensemble was produced once daily to +15-day lead time from 0000 UTC initial conditions. While the forecast model was stable during the production of this dataset, in 2011 and several times thereafter, there were significant changes to the forecast model that was used in the data assimilation system itself, as well as changes to the assimilation system and the observations that were assimilated. These changes resulted in substantial changes in the statistical characteristics of the reforecast dataset. Such changes make it challenging to uncritically use reforecasts for statistical postprocessing, which commonly assumes that forecast error and bias are approximately consistent from one year to the next. Ensuring the consistency in the statistical characteristics of past and present initial conditions is desirable but can be in tension with the expectation that prediction centers upgrade their forecast systems rapidly.


Author(s):  
Chin-Hung Chen ◽  
Kao-Shen Chung ◽  
Shu-Chih Yang ◽  
Li-Hsin Chen ◽  
Pay-Liam Lin ◽  
...  

Abstract A mesoscale convective system that occurred in southwestern Taiwan on 15 June 2008 is simulated using convection-allowing ensemble forecasts to investigate the forecast uncertainty associated with four microphysics schemes—the Goddard Cumulus Ensemble (GCE), Morrison (MOR), WRF single-moment 6-class (WSM6), and WRF double-moment 6-class (WDM6) schemes. First, the essential features of the convective structure, hydrometeor distribution, and microphysical tendencies for the different microphysics schemes are presented through deterministic forecasts. Second, ensemble forecasts with the same initial conditions are employed to estimate the forecast uncertainty produced by the different ensembles with the fixed microphysics scheme. GCE has the largest spread in most state variables due to its most efficient phase conversion between water species. By contrast, MOR results in the least spread. WSM6 and WDM6 have similar vertical spread structures due to their similar ice-phase formulae. However, WDM6 produces more ensemble spread than WSM6 does below the melting layer, resulting from its double-moment treatment of warm rain processes. The model simulations with the four microphysics schemes demonstrate upscale error growth through spectrum analysis of the root-mean difference total energy (RMDTE). The RMDTE results reveal that the GCE and WDM6 schemes are more sensitive to initial condition uncertainty, whereas the MOR and WSM6 schemes are relatively less sensitive to it for this event. Overall, the diabatic heating–cooling processes connect the convective-scale cloud microphysical processes to the large-scale dynamical and thermodynamical fields, and they significantly affect the forecast error signatures in the multiscale weather system.
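
Difference total energy is commonly computed per grid point as DTE = 0.5(Δu² + Δv² + (cp/Tr)ΔT²), with RMDTE its root mean over the domain. A minimal sketch; the constants and the exact definition used in this study are assumptions here:

```python
import numpy as np

CP = 1004.0   # specific heat of dry air at constant pressure (J kg-1 K-1)
TR = 270.0    # reference temperature (K); both values are assumed

def rmdte(u1, v1, t1, u2, v2, t2):
    """Root-mean difference total energy between two simulations:
    DTE = 0.5 * (du**2 + dv**2 + (CP/TR) * dT**2) at each grid point,
    then the square root of the domain mean (a common convention)."""
    du, dv, dT = u1 - u2, v1 - v2, t1 - t2
    dte = 0.5 * (du**2 + dv**2 + (CP / TR) * dT**2)
    return np.sqrt(dte.mean())

# Example: two fields that differ only by 1 m/s in the zonal wind
u1 = np.ones((10, 10)); v1 = np.zeros((10, 10)); t1 = np.full((10, 10), 280.0)
u2 = u1 + 1.0
r = rmdte(u1, v1, t1, u2, v1, t1)
# with only du = 1, DTE = 0.5 everywhere, so RMDTE = sqrt(0.5)
```

Computing this spectrum by wavenumber band, as the study does, is what reveals the upscale growth of initially small-scale differences.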

