A Transformed Lagged Ensemble Forecasting Technique for Increasing Ensemble Size

2007 ◽  
Vol 135 (4) ◽  
pp. 1424-1438 ◽  
Author(s):  
Andrew R. Lawrence ◽  
James A. Hansen

Abstract An ensemble-based data assimilation approach is used to transform old ensemble forecast perturbations with more recent observations for the purpose of inexpensively increasing ensemble size. The impact of the transformations is propagated forward in time over the ensemble’s forecast period without rerunning any models, and these transformed ensemble forecast perturbations can be combined with the most recent ensemble forecast to sensibly increase forecast ensemble sizes. Because the transform takes place in perturbation space, the transformed perturbations must be centered on the ensemble mean from the most recent forecasts. Thus, the benefit of the approach lies in improved ensemble statistics rather than in improvements to the mean. Larger ensemble forecasts can be used for numerous purposes, including probabilistic forecasting, observation targeting, and providing boundary conditions to limited-area models. This transformed lagged ensemble forecasting approach is explored and is shown to give positive results in the context of a simple chaotic model. By incorporating a suitable perturbation inflation factor, the technique was found to generate forecast ensembles whose skill was statistically comparable to that of ensembles produced by adding nonlinear model integrations. Implications for ensemble forecasts generated by numerical weather prediction models are briefly discussed, including multimodel ensemble forecasting.
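A minimal numerical sketch of the core idea: transform lagged forecast perturbations, inflate, re-centre them on the newest ensemble mean, and append them to the current ensemble. The member counts, the inflation value and the random orthogonal matrix standing in for the assimilation-derived transform are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_members = 8, 5

# Perturbations of an older ("lagged") ensemble forecast, valid at the same
# time as the current forecast (columns = members, rows = state variables).
X_lag = rng.normal(size=(n_state, n_members))
X_lag -= X_lag.mean(axis=1, keepdims=True)

# Stand-in for the transform matrix an ensemble-based assimilation of the
# newer observations would supply; here simply a random orthogonal matrix.
T, _ = np.linalg.qr(rng.normal(size=(n_members, n_members)))

inflation = 1.1  # illustrative perturbation inflation factor

# Transform in perturbation space, re-centre, and inflate -- no model reruns.
X_new = inflation * (X_lag @ T)
X_new -= X_new.mean(axis=1, keepdims=True)

# Centre the transformed perturbations on the most recent ensemble mean and
# append them to the current forecast ensemble to enlarge it.
X_current = rng.normal(size=(n_state, n_members))
mean_current = X_current.mean(axis=1, keepdims=True)
augmented = np.concatenate([X_current, mean_current + X_new], axis=1)
```

The augmented ensemble has twice as many members, but only the statistics (spread, covariances) are enriched; the mean is unchanged by construction.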

2019 ◽  
Vol 26 (3) ◽  
pp. 339-357 ◽  
Author(s):  
Jari-Pekka Nousu ◽  
Matthieu Lafaysse ◽  
Matthieu Vernay ◽  
Joseph Bellier ◽  
Guillaume Evin ◽  
...  

Abstract. Forecasting the height of new snow (HN) is crucial for avalanche hazard forecasting, road viability, ski resort management and tourism attractiveness. Météo-France operates the PEARP-S2M probabilistic forecasting system, including 35 members of the PEARP Numerical Weather Prediction system, where the SAFRAN downscaling tool refines the elevation resolution and the Crocus snowpack model represents the main physical processes in the snowpack. It provides better HN forecasts than direct NWP diagnostics but exhibits significant biases and underdispersion. We applied a statistical post-processing to these ensemble forecasts, based on non-homogeneous regression with a censored shifted Gamma distribution. Observations come from manual measurements of 24 h HN in the French Alps and Pyrenees. The calibration is tested at the station scale and the massif scale (i.e. aggregating different stations over areas of 1000 km2). Compared to the raw forecasts, similar improvements are obtained for both spatial scales. Therefore, the post-processing can be applied at any point of the massifs. Two training datasets are tested: (1) a 22-year homogeneous reforecast for which the NWP model resolution and physical options are identical to the operational system but without the same initial perturbations; (2) 3-year real-time forecasts with a heterogeneous model configuration but the same perturbation methods. The impact of the training dataset depends on lead time and on the evaluation criteria. The long-term reforecast improves the reliability of severe snowfall forecasts but leads to overdispersion due to the discrepancy in real-time perturbations. Thus, the development of reliable automatic forecasting products of HN requires long reforecasts that are as homogeneous as possible with the operational systems.
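A minimal sketch of how such a censored shifted Gamma predictive distribution can be evaluated. The regression coefficients, the shift, and the ensemble values below are hypothetical placeholders; in the actual system the coefficients are fitted on the training datasets described in the abstract, typically by minimising a proper score such as the CRPS.

```python
import numpy as np
from scipy.stats import gamma

# Hypothetical regression coefficients linking ensemble statistics to the
# predictive distribution's parameters (illustrative, not fitted values).
a0, a1 = 0.1, 0.9    # location (mean) link
b0, b1 = 0.2, 0.5    # scale (spread) link
shift = -0.5         # negative shift puts probability mass at/below zero

ens = np.array([0.0, 1.2, 3.4, 2.0, 0.8])  # raw ensemble of 24 h HN (cm)
mu = a0 + a1 * ens.mean()
sigma = b0 + b1 * ens.std()

# Gamma distribution parameterised by mean mu and standard deviation sigma.
k = (mu / sigma) ** 2        # shape
theta = sigma ** 2 / mu      # scale

# Left-censoring at zero: all mass of the shifted Gamma below zero becomes
# the probability of "no new snow".
p_zero = gamma.cdf(0.0 - shift, a=k, scale=theta)

# Calibrated exceedance probability for, e.g., 5 cm of new snow.
p_exceed_5 = gamma.sf(5.0 - shift, a=k, scale=theta)
```

The censoring step is what lets a continuous distribution assign a finite probability to the discrete "zero snowfall" outcome.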


Author(s):  
Laura Rontu ◽  
Emily Gleeson ◽  
Daniel Martin Perez ◽  
Kristian Pagh Nielsen ◽  
Velle Toll

The direct radiative effect of aerosols is taken into account in many limited area numerical weather prediction models using wavelength-dependent aerosol optical depths of a range of aerosol species. We study the impact of aerosol distribution and optical properties on radiative transfer, based on climatological and more realistic near real-time aerosol data. Sensitivity tests were carried out using the single column version of the ALADIN-HIRLAM numerical weather prediction system, set up to use the HLRADIA broadband radiation scheme. The tests were restricted to clear-sky cases to avoid the complication of cloud-radiation-aerosol interactions. The largest differences in radiative fluxes and heating rates were found to be due to different aerosol loads. When the loads are large, the radiative fluxes and heating rates are sensitive to the aerosol inherent optical properties and vertical distribution of the aerosol species. Impacts of aerosols on shortwave radiation dominate longwave impacts. Sensitivity experiments indicated the important effects of highly absorbing black carbon aerosols and strongly scattering desert dust.


2008 ◽  
Vol 136 (10) ◽  
pp. 3947-3963 ◽  
Author(s):  
Ryan D. Torn ◽  
Gregory J. Hakim

The 2-yr performance of a pseudo-operational (real time) limited-area ensemble Kalman filter (EnKF) based on the Weather Research and Forecasting Model is described. This system assimilates conventional observations from surface stations, rawinsondes, the Aircraft Communications Addressing and Reporting System (ACARS), and cloud motion vectors every 6 h on a domain that includes the eastern North Pacific Ocean and western North America. Ensemble forecasts from this system and deterministic output from operational numerical weather prediction models during this same period are verified against rawinsonde and surface observation data. Relative to operational forecasts, the forecast from the ensemble-mean analysis has slightly larger errors in wind and temperature but smaller errors in moisture, even though satellite radiances are not assimilated by the EnKF. Time-averaged correlations indicate that assimilating ACARS and cloud wind data with flow-dependent error statistics provides corrections to the moisture field in the absence of direct observations of that field. Comparison with a control experiment in which a deterministic forecast is cycled without observation assimilation indicates that the skill in the EnKF’s forecasts results from assimilating observations and not from lateral boundary conditions or the model formulation. Furthermore, the ensemble variance is generally in good agreement with the ensemble-mean error and the spread increases monotonically with forecast hour.
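The flow-dependent update at the heart of such a system can be sketched with a perturbed-observation (stochastic) EnKF on toy data. The dimensions, observation operator and error covariances below are illustrative, not those of the WRF-based system described above.

```python
import numpy as np

rng = np.random.default_rng(42)
n_state, n_obs, n_members = 6, 2, 20

# Prior (background) ensemble from a short-term ensemble forecast.
Xb = rng.normal(loc=1.0, scale=0.5, size=(n_state, n_members))

H = np.zeros((n_obs, n_state))
H[0, 0] = H[1, 3] = 1.0            # observe state components 0 and 3
R = 0.1 * np.eye(n_obs)            # observation error covariance
y = np.array([1.5, 0.7])           # observations

# Flow-dependent background error covariance estimated from the ensemble --
# the key ingredient distinguishing the EnKF from static-covariance methods.
Xp = Xb - Xb.mean(axis=1, keepdims=True)
Pb = Xp @ Xp.T / (n_members - 1)

# Kalman gain and perturbed-observation update of every member.
K = Pb @ H.T @ np.linalg.inv(H @ Pb @ H.T + R)
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_members).T
Xa = Xb + K @ (Y - H @ Xb)
```

Because `Pb` carries sample correlations between observed and unobserved components, the update also corrects state variables with no direct observations, which is the mechanism behind the moisture corrections noted in the abstract.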


2014 ◽  
Vol 7 (5) ◽  
pp. 6489-6518
Author(s):  
V. Blažica ◽  
N. Gustafsson ◽  
N. Žagar

Abstract. The paper compares the most common periodization methods used to obtain spectral fields of limited-area models for numerical weather prediction. The focus is on the impact the methods have on the spectra of the fields, which are used for verification and tuning of the models. A simplified model is applied with random fields that obey a known kinetic energy spectrum. The periodization methods under consideration are detrending, the discrete cosine transform and the application of an extension zone. For the extension zone, three versions are applied: the Boyd method, the ALADIN method and the HIRLAM method. The results show that detrending and the discrete cosine transform have little impact on the spectra, as does the Boyd method for the extension zone. For the ALADIN and HIRLAM methods, the impact depends on the width of the extension zone: the wider the zone, the more artificial energy is introduced and the larger the impact on the spectra. The width of the extension zone correlates with the modifications in the shape of the spectra as well as with the amplitudes of the additional energy in the spectra.
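Two of the periodization methods can be illustrated on a toy 1-D non-periodic field (the field, its trend and its noise level are made up for illustration): detrending makes the field approximately periodic before an ordinary Fourier transform, while the discrete cosine transform implicitly mirrors the field and needs no detrending.

```python
import numpy as np
from scipy.fft import dct, rfft

n = 128
x = np.linspace(0.0, 1.0, n)
rng = np.random.default_rng(1)

# Non-periodic sample field: a linear trend plus small-scale noise.
field = 2.0 * x + 0.1 * rng.standard_normal(n)

# Periodization 1: remove the linear trend, then apply an ordinary FFT.
trend = np.polyval(np.polyfit(x, field, 1), x)
spec_detrended = np.abs(rfft(field - trend)) ** 2

# Periodization 2: discrete cosine transform of the raw field; the implied
# even extension removes the boundary jump without touching the trend.
spec_dct = dct(field, type=2, norm="ortho") ** 2
```

Without either step, the jump between the two domain edges would alias spurious energy into all wavenumbers, which is exactly the artefact the paper quantifies for the extension-zone methods.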


2019 ◽  
Author(s):  
Jari-Pekka Nousu ◽  
Matthieu Lafaysse ◽  
Matthieu Vernay ◽  
Joseph Bellier ◽  
Guillaume Evin ◽  
...  

Abstract. Forecasting the height of new snow (HN) is crucial for avalanche hazard forecasting, road viability, ski resort management and tourism attractiveness. Météo-France operates the PEARP-S2M probabilistic forecasting system, including 35 members of the PEARP Numerical Weather Prediction system, where the SAFRAN downscaling tool refines the elevation resolution and the Crocus snowpack model represents the main physical processes in the snowpack. It provides better HN forecasts than direct NWP diagnostics but exhibits significant biases and underdispersion. We applied a statistical post-processing to these ensemble forecasts, based on non-homogeneous regression with a censored shifted Gamma distribution. Observations come from manual measurements of 24 h HN in the French Alps and Pyrenees. The calibration is tested at the station scale and the massif scale (i.e. aggregating different stations over areas of 1000 km2). Compared to the raw forecasts, similar improvements are obtained for both spatial scales. Therefore, the post-processing can be applied at any point of the massifs. Two training datasets are tested: (1) a 22-year homogeneous reforecast for which the NWP model resolution and physical options are identical to the operational system but without the same initial perturbations; (2) 3-year real-time forecasts with a heterogeneous model configuration but the same perturbation methods. The impact of the training dataset depends on lead time and on the evaluation criteria. The long-term reforecast improves the reliability of severe snowfall forecasts but leads to overdispersion due to the discrepancy in real-time perturbations. Thus, the development of reliable automatic forecasting products of HN requires long reforecasts that are as homogeneous as possible with the operational systems.


2011 ◽  
Vol 139 (7) ◽  
pp. 2025-2045 ◽  
Author(s):  
Zhiyong Meng ◽  
Fuqing Zhang

Abstract Ensemble-based data assimilation is a state estimation technique that uses short-term ensemble forecasts to estimate flow-dependent background error covariance and is best known by varying forms of ensemble Kalman filters (EnKFs). The EnKF has recently emerged as one of the primary alternatives to the variational data assimilation methods widely used in both global and limited-area numerical weather prediction models. In addition to comparing the EnKF with variational methods, this article reviews recent advances and challenges in the development and applications of the EnKF, including its hybrid with variational methods, in limited-area models that resolve weather systems from convective to meso- and regional scales.


2021 ◽  
Author(s):  
Kameswarrao Modali ◽  
Marc Rautenhaus

Ensemble forecasting has become standard practice in numerical weather prediction at forecasting centres across the world. The large data sets generated by ensemble forecasting systems carry much information that is difficult to analyse in short time periods, requiring well-designed workflows in order to be useful.

Clustering is one of the ensemble analysis methods applied to discover similarities between ensemble members. Cluster analysis involves several steps, such as dimensionality reduction, the core clustering algorithm and evaluation. A large number of methods have been proposed in the literature for each of these steps; however, only a few have been applied to the clustering of ensemble forecasts. A major challenge is that, for a given ensemble forecast, different choices of methods and data domains can lead to very different clustering results. For example, Kumpf et al. (2018, IEEE Transact. Vis. Comp. Graph.) have demonstrated the sensitivity of clustering results to even small changes in the considered domain. The challenge equally exists for choices of clustering methods and method parameters.

In our work, we attempt to open up the clustering black box by introducing a visualization workflow that makes transparent to the user how different choices of methods and method parameters lead to different clustering results. To achieve this, a clustering analysis library that works in tandem with the ensemble visualization software “Met.3D” is being developed. We present the current state of the system and demonstrate its use by analysing an ensemble forecast case study.
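The two central steps of such a cluster analysis, dimensionality reduction followed by a core clustering algorithm, can be sketched on synthetic ensemble fields. PCA and k-means are one common pairing; the data and the method choices here are illustrative assumptions, not those of the library being developed.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
n_members, ny, nx = 50, 20, 30

# Synthetic ensemble of 2-D forecast fields standing in for real NWP output;
# the second half is shifted so that two regimes exist by construction.
members = rng.normal(size=(n_members, ny, nx))
members[25:] += 1.5

# Step 1: dimensionality reduction (here PCA via an SVD of the anomalies).
X = members.reshape(n_members, -1)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T            # leading 5 principal components per member

# Step 2: core clustering algorithm (here k-means on the reduced space).
_, labels = kmeans2(Z, 2, minit="++", seed=0)
```

Swapping the reduction method, the number of retained components, the clustering algorithm, or the spatial domain cut from `members` can each change `labels`, which is precisely the sensitivity the proposed visualization workflow is meant to expose.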


2015 ◽  
Vol 8 (1) ◽  
pp. 87-97 ◽  
Author(s):  
V. Blažica ◽  
N. Gustafsson ◽  
N. Žagar

Abstract. The paper compares the most common periodization methods used to obtain spectral fields of limited-area models for numerical weather prediction. The focus is on the impact that the methods have on the spectra of the fields, which are used for verification and tuning of the models. A simplified model is applied with random fields that obey a known kinetic energy spectrum. The periodization methods under consideration are detrending, the discrete cosine transform and the application of an extension zone. For the extension zone, three versions are applied: the Boyd method, the ALADIN method and the HIRLAM method. The results show that detrending and the discrete cosine transform have little impact on the spectra, as does the Boyd method for the extension zone. For the ALADIN and HIRLAM methods, the impact depends on the width of the extension zone: the wider the zone, the more artificial energy is introduced and the larger the impact on the spectra. The width of the extension zone correlates with the modifications in the shape of the spectra as well as with the amplitudes of the additional energy in the spectra.


2021 ◽  
Author(s):  
Daniele Nerini ◽  
Jonas Bhend ◽  
Christoph Spirig ◽  
Lionel Moret ◽  
Mark Liniger

Hourly wind forecasts from numerical weather prediction models suffer from a range of systematic and random errors that are to a great extent related to limitations in the model grid resolution. To correct for such biases, statistical postprocessing and downscaling procedures are commonly applied so as to leverage the information provided by automatic wind measurements at the surface. More recently, such techniques have been reformulated in a machine learning framework so as to profit from the increased availability of data and computational resources. The results reported in the literature are promising and call for a serious evaluation of their potential for operational forecasting.

However, several scientific and more applied challenges remain to be addressed before such methods can transition to real-world applications. One such challenge relates to the availability of multiple ensemble forecasts for the same point in time and space, which raises the question of how this information can be handled efficiently and optimally during postprocessing, so as to provide added value to the end user without adding technical debt to the operational system.

We propose an approach in which a single deep learning model is trained to postprocess a combination of three ensemble forecasting systems, namely the high-resolution regional COSMO model in two configurations and the ECMWF IFS ENS global ensemble forecasting system. We show how the training is set up to provide a robust postprocessing model that can account for real-time scenarios that include missing data and late model runs, while the quality of the forecasts remains comparable to a single-model approach. We found that the flexibility of the deep learning architecture translates into a robust automatic postprocessing solution that limits the maintenance burden and improves the system’s reliability.
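One simple way to let a single postprocessing model cope with a missing or late model run is to zero-fill the affected predictors and add availability flags, so that every training sample has the same shape regardless of which systems delivered. The sketch below illustrates this idea with made-up summary statistics; it is an assumed scheme for exposition, not the authors' actual architecture.

```python
import numpy as np

def build_features(ens_a, ens_b, ens_c):
    """Build one postprocessing input vector from up to three ensembles.

    A missing or late run is passed as None; its statistics are zero-filled
    and an availability flag is set to 0, so a single trained model can
    handle every availability scenario without separate code paths.
    """
    feats = []
    for ens in (ens_a, ens_b, ens_c):
        if ens is None:
            feats.extend([0.0, 0.0, 0.0])            # mean, spread, flag
        else:
            feats.extend([np.mean(ens), np.std(ens), 1.0])
    return np.array(feats)

# All three systems available vs. the second run missing: same input shape.
x_full = build_features([3.1, 2.8, 3.5], [2.9, 3.3], [3.0, 3.2, 2.7])
x_missing = build_features([3.1, 2.8, 3.5], None, [3.0, 3.2, 2.7])
```

Training with such scenarios randomly injected teaches the model to discount absent inputs, which is one way to keep forecast quality stable when a member system is late.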


2021 ◽  
Vol 21 (16) ◽  
pp. 12273-12290
Author(s):  
Stefan Geiss ◽  
Leonhard Scheck ◽  
Alberto de Lozar ◽  
Martin Weissmann

Abstract. There is a rising interest in improving the representation of clouds in numerical weather prediction models. This will directly lead to improved radiation forecasts and, thus, to better predictions of the increasingly important production of photovoltaic power. Moreover, a more accurate representation of clouds is crucial for assimilating cloud-affected observations, in particular high-resolution observations from instruments on geostationary satellites. These observations can also be used to diagnose systematic errors in the model clouds, which are influenced by multiple parameterisations with many, often not well-constrained, parameters. In this study, the benefits of using both visible and infrared satellite channels for this purpose are demonstrated. We focus on visible and infrared Meteosat SEVIRI (Spinning Enhanced Visible InfraRed Imager) images and their model equivalents computed from the output of the ICON-D2 (ICOsahedral Non-hydrostatic, development version based on version 2.6.1; Zängl et al., 2015) convection-permitting, limited area numerical weather prediction model using efficient forward operators. We analyse systematic deviations between observed and synthetic satellite images derived from semi-free hindcast simulations for a 30 d summer period with strong convection. Both visible and infrared satellite observations reveal significant deviations between the observations and model equivalents. The combination of infrared brightness temperature and visible reflectance facilitates the attribution of individual deviations to specific model shortcomings. Furthermore, we investigate the sensitivity of model-derived visible and infrared observation equivalents to modified model and visible forward operator settings to identify dominant error sources. Estimates of the uncertainty of the visible forward operator turned out to be sufficiently low; thus, it can be used to assess the impact of model modifications. 
Results obtained for various changes in the model settings reveal that model assumptions on subgrid-scale water clouds are the primary source of systematic deviations in the visible satellite images. Visible observations are, therefore, well-suited to constrain subgrid cloud settings. In contrast, infrared channels are much less sensitive to the subgrid clouds, but they can provide information on errors in the cloud-top height.

