Improved Estimates of the European Winter Windstorm Climate and the Risk of Reinsurance Loss Using Climate Model Data

2010, Vol. 49 (10), pp. 2092-2120
Author(s): Paul M. Della-Marta, Mark A. Liniger, Christof Appenzeller, David N. Bresch, Pamela Köllner-Heck, ...

Abstract Current estimates of the European windstorm climate and their associated losses are often hampered by relatively short, coarse-resolution, or inhomogeneous datasets. This study tries to overcome some of these shortcomings by estimating the European windstorm climate using dynamical seasonal-to-decadal (s2d) climate forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF). The current s2d models have limited predictive skill of European storminess, making the ensemble forecasts ergodic samples on which to build pseudoclimates of 310–396 yr in length. Extended winter (October–April) windstorm climatologies are created using scalar extreme wind indices considering only data above a high threshold. The method identifies up to 2363 windstorms in the s2d data and up to 380 windstorms in the 40-yr ECMWF Re-Analysis (ERA-40). Classical extreme value analysis (EVA) techniques are used to determine the windstorm climatologies. Differences between the ERA-40 and s2d windstorm climatologies require the application of calibration techniques before meaningful comparisons can be made. Using a combined dynamical–statistical sampling technique, the largest influence on ERA-40 return period (RP) uncertainties is found to be the sampling variability associated with only 45 seasons of storms. However, both maximum likelihood (ML) and L-moments (LM) methods of fitting a generalized Pareto distribution result in biased parameters and biased RPs at sample sizes typically obtained from 45 seasons of reanalysis data. The authors correct the bias in the ML and LM methods and find that the ML-based ERA-40 climatology overestimates the RP of windstorms with RPs between 10 and 300 yr and underestimates the RP of windstorms with RPs greater than 300 yr. A 50-yr event in ERA-40 is approximately a 40-yr event after bias correction. Biases in the LM method result in higher RPs after bias correction, although they are small compared with those of the ML method. The climatologies are linked to the Swiss Reinsurance Company (Swiss Re) European windstorm loss model. New estimates of the risk of loss are compared with those from historical and stochastically generated windstorm fields used by Swiss Re. The resulting loss-frequency relationship matches the two independently modeled estimates well and clearly demonstrates the added value of using alternative data and methods, as proposed in this study, to estimate the RP of high-RP losses.
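As a rough sketch of the peaks-over-threshold step described in this abstract (not the authors' actual pipeline), a maximum likelihood GPD fit to exceedances of a scalar storm index and the resulting return-period curve could look as follows; the synthetic index values, the 95th-percentile threshold choice, and the 45-season event rate are illustrative placeholders.

```python
# Sketch: maximum-likelihood GPD fit to storm-index exceedances and the
# resulting return levels, illustrating the classical EVA step above.
# Threshold, data, and event rate are hypothetical placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
index = rng.gumbel(loc=20.0, scale=5.0, size=2000)   # synthetic storm index

threshold = np.quantile(index, 0.95)                 # high threshold (placeholder)
excess = index[index > threshold] - threshold

# Fit the GPD to the excesses by maximum likelihood; location fixed at 0.
shape, _, scale = stats.genpareto.fit(excess, floc=0.0)

events_per_season = excess.size / 45.0               # e.g. 45 seasons of reanalysis

def return_level(T_seasons):
    """Storm-index level exceeded on average once every T_seasons seasons."""
    p = 1.0 / (T_seasons * events_per_season)        # exceedance prob. per event
    return threshold + stats.genpareto.ppf(1.0 - p, shape, loc=0.0, scale=scale)

for T in (10, 50, 100, 300):
    print(f"{T:>4}-yr return level: {return_level(T):.2f}")
```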

2015, Vol. 28 (17), pp. 6938-6959
Author(s): Alex J. Cannon, Stephen R. Sobie, Trevor Q. Murdock

Abstract Quantile mapping bias correction algorithms are commonly used to correct systematic distributional biases in precipitation outputs from climate models. Although they are effective at removing historical biases relative to observations, it has been found that quantile mapping can artificially corrupt future model-projected trends. Previous studies on the modification of precipitation trends by quantile mapping have focused on mean quantities, with less attention paid to extremes. This article investigates the extent to which quantile mapping algorithms modify global climate model (GCM) trends in mean precipitation and precipitation extremes indices. First, a bias correction algorithm, quantile delta mapping (QDM), that explicitly preserves relative changes in precipitation quantiles is presented. QDM is compared on synthetic data with detrended quantile mapping (DQM), which is designed to preserve trends in the mean, and with standard quantile mapping (QM). Next, methods are applied to phase 5 of the Coupled Model Intercomparison Project (CMIP5) daily precipitation projections over Canada. Performance is assessed based on precipitation extremes indices and results from a generalized extreme value analysis applied to annual precipitation maxima. QM can inflate the magnitude of relative trends in precipitation extremes with respect to the raw GCM, often substantially, as compared to DQM and especially QDM. The degree of corruption in the GCM trends by QM is particularly large for changes in long period return values. By the 2080s, relative changes in excess of +500% with respect to historical conditions are noted at some locations for 20-yr return values, with maximum changes by DQM and QDM nearing +240% and +140%, respectively, whereas raw GCM changes are never projected to exceed +120%.
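The core QDM idea can be sketched in a few lines: for each future value, find its quantile within the future model period, take the relative (ratio) change at that quantile with respect to the historical model period, and apply that change to the corresponding observed quantile. The following minimal sketch uses empirical quantiles and synthetic gamma-distributed data, and omits refinements (e.g. wet-day frequency handling) that a production implementation would need.

```python
# Sketch of quantile delta mapping (QDM) for a ratio variable such as
# precipitation: relative changes in each quantile of the raw GCM series
# are preserved. Empirical quantiles, synthetic data, no wet-day handling.
import numpy as np

def qdm_ratio(obs_hist, mod_hist, mod_fut):
    """Bias-correct mod_fut against obs_hist, preserving relative changes."""
    # Non-exceedance probability of each future value within the future period.
    tau = (np.argsort(np.argsort(mod_fut)) + 0.5) / mod_fut.size
    # Relative change at that quantile between future and historical model.
    delta = mod_fut / np.quantile(mod_hist, tau)
    # Map the quantile onto the observed distribution, then apply the change.
    return np.quantile(obs_hist, tau) * delta

rng = np.random.default_rng(0)
obs_hist = rng.gamma(2.0, 3.0, size=5000)   # synthetic daily precipitation
mod_hist = rng.gamma(2.0, 4.0, size=5000)   # model is biased wet
mod_fut  = rng.gamma(2.0, 5.0, size=5000)   # model projects intensification

corrected = qdm_ratio(obs_hist, mod_hist, mod_fut)
print(f"raw future mean: {mod_fut.mean():.2f}, corrected mean: {corrected.mean():.2f}")
```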


2021
Author(s): Erika Médus, Emma Dybro Thomassen, Danijel Belušić, Petter Lind, Peter Berg, ...

Abstract. It is well established that using km-scale grid resolution for simulations of weather systems in weather and climate models enhances their realism. This study explores heavy and extreme precipitation characteristics over the Nordic region generated by the regional climate model HARMONIE-Climate (HCLIM). Two model setups of HCLIM are used: ERA-Interim-driven HCLIM12 covering Europe at 12 km resolution with parameterized convection, and HCLIM3 covering the Nordic region at 3 km resolution with explicit deep convection. The HCLIM simulations are evaluated against several gridded and in situ observation datasets for the warm season from April to September regarding their ability to reproduce sub-daily and daily heavy precipitation statistics across the Nordic region. Both model setups are able to capture the daily heavy precipitation characteristics in the analyzed region. At the sub-daily scale, HCLIM3 clearly improves the statistics of occurrence of the most intense heavy precipitation events, as well as the timing and amplitude of their diurnal cycle, compared to its forcing HCLIM12. Extreme value analysis shows that HCLIM3 provides added value in capturing sub-daily return levels compared to HCLIM12, which fails to produce the most extreme events. The results indicate clear benefits of the convection-permitting model in simulating heavy and extreme precipitation in the present-day climate, thus offering a motivating way forward to investigate climate change impacts in the region.


2020, Vol. 8 (12), pp. 1015
Author(s): Alicia Takbash, Ian R. Young

A non-stationary extreme value analysis of 41 years (1979–2019) of global ERA5 (European Centre for Medium-Range Weather Forecasts Reanalysis) significant wave height data is undertaken to investigate trends in the values of 100-year significant wave height, Hs100. The analysis shows that there has been a statistically significant increase in the value of Hs100 over large regions of the Southern Hemisphere. There have also been smaller decreases in Hs100 in the Northern Hemisphere, although the related trends are generally not statistically significant. The increases in the Southern Hemisphere are a result of an increase in either the frequency or intensity of winter storms, particularly in the Southern Ocean.
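One way to formalize such a trend analysis (a sketch under assumptions, not necessarily the authors' exact model) is a GEV fit to annual maxima with a location parameter that drifts linearly in time, estimated by maximum likelihood; the wave-height values below are synthetic stand-ins for a single grid point of the 41-yr ERA5 record.

```python
# Sketch: non-stationary GEV fit to annual maxima of significant wave
# height, with location linear in time. Synthetic data, one grid point.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(1)
years = np.arange(1979, 2020)
t = (years - years.mean()) / 10.0                    # decades, centred
hs_max = stats.genextreme.rvs(c=0.1, loc=8.0 + 0.2 * t, scale=0.8,
                              random_state=rng)      # synthetic annual maxima

def nll(p):
    mu0, mu1, log_sigma, xi = p
    # scipy's genextreme uses c = -xi relative to the usual GEV convention.
    return -stats.genextreme.logpdf(hs_max, c=-xi,
                                    loc=mu0 + mu1 * t,
                                    scale=np.exp(log_sigma)).sum()

fit = minimize(nll, x0=[hs_max.mean(), 0.0, 0.0, 0.1], method="Nelder-Mead")
mu0, mu1, log_sigma, xi = fit.x
print(f"location trend: {mu1:.3f} m per decade")
```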


2016, Vol. 50 (1), pp. 88-98
Author(s): Pentapati Satyavathi, Makarand C. Deo, Jyoti Kerkar, Ponnumony Vethamony

Abstract Knowledge of design waves with long return periods forms an essential input to many engineering applications, including structural design and analysis. Such extreme or long-term waves are conventionally evaluated using observed or hindcast historical wave data. Globally, waves are expected to undergo future changes in magnitude and behavior as a result of climate change induced by global warming. Considering future climate change, this study attempts to reevaluate significant wave height (Hs) as well as average spectral wave period (Tz) with a return period of 100 years for a series of locations along the western Indian coastline. Historical waves are simulated using a numerical wave model forced by wind data extracted from the archives of the National Centers for Environmental Prediction and the National Center for Atmospheric Research, while future wave data are generated by a state-of-the-art Canadian general circulation model. A statistical extreme value analysis of past and projected wave data, carried out with the help of the generalized Pareto distribution, showed an increase in 100-year Hs and Tz along the Indian coastline, pointing to the need to reconsider the safety of offshore structures in the light of global warming.
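For context, 100-year design values in a peaks-over-threshold analysis of this kind typically follow the standard GPD return-level relation: with threshold $u$, GPD scale $\sigma$ and shape $\xi$, and an average of $\lambda$ threshold exceedances per year, the $N$-year return level is $x_N = u + \frac{\sigma}{\xi}\left[(\lambda N)^{\xi} - 1\right]$ for $\xi \neq 0$, reducing to $x_N = u + \sigma \ln(\lambda N)$ as $\xi \to 0$. (This is the textbook relation; the abstract does not state which variant the authors used.)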


2009, Vol. 6 (4), pp. 5377-5413
Author(s): W. Terink, R. T. W. L. Hurkmans, P. J. J. F. Torfs, R. Uijlenhoet

Abstract. In many climate impact studies hydrological models are forced with meteorological data without any attempt to assess the quality of these forcing data. The objective of this study is to compare downscaled ERA15 (ECMWF reanalysis) precipitation and temperature with observed precipitation and temperature and to apply a bias correction to these forcing variables. The bias-corrected precipitation and temperature data will be used in another study as input for the Variable Infiltration Capacity (VIC) model. Observations were available for 134 sub-basins throughout the Rhine basin at a temporal resolution of one day from the International Commission for the Hydrology of the Rhine basin (CHR). Precipitation is corrected by fitting the mean and coefficient of variation (CV) of the observations. Temperature is corrected by fitting the mean and standard deviation of the observations. The uncorrected ERA15 data appear to be too warm and too wet for most of the Rhine basin. The bias correction leads to satisfactory results: precipitation and temperature differences decreased significantly. Corrections were largest during summer for both precipitation and temperature, and during September and October for precipitation only. Besides the statistics the correction method was intended to correct for, it is also found to improve the correlations between ERA15 and the observations for the fraction of wet days and the lag-1 autocorrelation.
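A minimal sketch of the two corrections described above (per sub-basin, and in practice per month or season): matching mean and standard deviation for temperature is a simple shift-and-scale, while for precipitation a power transform p* = a·p^b is one common way to match both mean and CV. The power-law form is an assumption here, since the abstract does not spell out the functional form, and the data are synthetic placeholders.

```python
# Sketch: bias correction matching mean/std (temperature) and mean/CV
# (precipitation, via an assumed power transform p* = a * p**b).
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(7)
p_obs, p_mod = rng.gamma(0.7, 6.0, 3650), rng.gamma(0.9, 6.5, 3650)
t_obs, t_mod = rng.normal(9.0, 7.0, 3650), rng.normal(10.2, 6.1, 3650)

# Temperature: shift and scale so mean and std match the observations.
t_corr = t_obs.mean() + t_obs.std() / t_mod.std() * (t_mod - t_mod.mean())

# Precipitation: find exponent b such that the CV of p**b matches the
# observed CV (the CV is unaffected by a, so a then follows from the mean).
def cv_gap(b):
    q = p_mod ** b
    return q.std() / q.mean() - p_obs.std() / p_obs.mean()

b = brentq(cv_gap, 0.1, 3.0)
a = p_obs.mean() / (p_mod ** b).mean()
p_corr = a * p_mod ** b
print(f"b={b:.2f}, a={a:.2f}, corrected CV={p_corr.std() / p_corr.mean():.3f}")
```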


2021
Author(s): Anne Dutfoy, Gloria Senfaute

Abstract Probabilistic Seismic Hazard Analysis (PSHA) procedures require that at least the mean activity rate be known, as well as the distribution of magnitudes. Under the Gutenberg-Richter assumption, that distribution is an Exponential distribution, truncated above at a maximum possible magnitude denoted $m_{max}$. For lack of a universal method, this parameter is often fixed by expert judgment based on tectonic considerations. In this paper, we propose two innovative alternatives to the Gutenberg-Richter model, based on Extreme Value Theory, that do not require fixing the value of $m_{max}$ a priori: the first models the tail of the magnitude distribution with a Generalized Pareto Distribution; the second is a variation on the usual Gutenberg-Richter model in which $m_{max}$ is a random variable that follows a distribution defined from an extreme value analysis. We use maximum likelihood estimators that take into account the unequal observation spans depending on magnitude, the incompleteness threshold of the catalog, and the uncertainty in the magnitude values themselves. We apply these new recurrence models to data observed in the Alps region in the south of France and integrate them into a probabilistic seismic hazard calculation to evaluate their impact on seismic hazard levels. The proposed recurrence models yield a reduction of the seismic hazard level compared to the common Gutenberg-Richter model conventionally used for PSHA calculations. This decrease is significant for all frequencies below 10 Hz, mainly at the lowest frequencies and for very long return periods. To our knowledge, both new models have never been used in a probabilistic seismic hazard calculation and constitute a promising new generation of recurrence models.
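For orientation (standard definitions, not new results from the paper): the truncated Gutenberg-Richter model gives, for $m_{min} \le m \le m_{max}$, the magnitude distribution $F(m) = \left(1 - e^{-\beta (m - m_{min})}\right) / \left(1 - e^{-\beta (m_{max} - m_{min})}\right)$, so the fit hinges on the fixed $m_{max}$. The GPD alternative instead models exceedances of a threshold $m_u$ via $P(M > m \mid M > m_u) = \left[1 + \xi\,(m - m_u)/\sigma\right]^{-1/\xi}$, where a finite upper bound $m_u - \sigma/\xi$ emerges automatically whenever the fitted shape $\xi$ is negative.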


2020
Author(s): Frank Kwasniok

Traditional extreme value analysis based on the generalised extreme value (GEV) or generalised Pareto distribution (GPD) suffers from two drawbacks: (i) both methods are wasteful of data, as only block maxima or exceedances over a high threshold are taken into account and the bulk of the data is disregarded; (ii) moreover, in the GPD approach, there is no systematic way to determine the threshold parameter. Here, all the data are fitted simultaneously using a generalised exponential family model for the bulk and a GPD model for the tail. At the threshold, the two distributions are linked together with appropriate matching conditions. The model parameters are estimated from the likelihood function of all the data. The threshold parameter can also be determined via maximum likelihood in an outer loop. The method is exemplified on wind speed data from an atmospheric model.
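One standard way to write such a composite model (a sketch of the matching idea; the paper's generalised exponential family bulk is not reproduced here) is $f(x) = f_{bulk}(x)$ for $x \le u$ and $f(x) = p_u\, g_{\xi,\sigma}(x - u)$ for $x > u$, where $g_{\xi,\sigma}$ is the GPD density and $p_u = 1 - F_{bulk}(u)$ is the bulk probability mass above the threshold. Since $g_{\xi,\sigma}(0) = 1/\sigma$, continuity of the density at $x = u$ fixes the GPD scale at $\sigma = p_u / f_{bulk}(u)$, leaving the shape $\xi$ and the bulk parameters (and, in the outer loop, $u$ itself) to be estimated from the full-data likelihood.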


2019, Vol. 44 (4), pp. 341-360
Author(s): Ignacio Franco, Alejandro Gutierrez, José Cataldo

The aim of this study was to generate hourly mean monthly maximum wind speed return period curves for heights of 60 m above the ground using Weather Research and Forecasting modeled data. The methodology introduced produces long-term wind speed data in places where the available measured series at such heights are not long enough for a sound extreme value analysis. Climate Forecast System Reanalysis data are used as input for the Weather Research and Forecasting simulations, providing information for the period 1979–2015. The modeled results are compared with wind speed series measured at anemometric towers, available for the period 2008–2015. The Weather Research and Forecasting output is then adjusted to properly model the wind speed at the measuring sites. A good representation of the cumulative distribution of monthly maxima was reached after applying a double linear adjustment.
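As an illustration of how such return period curves are typically constructed (a sketch under assumptions: the abstract does not name the fitted distribution, so a Gumbel fit to monthly maxima is assumed here, and the wind speeds are synthetic placeholders for an adjusted 60 m series):

```python
# Sketch: Gumbel fit to monthly maximum wind speeds and the resulting
# return-period curve. Distribution choice and data are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
monthly_max = rng.gumbel(loc=14.0, scale=2.5, size=12 * 37)  # 1979-2015

loc, scale = stats.gumbel_r.fit(monthly_max)

def return_level(T_years):
    """Wind speed exceeded by a monthly maximum once every T_years on average."""
    p = 1.0 - 1.0 / (12.0 * T_years)     # non-exceedance prob. per month
    return stats.gumbel_r.ppf(p, loc=loc, scale=scale)

for T in (1, 10, 50, 100):
    print(f"{T:>3}-yr return level: {return_level(T):.1f} m/s")
```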


2018, Vol. 31 (21), pp. 8819-8842
Author(s): Alberto Meucci, Ian R. Young, Øyvind Breivik

The present work develops an innovative approach to wind speed and significant wave height extreme value analysis. The approach is based on global atmosphere–wave model ensembles, the members of which are propagated in time from the best estimate of the initial state, with slight perturbations to the initial conditions, to estimate the uncertainties connected to model representations of reality. The low correlation of individual ensemble member forecasts at advanced lead times guarantees their independence and allows us to perform inference statistics. The advantage of ensemble probabilistic forecasts is that it is possible to synthesize an equivalent dataset of duration far longer than the simulation period. This allows the use of direct inference statistics to obtain extreme value estimates. A short time series of six years (from 2010 to 2016) of ensemble forecasts is selected to avoid major changes to the model physics and resolution and thus ensure stationarity. This time series is used to undertake extreme value analysis. The study estimates global wind speed and wave height return periods by selecting peaks from ensemble forecasts from +216- to +240-h lead time from the operational ensemble forecast dataset of the European Centre for Medium-Range Weather Forecasts (ECMWF). The results are compared with extreme value analyses performed on a commonly used reanalysis dataset, ERA-Interim, and buoy data. The comparison with traditional methods demonstrates the potential of this novel approach for statistical analysis of significant wave height and wind speed ocean extremes at the global scale.
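The pooling logic can be sketched as follows: if ensemble members at long lead times are effectively independent, n_years of forecasts with n_members members behave like an (n_years × n_members)-year record, from which empirical return periods can be read off directly via plotting positions. Member count, data, and the Weibull plotting-position formula are illustrative choices here, not details taken from the paper.

```python
# Sketch: pooling independent ensemble members to synthesize a long
# equivalent record and reading off empirical return periods directly.
import numpy as np

rng = np.random.default_rng(5)
n_years, n_members = 6, 50                       # e.g. a 2010-2016 EPS archive
ann_max = rng.gumbel(loc=12.0, scale=1.5,
                     size=(n_years, n_members))  # annual maxima per member

pooled = np.sort(ann_max.ravel())[::-1]          # pooled maxima, descending
n = pooled.size                                  # equivalent record length (yr)
T_emp = (n + 1) / np.arange(1, n + 1)            # Weibull plotting positions

for T in (50, 100):
    idx = np.argmin(np.abs(T_emp - T))
    print(f"~{T}-yr event: {pooled[idx]:.1f} m/s")
```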

