Mapping the Hazard of Extreme Rainfall by Peaks over Threshold Extreme Value Analysis and Spatial Regression Techniques

2006 ◽  
Vol 45 (1) ◽  
pp. 108-124 ◽  
Author(s):  
Santiago Beguería ◽  
Sergio M. Vicente-Serrano

Abstract. The occurrence of high-magnitude rainfall constitutes a primary natural hazard in many parts of the world, and the elaboration of maps showing the hazard of extreme rainfall is of great theoretical and practical interest. In this work a procedure based on extreme value analysis and spatial interpolation techniques is described. The result is a probability model in which the distribution parameters vary smoothly in space. The methodology is applied to the middle Ebro Valley (Spain), a climatically complex area with great contrasts due to the relief and to exposure to different air masses. The database consists of 43 daily precipitation series covering 1950 to 2000. Because rainfall in the area tends to be highly clustered in time, a declustering process was applied to the data, and the series of daily cluster maxima were used in all subsequent analyses. The mean excess plot and error minimization were used to find an optimum threshold for retaining the highest records (peaks-over-threshold approach), and a Poisson–generalized Pareto model was fitted to the resulting series. The at-site parameter estimates (location, scale, and shape) were regressed upon a set of location and relief variables, enabling the construction of a spatially explicit probability model. The advantages of this method for obtaining maps of extreme precipitation hazard are discussed in depth.
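The Poisson–generalized Pareto model at the core of this procedure can be sketched in a few lines. The example below is not the authors' code: it fits a GPD to threshold excesses by the method of moments (the paper's at-site estimates would more commonly use maximum likelihood) and evaluates a return level from the Poisson–GPD combination. The threshold, exceedance rate, and synthetic data are invented for illustration.

```python
import math
import random

def fit_gpd_mom(excesses):
    """Method-of-moments estimates of the GPD shape (xi) and scale (sigma)
    from a series of excesses over the threshold (valid for xi < 1/2)."""
    n = len(excesses)
    m = sum(excesses) / n
    v = sum((x - m) ** 2 for x in excesses) / (n - 1)
    r = m * m / v                    # for a GPD, mean^2/var = 1 - 2*xi
    xi = 0.5 * (1.0 - r)
    sigma = 0.5 * m * (1.0 + r)
    return xi, sigma

def return_level(u, xi, sigma, lam, T):
    """T-year return level of the Poisson-GPD model, where lam is the
    mean number of threshold exceedances (cluster maxima) per year."""
    if abs(xi) < 1e-9:               # Gumbel limit as xi -> 0
        return u + sigma * math.log(lam * T)
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

# Synthetic example: exponential excesses, i.e. a GPD with xi = 0, sigma = 10.
random.seed(42)
excesses = [random.expovariate(1.0 / 10.0) for _ in range(5000)]
xi_hat, sigma_hat = fit_gpd_mom(excesses)
x100 = return_level(u=40.0, xi=xi_hat, sigma=sigma_hat, lam=3.0, T=100)
```

Regressing the fitted `xi_hat` and `sigma_hat` (plus the Poisson rate) on location and relief variables, as the abstract describes, is what turns these at-site fits into a spatially explicit hazard map.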

2017 ◽  
Vol 21 (10) ◽  
pp. 5385-5399 ◽  
Author(s):  
Edouard Goudenhoofdt ◽  
Laurent Delobbe ◽  
Patrick Willems

Abstract. In Belgium, only rain gauge time series have been used so far to study extreme rainfall at a given location. In this paper, the potential of a 12-year quantitative precipitation estimation (QPE) from a single weather radar is evaluated. For the period 2005–2016, 1 and 24 h rainfall extremes from automatic rain gauges and collocated radar estimates are compared. The peak intensities are fitted to the exponential distribution by regression in Q-Q plots, with a threshold rank that minimises the mean squared error. A basic radar product used as reference exhibits unrealistically high extremes and is not suitable for extreme value analysis. For 24 h rainfall extremes, which partly occur in winter, the radar-based QPE needs a bias correction. A few missing events are caused by the wind drift associated with convective cells and by strong attenuation of the radar signal. Differences between radar and gauge rainfall values are caused by spatial and temporal sampling, gauge underestimation, and radar errors. Nonetheless, the fit to the QPE data lies within the confidence interval of the gauge fit, which remains large because of the short study period. A regional frequency analysis for the 1 h duration is performed at the locations of four gauges with 1965–2008 records, using the spatially independent QPE data within a circle of 20 km. The confidence interval of the radar fit, which is small thanks to the larger sample size, contains the gauge fit for the two stations closest to the radar. In Brussels, the radar extremes are significantly higher than the gauge rainfall extremes, but similar to those observed by an automatic gauge during the same period. The extreme statistics exhibit slight variations related to topography. The radar-based extreme value analysis can be extended to other durations.
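The fitting procedure described here — exponential distribution fitted by linear regression in a Q-Q plot, with the threshold rank chosen to minimise the mean squared error — can be sketched as follows. This is a minimal illustration, not the authors' code: the plotting positions, the minimum rank `k_min`, and the synthetic data are assumptions.

```python
import math
import random

def exp_qq_fit(sorted_desc, k):
    """Least-squares line x = a + b*q in the exponential Q-Q plot built
    from the k largest observations; returns (a, b, mse). The slope b
    estimates the exponential scale above the implied threshold."""
    xs = sorted_desc[:k]
    qs = [math.log((k + 1.0) / (i + 1.0)) for i in range(k)]
    mq, mx = sum(qs) / k, sum(xs) / k
    b = (sum((q - mq) * (x - mx) for q, x in zip(qs, xs))
         / sum((q - mq) ** 2 for q in qs))
    a = mx - b * mq
    mse = sum((x - (a + b * q)) ** 2 for q, x in zip(qs, xs)) / k
    return a, b, mse

def best_threshold_rank(data, k_min=50):
    """Threshold rank minimising the Q-Q regression mean squared error."""
    xs = sorted(data, reverse=True)
    k_best = min(range(k_min, len(xs) + 1),
                 key=lambda k: exp_qq_fit(xs, k)[2])
    return k_best, exp_qq_fit(xs, k_best)

# Synthetic peak intensities with an exponential upper tail (scale 5).
random.seed(7)
peaks = [random.expovariate(1.0 / 5.0) for _ in range(1000)]
k_best, (a, b, mse) = best_threshold_rank(peaks)
```

Running the same fit on collocated gauge and radar series, as in the paper, allows the two exponential tails (and their confidence intervals) to be compared directly.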


2020 ◽  
Author(s):  
Nikos Koutsias ◽  
Frank A. Coutelieris

A statistical analysis of the wildfire events that took place in Greece during the period 1985–2007 has been performed to assess the extremes. The total burned area of each fire was considered here as the key variable expressing the significance of a given event. The data have been analyzed through extreme value theory, which has in general proved to be a powerful tool for accurately assessing the return period of extreme events. Both frequentist and Bayesian approaches have been used for comparison and evaluation purposes. Specifically, the Generalized Extreme Value (GEV) distribution along with the Peaks over Threshold (POT) approach have been compared with Bayesian extreme value modelling. Furthermore, the correlation of the burned area with the potential extreme values of other key parameters (e.g. wind, temperature, humidity) has also been investigated.
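The return-period machinery behind a GEV analysis like this one is compact enough to show directly. The sketch below is generic (not the authors' code, and all parameter values are invented): it evaluates the GEV distribution function and converts a burned-area level to a return period, and back.

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """CDF of the Generalized Extreme Value distribution with location mu,
    scale sigma and shape xi (xi = 0 gives the Gumbel case)."""
    if abs(xi) < 1e-9:
        t = math.exp(-(x - mu) / sigma)
    else:
        s = 1.0 + xi * (x - mu) / sigma
        if s <= 0.0:                      # outside the distribution support
            return 0.0 if xi > 0 else 1.0
        t = s ** (-1.0 / xi)
    return math.exp(-t)

def return_period(x, mu, sigma, xi):
    """Return period (in block lengths, e.g. years) of level x."""
    return 1.0 / (1.0 - gev_cdf(x, mu, sigma, xi))

def return_level(T, mu, sigma, xi):
    """Inverse: the level exceeded on average once every T blocks."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)
```

For annual maxima of burned area, `return_level(100, mu, sigma, xi)` with fitted parameters would give the 100-year burned-area event; the Bayesian variant mentioned in the abstract replaces the point estimates with a posterior over (mu, sigma, xi).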


2014 ◽  
Vol 58 (3) ◽  
pp. 193-207 ◽  
Author(s):  
C Photiadou ◽  
MR Jones ◽  
D Keellings ◽  
CF Dewes

Extremes ◽  
2021 ◽  
Author(s):  
Laura Fee Schneider ◽  
Andrea Krajina ◽  
Tatyana Krivobokova

Abstract. Threshold selection plays a key role in various aspects of statistical inference for rare events. In this work, two new threshold selection methods are introduced. The first approach measures the fit of the exponential approximation above a threshold and achieves good performance in small samples. The second method smoothly estimates the asymptotic mean squared error of the Hill estimator and performs consistently well over a wide range of processes. Both methods are analyzed theoretically, compared to existing procedures in an extensive simulation study, and applied to a dataset of financial losses, where the underlying extreme value index is assumed to vary over time.
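The Hill estimator, whose asymptotic mean squared error the second method targets, has a simple closed form. Below is a minimal sketch (not the paper's smooth AMSE estimator; the Pareto sample and the choice k = 200 are purely illustrative) showing the quantity whose threshold rank k must be selected.

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the extreme value index gamma, computed from the
    k largest observations: mean log-excess over the (k+1)-th largest."""
    xs = sorted(data, reverse=True)
    logs = [math.log(x) for x in xs[:k + 1]]
    return sum(logs[i] - logs[k] for i in range(k)) / k

# Pareto(alpha) sample via inverse-CDF sampling: true index gamma = 1/alpha.
random.seed(1)
alpha = 2.0
sample = [random.random() ** (-1.0 / alpha) for _ in range(5000)]
gamma_hat = hill_estimator(sample, k=200)
```

The familiar bias-variance trade-off is visible here: small k gives a noisy estimate, large k pulls in non-tail observations and biases it, which is exactly why the choice of k (the threshold) matters.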


2021 ◽  
Author(s):  
Jeremy Rohmer ◽  
Rodrigo Pedreros ◽  
Yann Krien

To estimate return levels of wave heights (Hs) induced by tropical cyclones at the coast, a commonly used approach is to (1) randomly generate a large number of synthetic cyclone events (typically >1,000); (2) numerically simulate the corresponding Hs over the whole domain of interest; (3) extract the Hs values at the desired location at the coast; and (4) perform a local extreme value analysis (EVA) to derive the corresponding return level. Step 2 is, however, very constraining because it often involves a numerical hydrodynamic simulator that is prohibitively expensive to run: this can limit the number of results available for the local EVA (typically to several hundred). In this communication, we propose a spatial stochastic simulation procedure to enlarge the database of numerical results with stochastically generated synthetic maps of Hs. To do so, we rely on a data-driven dimensionality-reduction method, either unsupervised (Principal Component Analysis) or supervised (Partial Least Squares Regression), trained with a limited number of pre-existing numerically simulated Hs maps. The procedure is applied to the island of Guadeloupe, and the results are compared to the commonly used approach applied to a large database of Hs values computed for nearly 2,000 synthetic cyclones (representative of 3,200 years; Krien et al., NHESS, 2015). When using only a hundred cyclones, we show that the 100-year return levels can be estimated with a mean absolute percentage error (derived from a bootstrap-based procedure) ranging between 5 and 15% around the coasts, while keeping the width of the 95% confidence interval of the same order of magnitude as that obtained with the full database. Without the synthetic Hs map augmentation, the error and the confidence interval width both increase by nearly 100%.
Careful attention is paid to the tuning of the approach by testing the sensitivity to the spatial domain size, the information loss due to data compression, and the number of cyclones. This study has been carried out within the Carib-Coast INTERREG project (https://www.interreg-caraibes.fr/carib-coast).
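The dimensionality-reduction step can be caricatured in a few lines. The sketch below is emphatically not the authors' implementation: it extracts a single principal component of toy "maps" by plain power iteration (a real application would use a full PCA/SVD library and several components, or the supervised PLS variant), and generates synthetic maps by resampling perturbed scores along that component. The grid size, jitter level, and data are all invented.

```python
import math
import random

def pca_power(cov, n_iter=200):
    """Leading eigenvalue/eigenvector of a small covariance matrix via
    power iteration -- enough for a sketch, not for production use."""
    d = len(cov)
    v = [1.0] * d
    lam = 0.0
    for _ in range(n_iter):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        lam = math.sqrt(sum(x * x for x in w))
        v = [x / lam for x in w]
    return lam, v

def synth_maps(maps, n_new, rng):
    """Generate n_new synthetic maps as mean + (resampled, jittered score)
    along the first principal component of the training maps."""
    n, d = len(maps), len(maps[0])
    mean = [sum(m[i] for m in maps) / n for i in range(d)]
    centered = [[m[i] - mean[i] for i in range(d)] for m in maps]
    cov = [[sum(c[i] * c[j] for c in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    lam, v = pca_power(cov)
    scores = [sum(c[i] * v[i] for i in range(d)) for c in centered]
    out = []
    for _ in range(n_new):
        s = rng.choice(scores) + rng.gauss(0.0, math.sqrt(lam) * 0.1)
        out.append([mean[i] + s * v[i] for i in range(d)])
    return out

# Toy training set: 3-point "maps" that all lie along one spatial pattern.
rng = random.Random(0)
maps = [[s, 2 * s, 3 * s] for s in (rng.gauss(5.0, 1.0) for _ in range(200))]
new_maps = synth_maps(maps, 50, rng)
```

Each synthetic map then feeds the local EVA at any coastal grid point exactly as a numerically simulated map would, which is how the augmentation shrinks the return-level error without extra hydrodynamic runs.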

