Efficient Propagation of Epistemic Uncertainty for Probabilistic Seismic Hazard Analyses (PSHAs) Including Partial Correlation of Magnitude–Distance Scaling

Author(s):  
Maxime Lacour
Norman Abrahamson

ABSTRACT Probabilistic seismic hazard analysis (PSHA) is moving from ergodic ground-motion models (GMMs) to nonergodic GMMs that account for source-, path-, and site-specific effects and that require a much larger number of GMM branches on the logic tree to capture the full epistemic uncertainty. An efficient method for computing PSHA with a large number of GMM branches was developed by Lacour and Abrahamson (2019) using polynomial chaos (PC) expansion, with the key assumption that the epistemic uncertainty in the median ground motion is fully correlated across earthquake scenarios. In the current study, we remove the assumption of full correlation using a multivariate PC expansion. The correlation structure of the available median GMMs across scenarios is computed empirically. The median ground motion is modeled as a Gaussian random process with the correlation structure of the GMMs across the range of relevant earthquake scenarios. This Gaussian random process is discretized using the Karhunen–Loève expansion, which leads to multivariate PC expansions of the uncertain hazard curves. The hazard fractiles can be reconstructed during an efficient postprocessing phase that includes the effects of partial correlation between the GMMs. Multivariate PC expansions require significantly more terms than the fully correlated case, which increases the calculation time by about a factor of 5, but the approach remains much more efficient than direct sampling of the branches of the GMM logic tree when the number of branches is large. An example hazard calculation shows that the effect of using partial correlation in place of full correlation of the GMMs is small for the Next Generation Attenuation-West2 (NGA-West2) set of GMMs, indicating that the fully correlated assumption may be adequate for many applications. The multivariate PC method can be used to evaluate the effects of the partial correlation for sets of GMMs other than the NGA-West2 GMMs.
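The discretization step described above can be sketched numerically. The fragment below, using entirely hypothetical ln-median predictions, builds the empirical correlation of the median GMMs across scenarios and applies a discrete Karhunen–Loève (eigen) expansion to generate independent standard-normal variables whose combination reproduces that correlation structure. It is a minimal illustration of the idea, not the authors' implementation.

```python
import numpy as np

# Hypothetical ln-median predictions of 4 GMMs over 5 earthquake scenarios.
rng = np.random.default_rng(0)
ln_medians = rng.normal(loc=-2.0, scale=0.3, size=(4, 5))

# Empirical correlation of the median ground motion across scenarios.
corr = np.corrcoef(ln_medians, rowvar=False)          # 5 x 5

# Discrete Karhunen-Loeve expansion: eigendecomposition of the correlation
# matrix yields independent standard-normal variables xi_k whose weighted
# combination reproduces the correlation structure of the Gaussian process.
eigvals, eigvecs = np.linalg.eigh(corr)
eigvals = np.clip(eigvals, 0.0, None)                 # drop numerical negatives

xi = rng.standard_normal((eigvals.size, 10_000))      # independent N(0, 1)
field = eigvecs @ (np.sqrt(eigvals)[:, None] * xi)    # correlated realizations

sample_corr = np.corrcoef(field)
print(np.abs(sample_corr - corr).max())               # small: structure recovered
```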

2008
Vol 24 (4)
pp. 997-1009
Author(s):
Julian J. Bommer
Frank Scherbaum

Logic trees have become a standard feature of probabilistic seismic hazard analyses (PSHA) for determining design ground motions. A logic tree's purpose is to capture and quantify the epistemic uncertainty associated with the inputs to PSHA and thus enable estimation of the resulting uncertainty in the hazard. There are many potential pitfalls in setting up a logic tree for PSHA, mainly related to the fact that, in practice, it is questionable whether the requirements that logic-tree branches be both mutually exclusive and collectively exhaustive can actually be met. Careful consideration is also required in making use of the output; in particular, in view of how PSHA is employed in current engineering design practice, it may be more rational to determine the mean ground motion at the selected design return period than to find the ground motion at the mean value of this return period.
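The closing distinction, between the mean ground motion at a fixed return period and the ground motion read off the mean hazard curve, can be made concrete with a toy two-branch logic tree; the hazard-curve forms, coefficients, and weights below are hypothetical.

```python
import numpy as np

# Two hypothetical hazard-curve branches, lambda_i(a) = k_i * a**(-b_i),
# with equal weights. Compare (1) the weighted mean of the ground motions
# at the target rate on each branch with (2) the ground motion where the
# weighted-mean hazard curve reaches the target rate.
branches = [(1e-3, 2.0), (5e-3, 3.0)]          # (k, b) pairs, hypothetical
weights = [0.5, 0.5]
target = 1.0 / 475.0                           # annual exceedance rate

# (1) Mean ground motion at the design return period.
motions = [(k / target) ** (1.0 / b) for k, b in branches]
mean_motion = sum(w * a for w, a in zip(weights, motions))

# (2) Ground motion at the target rate on the mean hazard curve.
grid = np.linspace(0.01, 5.0, 100_000)
mean_curve = sum(w * k * grid ** (-b) for w, (k, b) in zip(weights, branches))
motion_from_mean_curve = grid[np.argmin(np.abs(mean_curve - target))]

print(mean_motion, motion_from_mean_curve)     # the two answers differ
```

In this toy case the mean hazard curve yields a noticeably larger motion than the mean of the branch motions, which is why the abstract stresses that the two quantities should not be conflated.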


2020
Vol 18 (14)
pp. 6119-6148
Author(s):
Graeme Weatherill
Fabrice Cotton

Abstract Regions of low seismicity present a particular challenge for probabilistic seismic hazard analysis when identifying suitable ground motion models (GMMs) and quantifying their epistemic uncertainty. The 2020 European Seismic Hazard Model adopts a scaled backbone approach to characterise this uncertainty for shallow seismicity in Europe, incorporating region-to-region source and attenuation variability based on European strong motion data. This approach, however, may not be suited to the stable cratonic region of northeastern Europe (encompassing Finland, Sweden and the Baltic countries), where exploration of various global geophysical datasets reveals crustal properties that are distinctly different from the rest of Europe and are instead more closely represented by those of the Central and Eastern United States. Building upon the suite of models developed by the recent NGA-East project, we construct a new scaled backbone ground motion model and calibrate its corresponding epistemic uncertainties. The resulting logic tree is shown to provide hazard outcomes comparable to those of the epistemic uncertainty modelling strategy adopted for the Eastern United States, despite the different approaches taken. Comparison with previous GMM selections for northeastern Europe, however, highlights key differences in short-period accelerations resulting from new assumptions regarding the characteristics of the reference rock and its influence on site amplification.
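As a rough sketch of what a scaled backbone logic tree looks like in practice (all numbers hypothetical, not taken from the model described above): a single central median is shifted by fixed increments in natural-log space, with branch weights chosen so that the discrete branch set approximately preserves the mean and standard deviation of the assumed epistemic distribution.

```python
import numpy as np

# Central model and epistemic standard deviation (hypothetical values).
ln_mu = np.log(0.15)     # central median PGA in g
sigma_mu = 0.3           # epistemic std dev of the median, ln units

# Three-point discretisation of the normal epistemic distribution:
# shifts of +/-1.6 sigma with weights 0.2/0.6/0.2 roughly preserve the
# mean and standard deviation of the continuous distribution.
shifts = np.array([-1.6, 0.0, 1.6]) * sigma_mu
weights = np.array([0.2, 0.6, 0.2])

branch_medians = np.exp(ln_mu + shifts)        # the scaled backbone branches
mean_ln = np.sum(weights * (ln_mu + shifts))
std_ln = np.sqrt(np.sum(weights * (ln_mu + shifts - mean_ln) ** 2))
print(branch_medians, std_ln)                  # std_ln approximates sigma_mu
```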


2015
Vol 31 (2)
pp. 661-698
Author(s):
Julian J. Bommer
Kevin J. Coppersmith
Ryan T. Coppersmith
Kathryn L. Hanson
Azangi Mangongolo
...  

A probabilistic seismic hazard analysis has been conducted for a potential nuclear power plant site on the coast of South Africa, a country of low-to-moderate seismicity. The hazard study was conducted as a SSHAC Level 3 process, the first application of this approach outside North America. Extensive geological investigations identified five fault sources with a non-zero probability of being seismogenic. Five area sources were defined for distributed seismicity, the least active being the host zone for which the low recurrence rates for earthquakes were substantiated through investigations of historical seismicity. Empirical ground-motion prediction equations were adjusted to a horizon within the bedrock at the site using kappa values inferred from weak-motion analyses. These adjusted models were then scaled to create new equations capturing the range of epistemic uncertainty in this region with no strong motion recordings. Surface motions were obtained by convolving the bedrock motions with site amplification functions calculated using measured shear-wave velocity profiles.
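The kappa adjustment mentioned above is conventionally applied as a high-frequency filter on Fourier amplitudes, scaling the host-model spectrum by exp(-pi * delta_kappa * f); the kappa values in this sketch are hypothetical, not those inferred for the site.

```python
import numpy as np

# Hypothetical kappa values for the host region and the target bedrock horizon.
kappa_host, kappa_target = 0.035, 0.015        # seconds
freqs = np.array([1.0, 5.0, 10.0, 20.0])       # Hz

# Host-to-target adjustment of Fourier amplitudes: a lower target kappa
# means less high-frequency attenuation, so amplitudes are scaled up.
adjustment = np.exp(-np.pi * (kappa_target - kappa_host) * freqs)
print(adjustment)                              # grows with frequency here
```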


Author(s):  
Zoya Farajpour
Milad Kowsari
Shahram Pezeshk
Benedikt Halldorsson

ABSTRACT We apply three data-driven selection methods, log-likelihood (LLH), Euclidean distance-based ranking (EDR), and deviance information criterion (DIC), to objectively evaluate the predictive capability of 10 ground-motion models (GMMs) developed from Iranian and worldwide data sets against a new and independent Iranian strong-motion data set. The data set comprises 201 records from 29 recent events with moment magnitudes 4.5 ≤ Mw ≤ 7.3 and distances up to 275 km, including, for example, the 12 November 2017 Mw 7.3 Ezgaleh earthquake and the 25 November 2018 Mw 6.3 Sarpol-e Zahab earthquake. The results of this study show that the prior sigma of the GMMs acts as the key measure used by the LLH and EDR methods in the ranking against the data set; in some cases, this leads to the resulting model bias being ignored. In contrast, the DIC method is free from such ambiguity because it uses the posterior sigma as the basis for the ranking. Thus, the DIC method offers a clear advantage of partially removing the ergodic assumption from the GMM selection process and allows a more objective representation of the expected ground motion at a specific site when the ground-motion recordings are homogeneously distributed in terms of magnitudes and distances. The ranking results show that the local models that were exclusively developed from Iranian strong motions perform better than GMMs from other regions for use in probabilistic seismic hazard analysis in Iran. Among the Next Generation Attenuation-West2 models, the GMMs by Boore et al. (2014) and Abrahamson et al. (2014) perform best. The GMMs proposed by Darzi et al. (2019) and Farajpour et al. (2019) fit the recorded data well at short periods (peak ground acceleration and pseudoacceleration spectra at T = 0.2 s). However, at long periods, the models developed by Zafarani et al. (2018), Sedaghati and Pezeshk (2017), and Kale et al. (2015) are preferable.
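The sensitivity of LLH-style ranking to a model's sigma can be illustrated with synthetic residuals. In this hedged sketch (all distributions hypothetical), LLH is computed as the mean negative log2-likelihood of the observations under each candidate model; lower is better, and both a biased median and an inflated sigma degrade the score.

```python
import numpy as np

# Synthetic ln(PGA) "observations" drawn from a known distribution.
rng = np.random.default_rng(1)
obs = rng.normal(loc=-2.0, scale=0.5, size=200)

def llh(obs, mu, sigma):
    """Mean negative log2-likelihood of obs under N(mu, sigma**2)."""
    logpdf = (-0.5 * np.log(2.0 * np.pi * sigma**2)
              - (obs - mu) ** 2 / (2.0 * sigma**2))
    return -np.mean(logpdf) / np.log(2.0)

score_good = llh(obs, mu=-2.0, sigma=0.5)      # matches the data
score_wide = llh(obs, mu=-2.0, sigma=1.0)      # inflated (prior) sigma
score_biased = llh(obs, mu=-1.4, sigma=0.5)    # biased median
print(score_good, score_wide, score_biased)    # lower is better
```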


2020
Vol 20 (6)
pp. 1639-1661
Author(s):
Khalid Mahmood
Naveed Ahmad
Usman Khan
Qaiser Iqbal

Abstract. Probabilistic seismic hazard analysis of Peshawar District has been performed for a grid size of 0.01°. The seismic sources for the target location are defined as area polygons with uniform seismicity. The earthquake catalogue was developed from earthquake data obtained from different worldwide seismological networks and historical records. Earthquake events reported on different magnitude scales were converted into moment magnitude using indigenous, catalogue-specific regression relationships. The homogenized catalogue was subdivided into shallow crustal and deep-subduction-zone earthquake events. The seismic source parameters were obtained using the bounded Gutenberg–Richter recurrence law. Seismic hazard maps were prepared for peak horizontal acceleration at bedrock level using different ground motion attenuation relationships. The study revealed that the selection of an appropriate ground motion prediction equation is crucial for defining the seismic hazard of Peshawar District. The inclusion of deep subduction earthquakes does not add significantly to the seismic hazard for design-basis ground motions. The seismic hazard map developed for shallow crustal earthquakes, including the epistemic uncertainty, is in close agreement with the map given in the Building Code of Pakistan Seismic Provisions (2007) for a return period of 475 years on bedrock. Seismic hazard maps for other return periods (i.e., 50, 100, 250, 475, and 2500 years) are also presented.
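The bounded Gutenberg–Richter recurrence law used for the source characterization has a standard closed form for the annual rate of events at or above magnitude m; the parameter values below are hypothetical, not those derived for Peshawar District.

```python
import numpy as np

# Hypothetical recurrence parameters for one area source.
m_min, m_max = 4.5, 8.0        # magnitude bounds of the truncated GR law
b_value = 1.0
beta = b_value * np.log(10.0)  # GR b-value converted to natural-log form
rate_mmin = 0.05               # annual rate of events with M >= m_min

def annual_rate(m):
    """Annual rate of events with magnitude >= m under the bounded GR law."""
    num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return rate_mmin * num / den

print(annual_rate(4.5), annual_rate(6.0), annual_rate(8.0))
```

The rate equals rate_mmin at m_min and falls to zero at m_max, which is exactly the "bounded" behaviour the recurrence law is chosen for.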


2019
Vol 109 (5)
pp. 2063-2072
Author(s):
Maxime Lacour
Norman A. Abrahamson

Abstract A computationally efficient methodology for propagating the epistemic uncertainty in the median ground motion in probabilistic seismic hazard analysis is developed using the polynomial chaos (PC) approach. For this application, the epistemic uncertainty in the median ground motion for a specific scenario is assumed to be lognormally distributed and fully correlated across earthquake scenarios. In the hazard calculation, a single central ground-motion model (GMM) is used for the median along with the epistemic standard error of the median for each scenario. A set of PC coefficients is computed for each scenario and each test ground-motion level. The additional computational burden of computing these PC coefficients depends on the order of the approximation but is less than that of computing the median ground motion from one additional GMM. With the PC method, the mean and fractiles of the hazard due to the epistemic uncertainty distribution of the median ground motion are computed as a postprocess that is very fast computationally. For typical values of the standard deviation of epistemic uncertainty in the median ground motion (<0.2 natural log units), the methodology accurately estimates the epistemic uncertainty distribution of the hazard over the 1%–99% range. This full epistemic range is not well modeled with just a small number of GMM branches used in the traditional logic-tree approach. The PC method provides greater accuracy, faster computation, and reduced memory requirements compared with the traditional approach. For large values of the epistemic uncertainty in the median ground motion, a higher-order PC expansion may be needed to capture the full range of the epistemic uncertainty.
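A minimal univariate sketch of the PC idea described above, for one scenario and one test ground-motion level: the hazard is written as a function of a single standard-normal epistemic variable that shifts the median, expanded in probabilists' Hermite polynomials via Gauss-Hermite quadrature, and the fractiles are then recovered in a cheap postprocessing step. All numerical values are hypothetical and the implementation is illustrative, not the authors' code.

```python
import numpy as np
from math import erf, factorial, sqrt
from numpy.polynomial.hermite_e import hermegauss, hermeval

# One scenario, one test level; all values hypothetical.
nu = 0.02                      # annual rate of the scenario
ln_med = np.log(0.2)           # central ln median PGA (g)
sigma_mu = 0.15                # epistemic std dev of the median (ln units)
sigma_al = 0.6                 # aleatory std dev (ln units)
test_level = 0.4               # PGA test level (g)

def hazard(eps):
    """Annual exceedance rate given the standard-normal epistemic shift eps."""
    z = (ln_med + sigma_mu * eps - np.log(test_level)) / sigma_al
    return nu * 0.5 * (1.0 + erf(z / sqrt(2.0)))

# PC coefficients c_k = E[hazard(eps) * He_k(eps)] / k! by Gauss-Hermite
# quadrature (probabilists' Hermite polynomials, standard normal weight).
order = 6
nodes, wts = hermegauss(30)
wts = wts / wts.sum()                          # normalize quadrature weights
vals = np.array([hazard(x) for x in nodes])
coeffs = np.array([np.sum(wts * vals * hermeval(nodes, [0.0] * k + [1.0]))
                   / factorial(k) for k in range(order + 1)])

# Postprocessing: fractiles of the hazard from the cheap PC surrogate.
eps = np.random.default_rng(0).standard_normal(200_000)
surrogate = hermeval(eps, coeffs)
p05, p50, p95 = np.percentile(surrogate, [5, 50, 95])
print(coeffs[0], p05, p50, p95)                # coeffs[0] is the mean hazard
```

Once the coefficients are stored, the fractile step never re-evaluates the hazard integral, which is what makes the postprocessing fast.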


2016
Vol 32 (3)
pp. 1405-1418
Author(s):
Mario Ordaz
Danny Arroyo

Probabilistic seismic hazard analysis (PSHA) is, in essence, a method to deal with uncertainty, the importance of which justifies the use of a formal and rigorous background for its study. Therefore, the purpose of this paper is to contribute to the reflections on how to correctly handle uncertainty in PSHA. We start by studying the simplest case, a Poisson process in which only “aleatory” uncertainty exists; then, we remove the Poisson hypothesis and find expressions for the occurrence probabilities of earthquakes in given time frames for general non-Poisson processes. Later, we include a simple variety of epistemic uncertainty and show that the resulting process is not Poissonian anymore, so computation of probabilities has to be made taking into account this fact. Next, we give a rigorous rule to combine uncertainties of aleatory and epistemic origin, which gives reasonable criteria to decide whether the epistemic uncertainty is large or not. Also, we propose unambiguous guidelines to decide whether a particular class of uncertainty has to be included in the hazard calculations as epistemic or as aleatory. Finally, we discuss the problem of how our estimates could differ if we wrongly considered that our epistemic uncertainty is of aleatory nature, or vice versa.
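The point that epistemic uncertainty destroys the Poisson property can be checked with a two-branch example: the mixture exceedance probability E[1 - exp(-lambda t)] differs from the Poisson probability computed with the mean rate, by Jensen's inequality. The rates and weights below are hypothetical.

```python
import numpy as np

# Two equally weighted epistemic branches for the annual rate (hypothetical).
rates = np.array([0.01, 0.03])     # events per year
weights = np.array([0.5, 0.5])
t = 50.0                           # exposure time in years

mean_rate = np.sum(weights * rates)
p_mixture = np.sum(weights * (1.0 - np.exp(-rates * t)))  # correct mixture
p_poisson = 1.0 - np.exp(-mean_rate * t)                  # Poisson w/ mean rate
print(p_mixture, p_poisson)        # mixture gives the smaller probability
```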

