A Statistical Method for Associating Earthquakes with Their Source Faults in Southern California

2020 ◽  
Vol 110 (1) ◽  
pp. 213-225
Author(s):  
Walker S. Evans ◽  
Andreas Plesch ◽  
John H. Shaw ◽  
Natesh L. Pillai ◽  
Ellen Yu ◽  
...  

ABSTRACT We present a new statistical method for associating earthquakes with their source faults in the Southern California Earthquake Center’s 3D Community Fault Model (CFM; Plesch et al., 2007), both in near-real time and for historical earthquakes. The method uses the hypocenter location, focal mechanism orientation, and earthquake sequencing to produce the probabilities of association between a given earthquake and each fault in the CFM, as well as the probability that the event occurred on a fault not represented in the CFM. We used a set of known likely associations (the Known Likely Sets) as training or testing data and demonstrated that our models perform effectively on these examples and should be expected to perform well on other earthquakes with similar characteristics, including the full catalog of southern California earthquakes (Hauksson et al., 2012). To produce near-real-time associations for future earthquakes, the models have been implemented as an R script and connected to the Southern California Seismic Network data processing system, operated by the California Institute of Technology and the U.S. Geological Survey, to automatically produce fault associations for earthquakes of M≥3.0 as they occur. To produce historical associations, we apply the method to the most recent CFM version (v.5.2), yielding modeled historical associations for all events of M≥3.0 in the catalog of southern California earthquakes from 1981 to 2016. More than 80% of these events, and 99% of the seismic moment released within the region covered by the CFM, had a primary association with a CFM fault. The models can help identify clusters of small earthquakes that indicate the onset of activity associated with major faults. The method will also assist in communicating objective information about the faults that source earthquakes to the scientific community and the general public. In the event of a damaging southern California earthquake, the near-real-time association will provide valuable information regarding the similarity of the current event to forecast scenarios, potentially aiding in earthquake response.
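As a rough illustration of the kind of probabilistic association the abstract describes, the sketch below assigns each candidate fault a weight that decays with hypocenter-to-fault distance, reserves a fixed background weight for the "not in CFM" outcome, and normalizes. The fault names, distances, exponential kernel, and parameter values are all invented for illustration; the paper's actual model additionally uses focal mechanism orientation and earthquake sequencing.

```python
import math

def associate(hypo_fault_dists_km, background=0.05, scale_km=3.0):
    """Toy association: weight each fault by exponential decay in
    hypocenter-to-fault distance, add a fixed background weight for
    'no CFM fault', then normalize into probabilities."""
    weights = {f: math.exp(-d / scale_km) for f, d in hypo_fault_dists_km.items()}
    weights["(not in CFM)"] = background
    total = sum(weights.values())
    return {f: w / total for f, w in weights.items()}

# Hypothetical distances (km) from one hypocenter to three CFM fault surfaces
probs = associate({"Newport-Inglewood": 1.2, "Palos Verdes": 6.5, "Compton": 9.0})
```

The normalization guarantees the fault probabilities and the "not in CFM" probability sum to one, mirroring the exhaustive set of outcomes described in the abstract.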

2019 ◽  
Vol 109 (4) ◽  
pp. 1563-1570 ◽  
Author(s):  
Zefeng Li ◽  
Egill Hauksson ◽  
Jennifer Andrews

Abstract Modern seismic networks commonly equip a station with multiple sensors to extend the frequency band and the dynamic range of the data recorded at the station. In addition, in our recent study we showed that comparing data from co-located seismometers and accelerometers is useful for detecting instrument malfunctions and monitoring data quality. In this study, we extend the comparison of data from different co-located sensors to two other applications: (1) amplitude calibration of vertical short-period sensors, using strong-motion sensors as the baseline, and (2) measurement of the orientation discrepancy between strong-motion and broadband sensors. We perform systematic analyses of data recorded by the California Institute of Technology/U.S. Geological Survey Southern California Seismic Network. In the first application, we compare the amplitude of data from vertical short-period sensors to that of data from co-located strong-motion sensors and measure amplitude calibration factors for 93 short-period sensors. Among them, 49 stations have factors near 1.0 and 42 near 0.6, with two outliers: GFF at 0.3 and CHI at 1.3. These values are found to be related to the sensors’ sensitivity values. In the second application, we measure the orientation discrepancy between 222 co-located broadband and strong-motion sensor pairs. All vertical orientation differences are within 5°. However, the horizontal orientation differences of 22 stations exceed 6°, and four of these stations are reversed, i.e., rotated 180° from the expected orientation. These measurements have been communicated to network operators, and fixes are being applied. This study, together with our previously developed data monitoring framework, demonstrates that comparison of different co-located sensors is a simple and effective tool for a broad range of seismic data assessment and instrument calibration tasks.
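A minimal sketch of the amplitude-calibration idea: for each event recorded by both co-located sensors, take the ratio of peak amplitudes (assumed already converted to a common physical unit) and use the median ratio across events as the calibration factor. The function name, sample values, and the use of a simple median are assumptions for illustration, not the study's actual procedure.

```python
import statistics

def calibration_factor(short_period_peaks, strong_motion_peaks):
    """Toy calibration: per-event ratio of peak amplitudes from a vertical
    short-period sensor to those of a co-located strong-motion sensor;
    the median over events suppresses outlier events."""
    ratios = [sp / sm for sp, sm in zip(short_period_peaks, strong_motion_peaks)]
    return statistics.median(ratios)

# Hypothetical peak amplitudes for five events seen by both sensors
factor = calibration_factor([1.9, 2.1, 2.0, 2.2, 1.8],
                            [2.0, 2.0, 2.0, 2.0, 2.0])
```

A factor near 1.0 would suggest a correctly calibrated sensor, while a systematic offset (like the ∼0.6 group in the abstract) would point to a sensitivity-related discrepancy.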


2018 ◽  
Vol 57 (6) ◽  
pp. 1337-1352 ◽  
Author(s):  
Changhyoun Park ◽  
Christoph Gerbig ◽  
Sally Newman ◽  
Ravan Ahmadov ◽  
Sha Feng ◽  
...  

Abstract To study regional-scale carbon dioxide (CO2) transport, temporal variability, and budget over the Southern California Air Basin (SoCAB) during the California Research at the Nexus of Air Quality and Climate Change (CalNex) 2010 campaign period, a model that couples the Weather Research and Forecasting (WRF) Model with the Vegetation Photosynthesis and Respiration Model (VPRM) has been used. Our numerical simulations use anthropogenic CO2 emissions from the Hestia Project 2010 fossil-fuel CO2 emissions data products, along with VPRM parameters optimized at "FLUXNET" sites, for biospheric CO2 fluxes over SoCAB. The simulated meteorological conditions have been validated with ground and aircraft observations, as well as with background CO2 concentrations from the coastal Palos Verdes site. The model captures the temporal pattern of CO2 concentrations at the ground site at the California Institute of Technology in Pasadena, but it overestimates the magnitude in the early daytime. Analysis of CO2 by wind direction reveals that the overestimate is due to advection from the south and southwest, where downtown Los Angeles is located. The model also captures the vertical profile of CO2 concentrations along the flight tracks. The optimized VPRM parameters have significantly improved simulated net ecosystem exchange at each vegetation-class site and thus the regional CO2 budget. The daytime biospheric contribution ranges from approximately −24% to −20% of the total anthropogenic CO2 emissions during the study period.
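The biospheric side of the coupled model can be illustrated with a toy version of a VPRM-style net ecosystem exchange: gross uptake saturates with photosynthetically active radiation (PAR) and is modulated by an enhanced vegetation index (EVI) and temperature/phenology/water scalars, while respiration is linear in temperature. All parameter values below are placeholders for illustration, not the optimized values from the FLUXNET fitting described above.

```python
def vprm_nee(par, temp_c, evi, t_scale, p_scale, w_scale,
             lam=0.2, par0=250.0, alpha=0.3, beta=1.0):
    """Toy VPRM-style net ecosystem exchange: NEE = R - GEE, where gross
    uptake (GEE) saturates with light (PAR) and respiration (R) is linear
    in temperature. Negative NEE means net CO2 uptake by the biosphere."""
    gee = lam * t_scale * p_scale * w_scale * evi * par / (1.0 + par / par0)
    resp = alpha * temp_c + beta
    return resp - gee

# Hypothetical midday conditions: photosynthetic uptake should dominate
nee = vprm_nee(par=1000.0, temp_c=20.0, evi=0.5,
               t_scale=0.9, p_scale=1.0, w_scale=1.0)
```

With these illustrative numbers the daytime NEE comes out negative, the sign convention behind the negative biospheric contributions (−24% to −20%) quoted in the abstract.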


2014 ◽  
Vol 96 (4) ◽  
pp. 373-404
Author(s):  
Hunter Hollins

While spectator interest got aircraft off the ground, scientific inquiry initially fueled advances in design. But from a very early date, military application was a driving force. The histories of aircraft, the California Institute of Technology (Caltech), and Jet Propulsion Laboratory (JPL) bring to light the relative roles of science and military in the development of aerospace in Southern California.


Author(s):  
William F. Chambers ◽  
Arthur A. Chodos ◽  
Roland C. Hagan

TASK8 was designed as an electron microprobe control program with maximum flexibility and versatility, lending itself to a wide variety of applications. While using TASK8 in the microprobe laboratory of the Los Alamos National Laboratory, we decided to incorporate the capability of using subroutines that perform specific end-member calculations for nearly any type of mineral phase that might be analyzed in the laboratory. This procedure minimizes the need for post-processing of the data to perform such calculations as element ratios or end-member or formula proportions. It also allows real-time assessment of each data point. The use of unique “mineral codes” to specify the list of elements to be measured and the type of calculation to perform on the results was first used in the microprobe laboratory at the California Institute of Technology to optimize the analysis of mineral phases. This approach was used to create a series of subroutines in TASK8 that are called by a three-letter code.
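The mineral-code dispatch scheme can be sketched as a lookup table mapping three-letter codes to end-member calculation routines. The codes, mineral chemistry, and function names below are invented for illustration and are not TASK8's actual codes or algorithms.

```python
def feldspar_endmembers(moles):
    """Toy end-member split for feldspar: An/Ab/Or proportions from
    Ca, Na, and K cation mole fractions."""
    total = moles["Ca"] + moles["Na"] + moles["K"]
    return {"An": moles["Ca"] / total,
            "Ab": moles["Na"] / total,
            "Or": moles["K"] / total}

def olivine_endmembers(moles):
    """Toy Fo/Fa split for olivine from Mg and Fe cation mole fractions."""
    total = moles["Mg"] + moles["Fe"]
    return {"Fo": moles["Mg"] / total, "Fa": moles["Fe"] / total}

# Hypothetical three-letter mineral codes dispatching to the calculations,
# in the spirit of the scheme described above (codes invented here)
MINERAL_CODES = {"FSP": feldspar_endmembers, "OLV": olivine_endmembers}

result = MINERAL_CODES["OLV"]({"Mg": 1.8, "Fe": 0.2})
```

Dispatching through a code table like this is what lets each analyzed point be reduced to end-member proportions in real time, without a separate post-processing pass.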


2021 ◽  
pp. 100489
Author(s):  
Paul La Plante ◽  
P.K.G. Williams ◽  
M. Kolopanis ◽  
J.S. Dillon ◽  
A.P. Beardsley ◽  
...  

2002 ◽  
Author(s):  
Wei Liu ◽  
Zeying Chi ◽  
Wenjian Chen

Energies ◽  
2021 ◽  
Vol 14 (11) ◽  
pp. 3322
Author(s):  
Sara Alonso ◽  
Jesús Lázaro ◽  
Jaime Jiménez ◽  
Unai Bidarte ◽  
Leire Muguira

Smart grid endpoints need to use two environments within a processing system (PS): one with a Linux-type operating system (OS) running on the Arm Cortex-A53 cores for management tasks, and the other with a standalone executable or a real-time OS running on the Arm Cortex-R5 cores. The Xen hypervisor and the OpenAMP framework allow this, but they may introduce delays in the system, and some messages in the smart grid require a latency lower than 3 ms. In this paper, Linux thread latencies are characterized with the Cyclictest tool. It is shown that when the Xen hypervisor is used, this scenario is not suitable for the smart grid, as it does not meet the 3 ms timing constraint. Then, standalone execution as the real-time part is evaluated by measuring the delay to handle an interrupt created in programmable logic (PL). The standalone application was run on the A53 and R5 cores, with the Xen hypervisor and the OpenAMP framework. All of these scenarios met the 3 ms constraint. The main contribution of the present work is the detailed characterization of each real-time execution, in order to facilitate selecting the most suitable one for each application.
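The pass/fail check behind the 3 ms constraint can be sketched as follows: given wake-up or interrupt-handling latencies collected in a Cyclictest-style run, verify that the worst observed latency (or a chosen percentile of it) stays within the deadline. The function and the sample values are illustrative assumptions, not measurements from the paper.

```python
def meets_deadline(latencies_us, deadline_us=3000, percentile=100):
    """Check whether measured latencies (in microseconds) stay within a
    deadline. With percentile=100 the maximum observed latency must meet
    the deadline, the usual criterion for hard real-time messages."""
    ordered = sorted(latencies_us)
    idx = min(len(ordered) - 1, int(len(ordered) * percentile / 100))
    return ordered[idx] <= deadline_us

# Hypothetical sample runs: one within budget, one with a 3.5 ms outlier
ok_run = meets_deadline([45, 60, 120, 800, 2100])
bad_run = meets_deadline([45, 60, 120, 800, 3500])
```

Judging each scenario by its worst-case latency, rather than its mean, is what makes a single outlier (as Cyclictest reports) enough to disqualify a configuration for the smart grid.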

