Gutenberg–Richter B-Value Time Series Forecasting: A Weighted Likelihood Approach

Forecasting, 2021, Vol. 3 (3), pp. 561–569
Author(s):  
Matteo Taroni ◽  
Giorgio Vocalelli ◽  
Andrea De Polis

We introduce a novel approach, based on weighted likelihood, to estimate the temporal variation of the b-value parameter of the Gutenberg–Richter law. This methodology estimates the b-value from the full history of the available data in a data-driven setting. We test it against the classical “rolling window” approach using both a high-definition Italian seismic catalogue and a global catalogue of large-magnitude events. The weighted likelihood approach outperforms the competing methods and identifies the optimal amount of past information relevant to the estimate.
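As a rough illustration of the idea (not the authors' implementation), the sketch below computes a time-weighted b-value with the weighted analogue of Aki's (1965) maximum-likelihood estimator, using exponential time-decay weights in place of the paper's data-driven weighting; the decay scale tau, the function name, and the synthetic catalogue are assumptions.

```python
import numpy as np

def weighted_b_value(mags, times, t_now, mc, tau):
    """Weighted analogue of Aki's (1965) b-value MLE.
    Events are weighted by exp(-age / tau), so tau controls how
    much past information enters the estimate."""
    mags = np.asarray(mags, dtype=float)
    times = np.asarray(times, dtype=float)
    keep = mags >= mc                       # completeness cutoff
    w = np.exp(-(t_now - times[keep]) / tau)
    m = mags[keep]
    mean_mag = np.sum(w * m) / np.sum(w)    # weighted mean magnitude
    return np.log10(np.e) / (mean_mag - mc)

# Toy check on a synthetic catalogue with true b = 1.0:
rng = np.random.default_rng(0)
mc = 2.0
mags = mc + rng.exponential(scale=np.log10(np.e), size=5000)
times = np.sort(rng.uniform(0.0, 50.0, size=5000))  # event times in years
print(weighted_b_value(mags, times, t_now=50.0, mc=mc, tau=10.0))
```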

2019
Author(s):  
Giulio Caravagna ◽  
Timon Heide ◽  
Marc Williams ◽  
Luis Zapata ◽  
Daniel Nichol ◽  
...  

Abstract The vast majority of cancer next-generation sequencing data consist of bulk samples composed of mixtures of cancer and normal cells. To study tumor evolution, subclonal reconstruction approaches based on machine learning are used to separate subpopulations of cancer cells and reconstruct their ancestral relationships. However, current approaches are entirely data-driven and agnostic to evolutionary theory. We demonstrate that systematic errors occur in subclonal reconstruction if tumor evolution is not accounted for, and that these errors increase when multiple samples are taken from the same tumor. To address this issue, we present a novel approach to model-based subclonal reconstruction that combines data-driven machine learning with evolutionary theory. Using public, synthetic, and newly generated data, we show that the method is more robust and accurate than current techniques on both single-sample and multi-region sequencing data. With careful data curation and interpretation, we show how the method minimizes the confounding factors that affect non-evolutionary methods, leading to a more accurate recovery of the evolutionary history of human tumors.
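The published method couples probabilistic clustering of variant allele frequencies (VAFs) with population-genetics theory. As a hedged sketch, the snippet below illustrates only the theoretical ingredient, the neutral 1/f tail of Williams et al. (2016), on synthetic data; the frequency window, function name, and fit diagnostics are illustrative assumptions rather than the paper's pipeline.

```python
import numpy as np

def neutral_tail_fit(vaf, f_min=0.10, f_max=0.25):
    """Williams et al. (2016) neutrality check: under neutral growth,
    the cumulative number of subclonal mutations M(f) with frequency
    >= f grows linearly in 1/f. A near-perfect linear fit suggests the
    low-frequency tail is neutral rather than a hidden subclone."""
    vaf = np.asarray(vaf, dtype=float)
    grid = np.linspace(f_min, f_max, 50)
    m_f = np.array([np.sum((vaf >= f) & (vaf < f_max)) for f in grid])
    x = 1.0 / grid - 1.0 / f_max      # M(f) should be proportional to x
    slope, intercept = np.polyfit(x, m_f, 1)
    resid = m_f - (slope * x + intercept)
    r2 = 1.0 - resid.var() / m_f.var()
    return slope, r2

# Toy neutral tail with density ~ 1/f^2 on [0.05, 0.25], via inverse CDF:
rng = np.random.default_rng(1)
u = rng.uniform(size=20000)
f = 1.0 / (1.0 / 0.05 - u * (1.0 / 0.05 - 1.0 / 0.25))
print(neutral_tail_fit(f))   # r2 close to 1 indicates a neutral tail
```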


Author(s):  
Matteo Taroni ◽  
Jiancang Zhuang ◽  
Warner Marzocchi

Abstract The spatial variability of the magnitude–frequency distribution is important for improving earthquake forecasting capabilities at different time scales. Here, we develop a novel approach, based on weighted maximum-likelihood estimation, to build a spatial model of the b-value parameter of the Gutenberg–Richter law and its uncertainty, including for earthquake catalogs with a time-varying completeness magnitude. We also provide a guideline, based on the Bayes factor, for measuring the importance of the b-value spatial variability relative to a model with a spatially uniform b-value. Finally, we apply the procedure to a new Italian instrumental earthquake catalog covering 1960 to 2019 to investigate the b-value spatial variability over the Italian territory.
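To make the Bayes-factor guideline concrete, here is a minimal plug-in comparison between a spatially varying model (one local b per event, e.g. from a kernel-weighted MLE at its epicentre) and a single uniform b. This is a likelihood-ratio surrogate under assumed point estimates, not the paper's actual Bayes-factor computation, which would integrate over priors on b; all names are hypothetical.

```python
import numpy as np

def gr_loglik(mags, mc, b):
    """Log-likelihood of magnitudes above mc under the G-R law:
    M - mc is exponential with rate beta = b * ln(10). Accepts a
    scalar b or one b per event."""
    beta = np.asarray(b) * np.log(10.0)
    return np.sum(np.log(beta) - beta * (np.asarray(mags) - mc))

def log_evidence_ratio(mags, mc, b_local, b_uniform):
    """Plug-in log-ratio favouring the spatially varying model when
    positive. b_local holds one b-value per event (from a prior
    spatial fit); b_uniform is a single catalogue-wide estimate."""
    return gr_loglik(mags, mc, b_local) - gr_loglik(mags, mc, b_uniform)
```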


Author(s):  
Matteo Taroni ◽  
Jiancang Zhuang ◽  
Warner Marzocchi

Abstract Taroni et al. (2021; hereafter TZM21) proposed a method for spatial b-value mapping based on weighted-likelihood estimation and applied it to the Italian region as a tutorial example. In the accompanying comment, Gulia et al. (2021; hereafter GGW21) did not challenge TZM21's method, but argued that the catalog used by TZM21 is contaminated by quarry blasts, introducing a bias that may affect any seismotectonic or hazard interpretation. Although the application to the Italian territory in TZM21 was only a tutorial example, and we purposely did not discuss the meaning of the results in seismotectonic or seismic-hazard terms (which would have required many more analyses), we acknowledge the potential role of quarry blasts and add some further analysis here. We thank GGW21 for giving us this opportunity. Removing the part of the catalog contaminated by quarry blasts and applying the same analysis as in TZM21, we obtain results very similar to those reported in TZM21; specifically, only one region, characterized by a low rate of natural seismicity, shows a marked effect of the quarry blasts on the b-value.


2021, Vol. 13 (3), pp. 63
Author(s):  
Maghsoud Morshedi ◽  
Josef Noll

Video conferencing services based on the web real-time communication (WebRTC) protocol are growing in popularity among Internet users as multi-platform solutions enabling interactive communication from anywhere, especially during this pandemic era. Meanwhile, Internet service providers (ISPs) have deployed fiber links and customer premises equipment that operate according to the recent 802.11ac/ax standards and promise users the ability to establish uninterrupted video conferencing calls with ultra-high-definition video and audio quality. However, the best-effort nature of 802.11 networks and the high variability of wireless medium conditions hinder users from experiencing uninterrupted, high-quality video conferencing. This paper presents a novel approach to estimating the perceived quality of service (PQoS) of video conferencing using only 802.11-specific network performance parameters collected from Wi-Fi access points (APs) on customer premises. This study produced datasets comprising 802.11-specific network performance parameters collected from off-the-shelf Wi-Fi APs operating under the 802.11g/n/ac/ax standards on both the 2.4 and 5 GHz frequency bands to train machine learning algorithms. In this way, we achieved classification accuracies of 92–98% in estimating the level of PQoS of video conferencing services on various Wi-Fi networks. To efficiently troubleshoot wireless issues, we further analyzed the machine learning model to correlate features in the model with the root causes of quality degradation. Thus, ISPs can utilize the approach presented in this study to provide predictable and measurable wireless quality by implementing a non-intrusive quality monitoring approach in the form of edge computing that preserves customers’ privacy while reducing the operational costs of monitoring and data analytics.
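As a rough sketch of such a pipeline, the snippet below trains a random-forest classifier on synthetic AP statistics and reads off feature importances for root-cause analysis. The feature names, the three-level PQoS labelling, and the choice of random forest are illustrative assumptions; the study's datasets and model are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical per-interval AP features (not the paper's exact set):
FEATURES = ["rssi_dbm", "phy_rate_mbps", "retry_ratio",
            "channel_util", "clients"]

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.normal(-60, 8, n),        # rssi_dbm
    rng.uniform(20, 600, n),      # phy_rate_mbps
    rng.beta(2, 10, n),           # retry_ratio
    rng.uniform(0, 1, n),         # channel_util
    rng.integers(1, 20, n),       # associated clients
])
# Toy label: quality degrades with weak RSSI, retries, and utilisation.
score = -X[:, 0] / 10 + 5 * X[:, 2] + 3 * X[:, 3]
y = np.digitize(score, np.quantile(score, [0.33, 0.66]))  # 3 PQoS levels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
# Feature importances hint at the root cause of quality degradation:
print(dict(zip(FEATURES, clf.feature_importances_.round(3))))
```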


2021, Vol. 12 (1)
Author(s):  
Jared L. Callaham ◽  
James V. Koch ◽  
Bingni W. Brunton ◽  
J. Nathan Kutz ◽  
Steven L. Brunton

Abstract Throughout the history of science, physics-based modeling has relied on judiciously approximating observed dynamics as a balance between a few dominant processes. However, this traditional approach is mathematically cumbersome and only applies in asymptotic regimes where there is a strict separation of scales in the physics. Here, we automate and generalize this approach to non-asymptotic regimes by introducing the idea of an equation space, in which different local balances appear as distinct subspace clusters. Unsupervised learning can then automatically identify regions where groups of terms may be neglected. We show that our data-driven balance models successfully delineate dominant balance physics in a much richer class of systems. In particular, this approach uncovers key mechanistic models in turbulence, combustion, nonlinear optics, geophysical fluids, and neuroscience.
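The snippet below is a toy illustration of the equation-space idea on a damped driven oscillator: each time sample becomes a vector of term values, and a Gaussian mixture groups samples by which balance is active. The paper applies Gaussian mixture models together with sparse principal component analysis to PDE field data; this ODE, its parameters, and the two-cluster choice are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.mixture import GaussianMixture

# Damped driven oscillator: x'' + c*x' + k*x = F*cos(w*t).
# Early on, the large transient dominates; later, the forced
# response sets the balance.
c, k, F, w = 0.4, 1.0, 2.0, 3.0
rhs = lambda t, y: [y[1], F * np.cos(w * t) - c * y[1] - k * y[0]]
t = np.linspace(0, 60, 4000)
sol = solve_ivp(rhs, (0, 60), [5.0, 0.0], t_eval=t)
x, xdot = sol.y
xddot = F * np.cos(w * t) - c * xdot - k * x

# "Equation space": one row per time sample, one column per term;
# the four columns sum to ~0 by construction.
terms = np.column_stack([xddot, c * xdot, k * x, -F * np.cos(w * t)])
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(
    np.log10(np.abs(terms) + 1e-12))   # cluster on term magnitudes

# Mean magnitude of each term per cluster shows which terms may be
# neglected in each regime.
for lab in np.unique(labels):
    print(lab, np.abs(terms[labels == lab]).mean(axis=0).round(3))
```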


ZDM, 2021
Author(s):  
Gert Schubring

Abstract The aspiration of this paper is to develop a novel approach towards investigating the socio-political history of mathematics teaching in educational systems. Traditionally, historical studies are confined to just one country, the author’s country. Broader approaches address international developments by confronting and comparing global and local aspects—revealing general patterns and more specific ‘local’ structures and characteristics. Yet, already in antiquity and medieval times, the specific characteristic of mathematics teaching, namely to operate at the crossroads of general education and vocational training, proved to be intimately tied to the functioning of the particular political system. In pre-modern times, however, a truly international pattern emerged for the first time: European powers conquered, occupied and colonised overseas regions. Given that educational systems were emerging at the same time within these states, they often transmitted elements of these structures to their colonies. This phenomenon included mathematics, and the history of its teaching is analysed here as a part of coloniality. It is shown that this was not a uniform process, and the differences between the various colonial powers are discussed. The involvement of mathematics in the process of decolonisation is addressed, as well as its role in the tension between continued coloniality and movements of decoloniality. Finally, the general framework provided for studying socio-political processes connected with establishing mathematics teaching within public educational systems is applied, in order to analyse recent coloniality practices effected by international achievement studies.


Vascular, 2021, pp. 170853812110489
Author(s):  
Nathan W Kugler ◽  
Brian D Lewis ◽  
Michael Malinowski

Objectives: Axillary pullout syndrome is a complex, potentially fatal complication following axillary-femoral bypass graft creation. The re-operative nature, in addition to ongoing hemorrhage, makes for a complicated and potentially morbid repair.
Methods: We present the case of a 57-year-old man with a history of a previous left axillary-femoral-femoral bypass who presented with acute limb-threatening ischemia as a result of bypass thrombosis, managed with a right axillary-femoral bypass for limb salvage. His postoperative course was complicated by an axillary anastomotic dehiscence, which occurred while he was recovering in inpatient rehabilitation and resulted in acute, life-threatening hemorrhage. He was managed with a novel hybrid approach in which a retrograde stent graft was initially placed across the anastomotic dehiscence to control hemorrhage. He then underwent exploration, decompression, and interposition graft repair, utilizing the newly placed stent graft to reinforce the redo axillary anastomosis.
Results and Conclusion: Compared with a traditional operative approach, the hybrid endovascular and open approach limited ongoing hemorrhage while providing a more stable platform for repair and graft revascularization. A hybrid approach to the management of axillary pullout syndrome provides a safe, effective means of managing axillary anastomotic dehiscence while minimizing the morbidity of ongoing hemorrhage.


2018, Vol. 48 (5), pp. 637–647
Author(s):  
Rebecca Lemov

This article traces the rise of “predictive” attitudes to crime prevention. After a brief summary of the current spread of predictive policing based on person-centered and place-centered mathematical models, an episode in the scientific study of future crime is examined. At UCLA between 1969 and 1973, a well-funded “violence center” occasioned great hopes that the quotient of human “dangerousness”—potential violence against other humans—could be quantified and thereby controlled. At the core of the center, under the direction of interrogation expert and psychiatrist Louis Jolyon West, was a project to gather unprecedented amounts of behavioral data and centrally store it to identify emergent crime. Protesters correctly seized on the violence center as a potential site of racially targeted experimentation in psychosurgery and an example of iatrogenic science. Yet the eventual spectacular failure of the center belies an ultimate success: its data-driven vision itself predicted the Philip K. Dick–style PreCrime policing now emerging. The UCLA violence center thus offers an alternative genealogy to predictive policing. This essay is part of a special issue entitled Histories of Data and the Database edited by Soraya de Chadarevian and Theodore M. Porter.

