changepoint analysis
Recently Published Documents


TOTAL DOCUMENTS: 54 (FIVE YEARS: 9)

H-INDEX: 12 (FIVE YEARS: 0)

2021 ◽  
pp. 1-8
Author(s):  
Abhishek Ghosh ◽  
Adam Bisaga ◽  
Simranjit Kaur ◽  
Tathagata Mahintamani

Introduction: There is a need to strengthen the standard surveillance of the opioid overdose crisis in the USA. The role of Google Trends (GT) was explored in this context. Methods: In this study, a systematic GT search was done for the period from January 2004 to December 2018. “Naloxone” and “drug overdose” were chosen as search inputs. Using locally weighted scatterplot smoothing, we locally regressed and smoothed the relative search data generated by the GT search. We conducted a changepoint analysis (CPA) to detect statistically significant changes in the “naloxone” trend from 2004 to 2018. Cross-correlation function analyses were done to examine the correlation between two time series: year-wise relative search volume (RSV) for “naloxone” and “drug overdose” and the age-adjusted drug overdose mortality rate. Pearson’s correlation was performed for the state-wise age-adjusted mortality rate due to drug overdose and RSV for “naloxone” and “drug overdose.” Results: The smoothed and regressed GT curve for “naloxone” was similar to the “opioid overdose” trend published by the National Center for Health Statistics. The CPA showed two statistically significant changepoints, in 2011 and 2015. Cross-correlation analyses of year-wise RSV for “naloxone” and “drug overdose” showed a significantly positive correlation with the age-adjusted drug overdose mortality rate at lag zero. State-wise RSV for “naloxone” and “drug overdose” also showed a strong and significant positive correlation with the state-wise mortality data. Discussion/Conclusion: Inexpensive, publicly accessible, real-time GT data could supplement and strengthen the monitoring of the opioid overdose epidemic if used in conjunction with existing official data sources.
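As a rough illustration of the pipeline this abstract describes, the sketch below LOESS-smooths a synthetic relative-search-volume series, runs a changepoint detection, and reports the cross-correlation at lag zero. It uses Python with the `ruptures` and `statsmodels` packages rather than whatever software the authors used; the data (loosely echoing the reported 2011 and 2015 shifts), smoothing fraction, and number of changepoints requested are made-up assumptions.

```python
import numpy as np
import ruptures as rpt
from statsmodels.nonparametric.smoothers_lowess import lowess
from statsmodels.tsa.stattools import ccf

rng = np.random.default_rng(0)
years = np.arange(2004, 2019)
# Stand-in RSV and mortality series, not real GT or CDC data
rsv_naloxone = np.where(years < 2011, 30.0,
                        np.where(years < 2015, 55.0, 85.0)) + rng.normal(0, 3, years.size)
mortality = 0.25 * rsv_naloxone + rng.normal(0, 1, years.size)

# Locally weighted scatterplot smoothing of the search series
smoothed = lowess(rsv_naloxone, years, frac=0.5, return_sorted=False)

# Changepoint analysis on the smoothed yearly series (binary segmentation)
breaks = rpt.Binseg(model="l2").fit(smoothed).predict(n_bkps=2)
print("first year of each detected regime shift:",
      [int(years[i]) for i in breaks[:-1]])

# Cross-correlation between search volume and mortality; index 0 is lag zero
print("CCF at lag 0:", round(float(ccf(rsv_naloxone, mortality)[0]), 2))
```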


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Christian Möllmann ◽  
Xochitl Cormon ◽  
Steffen Funk ◽  
Saskia A. Otto ◽  
Jörn O. Schmidt ◽  
...  

Abstract Understanding tipping-point dynamics in harvested ecosystems is of crucial importance for sustainable resource management, because ignoring their existence imperils the social-ecological systems that depend on them. Fisheries collapses provide the best-known examples of tipping points being crossed, with catastrophic ecological, economic, and social consequences. However, present-day fisheries management systems still largely ignore the potential of their resources to exhibit such abrupt changes towards irreversible, low-productivity states. Using a combination of statistical changepoint analysis and stochastic cusp modelling, here we show that Western Baltic cod is beyond such a tipping point, caused by unsustainable exploitation levels that failed to account for changing environmental conditions. Furthermore, climate change stabilizes a novel and likely irreversible low-productivity state of this fish stock, which is not adapted to a fast-warming environment. We hence argue that ignorance of non-linear resource dynamics has caused the demise of an economically and culturally important social-ecological system, which calls for better adaptation of fisheries systems to climate change.
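The stochastic cusp model named here has a standard canonical form, dx = (alpha + beta*x - x**3) dt + sigma dW; the minimal Euler–Maruyama sketch below (not the authors' fitted model) shows how slowly degrading conditions can tip the state into a low regime that persists even when conditions partially recover. All parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_cusp(alphas, beta=1.5, sigma=0.15, dt=0.01, x0=1.0, seed=0):
    """Euler-Maruyama simulation of dx = (alpha + beta*x - x**3) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    x, out = x0, []
    for a in alphas:
        x += (a + beta * x - x**3) * dt + sigma * np.sqrt(dt) * rng.normal()
        out.append(x)
    return np.array(out)

# Sweep the control parameter (think: exploitation pressure plus warming)
# down past the fold and back: the state tips to the low branch and, because
# the potential is bimodal on the return path, stays there (hysteresis).
down = np.linspace(0.5, -1.0, 20000)
traj = simulate_cusp(np.concatenate([down, down[::-1]]))
print("state at end of downward sweep:", round(traj[19999], 2))
print("state after sweeping back to alpha = 0.5:", round(traj[-1], 2))
```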


Author(s):  
Jean-Robert Grasso ◽  
Daniel Amorese ◽  
Abror Karimov

ABSTRACT The activation of tectonic and anthropogenic swarms in time, space, and size remains challenging for seismologists. One remarkably long swarm is the Lacq swarm: it has been ongoing since 1969 and is located in a compound oil–gas field with a complex fluid-manipulation history. Based on the overlap between the volumes where a poroelastic model predicts stress buildup and those where earthquakes occur, gas-reservoir depletion was proposed to control the Lacq seismic swarm. The 2016 Mw 3.9 earthquake, the largest event on the site, is located within a few kilometers below the deep injection well. This raises the question of possible interactions between the 1955–2016 wastewater injections and the Lacq seismicity. Revisiting 60 yr of fluid-manipulation history and seismicity indicates that the impact of the wastewater injections on the Lacq seismicity was previously underestimated. The main lines of evidence toward a wastewater-injection cause are: (1) by 1969, the onset of Lacq seismicity, the cumulative injected volume was sufficient to trigger Mw 3 events; (2) the 1976 start of injection below the gas reservoir occurs only a few years before the sharp increase in seismicity and matches the onset of deep seismicity (below the gas reservoir, at the injection depth); (3) the 2–3-fold increase in injection rate during 2007–2010 precedes the 2013 and 2016 largest events; and (4) 75% of the 2013–2016 events cluster within 4–8 km depths, that is, close to and below the 4.5-km-deep injection well. As quantified by changepoint analysis, our results suggest that timely overlaps between injection operations and seismicity patterns are as decisive as extraction operations in controlling the Lacq seismicity. The seismicity onset is contemporaneous with cumulative stress changes (induced by depletion and injection operations) in the 0.1–1 MPa range. The interplay between injection and extraction is the most probable cause of the Lacq seismicity onset and of its sustenance over time. The injected volume–largest magnitude pair for the Lacq field is in the same range (90% confidence level) as wastewater volume–magnitude pairs reported worldwide, in a wide variety of tectonic settings.
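A hedged sketch of the kind of changepoint analysis mentioned near the end of the abstract: detecting abrupt rate shifts in yearly event counts and reading off the estimated years. The catalog below is synthetic (Poisson counts with made-up rates loosely echoing the 1969 and 1976 milestones), not the Lacq data, and `ruptures` is an assumed tool choice.

```python
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(1)
years = np.arange(1957, 2017)
# Made-up Poisson rates: quiet before 1969, moderate to 1976, higher after
rate = np.where(years < 1969, 0.2, np.where(years < 1976, 2.0, 6.0))
counts = rng.poisson(rate).astype(float)

# Binary segmentation for the two largest mean shifts in the yearly counts
breaks = rpt.Binseg(model="l2").fit(counts).predict(n_bkps=2)
print("estimated onset years:", [int(years[i]) for i in breaks[:-1]])
```

Estimated changepoint years can then be compared against operational milestones (injection onsets, rate increases) to quantify the "timely overlaps" the abstract refers to.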


2021 ◽  
Author(s):  
Michael Hollaway ◽  
Peter Henrys ◽  
Rebecca Killick ◽  
Amber Leeson ◽  
John Watkins

Numerical models are essential tools for understanding the complex and dynamic nature of the natural environment and how it will respond to a changing climate. With ever-increasing volumes of environmental data and increased availability of high-powered computing, these models are becoming more complex and detailed in nature. Therefore, the ability of these models to represent reality is critical to their use and future development. This has presented a number of challenges, including providing research platforms for collaborating scientists to explore big data, develop and share new methods, and communicate their results to stakeholders and decision makers. This work presents an example of a cloud-based research platform known as DataLabs and how it can be used to simplify access to advanced statistical methods (in this case changepoint analysis) for environmental science applications.

A combination of changepoint analysis and fuzzy logic is used to assess the ability of numerical models to capture local-scale temporal events seen in observations. The fuzzy-union-based metric factors in uncertainty of the changepoint location to calculate individual similarity scores between the numerical model and reality for each changepoint in the observed record. The application of the method is demonstrated through a case study on a high-resolution model dataset, which was able to pick up observed changepoints in temperature records over Greenland with varying degrees of success. The case study is presented using the DataLabs framework, demonstrating how the method can be shared with other users of the platform and the results visualised and communicated to users with different areas of expertise.
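The abstract names a fuzzy-union-based metric without spelling it out; the sketch below is one plausible stand-in, under the assumption that each changepoint's location uncertainty is a triangular fuzzy membership over time and that the match score is the peak of the pointwise-minimum overlap with the nearest model changepoint. The function names, widths, and example years are invented for illustration.

```python
import numpy as np

def membership(t, cp, width):
    """Triangular fuzzy membership around a changepoint location."""
    return np.clip(1.0 - np.abs(t - cp) / width, 0.0, 1.0)

def similarity(obs_cp, model_cps, width=5.0):
    """Best fuzzy overlap of one observed changepoint with any model one."""
    grid = np.linspace(obs_cp - width, obs_cp + width, 201)
    mu_obs = membership(grid, obs_cp, width)
    # Pointwise minimum is the fuzzy intersection; its peak is the score
    return max(np.max(np.minimum(mu_obs, membership(grid, m, width)))
               for m in model_cps)

observed = [1995, 2004]   # e.g. shifts seen in an observed temperature record
modelled = [1996, 2010]   # shifts the numerical model reproduces
print([round(similarity(c, modelled), 2) for c in observed])
# -> [0.9, 0.4]: near-match for 1995/1996, weak match for 2004
```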


2021 ◽  
pp. 1-12
Author(s):  
Daniel Kent ◽  
James D. Wilson ◽  
Skyler J. Cranmer

Abstract Across the social sciences, scholars regularly pool effects over substantial periods of time, a practice that produces faulty inferences if the underlying data generating process is dynamic. To help researchers better perform principled analyses of time-varying processes, we develop a two-stage procedure based upon techniques for permutation testing and statistical process monitoring. Given time series cross-sectional data, we break the role of time through permutation inference and produce a null distribution that reflects a time-invariant data generating process. The null distribution then serves as a stable reference point, enabling the detection of effect changepoints. In Monte Carlo simulations, our randomization technique outperforms alternatives for changepoint analysis. A particular benefit of our method is that, by establishing the bounds for time-invariant effects before interacting with actual estimates, it is able to differentiate stochastic fluctuations from genuine changes. We demonstrate the method’s utility by applying it to a popular study on the relationship between alliances and the initiation of militarized interstate disputes. The example illustrates how the technique can help researchers make inferences about where changes occur in dynamic relationships and ask important questions about such changes.
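A compact sketch of the two-stage idea as described: permute the time index to generate a null distribution for a rolling effect estimate under a time-invariant data-generating process, then flag windows whose observed estimate escapes the permutation bounds. The rolling OLS slope, window size, and permutation count below are illustrative stand-ins, not the authors' exact estimator.

```python
import numpy as np

rng = np.random.default_rng(2)
T, window, n_perm = 200, 40, 200
x = rng.normal(size=T)
beta = np.where(np.arange(T) < 120, 0.0, 1.5)   # effect jumps at t = 120
y = beta * x + rng.normal(size=T)

def rolling_slopes(x, y):
    # OLS slope of y on x within each length-`window` window
    return np.array([np.polyfit(x[t:t + window], y[t:t + window], 1)[0]
                     for t in range(T - window)])

observed = rolling_slopes(x, y)

# Break the role of time: permute observations jointly, so any effect is
# forced to be time-invariant, and recompute the rolling estimates
null = np.empty((n_perm, T - window))
for p in range(n_perm):
    idx = rng.permutation(T)
    null[p] = rolling_slopes(x[idx], y[idx])

# Windows where the observed estimate leaves the permutation bounds signal
# a genuine effect change rather than a stochastic fluctuation
lo, hi = np.percentile(null, [2.5, 97.5], axis=0)
flagged = np.where((observed < lo) | (observed > hi))[0]
print("windows flagged as effect changes:", flagged.size, "of", observed.size)
```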


JAMIA Open ◽  
2021 ◽  
Vol 4 (1) ◽  
Author(s):  
John Del Gaizo ◽  
Ken R Catchpole ◽  
Alexander V Alekseyenko

Abstract Motivation: Research & Exploratory Analysis Driven Time-data Visualization (read-tv) is an open-source R Shiny application for visualizing irregularly and regularly spaced longitudinal data. read-tv provides unique filtering and changepoint analysis (CPA) features. The need for these analyses was motivated by research on surgical workflow disruptions in operating-room settings, specifically the analysis of the causes and characteristics of periods of high disruption rates, which are associated with adverse surgical outcomes. Materials and Methods: read-tv is a graphical application and the main component of a package of the same name. read-tv generates and evaluates code to filter and visualize data. Users can view the visualization code from within the application, which facilitates reproducibility. The data input requirements are simple: a table with a time column with no missing values. The input can be either a file or an in-memory data frame, which is effective for rapid visualization during curation. Results: We used read-tv to automatically detect surgical disruption cascades. We found that the most common disruption type during a cascade was training, followed by equipment. Discussion: read-tv fills a need for visualization software for surgical disruptions and other longitudinal data. Every visualization is reproducible: the exact source code that read-tv executes to create a visualization is available from within the application. read-tv is generalizable: it can plot any tabular dataset that meets the simple requirement of a numeric, datetime, or datetime-string column with no missing values. Finally, the tab-based architecture of read-tv is easily extensible: it is relatively simple to add new functionality by implementing a tab in the source code. Conclusion: read-tv enables quick identification of patterns through customizable longitudinal plots, faceting, CPA, and user-specified filters. The package is available on GitHub under an MIT license.
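read-tv itself is an R Shiny application whose API is not shown in the abstract, so the sketch below does not attempt to reproduce it; instead it illustrates the underlying task in Python (for consistency with the other sketches on this page): take a table with a required time column, bin the timestamped disruption events into rates, and locate abrupt rate shifts with changepoint detection. Column names, bin width, and the `ruptures` call are assumptions.

```python
import numpy as np
import pandas as pd
import ruptures as rpt

rng = np.random.default_rng(3)
# Synthetic event log: one row per disruption, with the required time column;
# a burst of events between minutes 100 and 160 mimics a disruption cascade
minutes = np.sort(np.concatenate([rng.uniform(0, 100, 15),
                                  rng.uniform(100, 160, 45),
                                  rng.uniform(160, 240, 10)]))
events = pd.DataFrame({"time_min": minutes,
                       "type": rng.choice(["training", "equipment"], minutes.size)})

# Events per 10-minute bin, then changepoints in that binned rate
bins = (events["time_min"] // 10).astype(int)
rate = bins.value_counts().reindex(range(24), fill_value=0).sort_index()
breaks = rpt.Binseg(model="l2").fit(rate.to_numpy(dtype=float)).predict(n_bkps=2)
print("disruption-rate shifts near minutes:", [int(b) * 10 for b in breaks[:-1]])
```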


2021 ◽  
Author(s):  
Phirime Monyeki

When South Africa is compared to other countries, it has a notably high rate of crime, with a concomitantly high occurrence of murder, residential burglary, drug-related crime, and carjacking (hijacking). The government is desperately seeking solutions that can be implemented to reduce recurrent crime. Reasons proposed to explain high crime trends in different areas include alcohol or drug abuse, low standards of education, poor parenting skills, and a lack of social and vocational skills. This study aimed to gain better insight into crime trends in South Africa using data-mining techniques; decision-making linked to the data could help the government implement a coherent crime strategy to mitigate crime. The crime dataset chosen for this study is publicly available at kaggle.com and was prepared using Python code. The research design served as the overall strategy tying the components of the study together to answer the research questions and attain the research objectives. To identify significant changes, changepoint analysis (CPA) was performed to pinpoint abrupt shifts in the South African crime dataset, using two methods, cumulative sum (CUSUM) and bootstrapping, which together measure the occurrence of change points at stated confidence levels. The CPA outcome showed multiple significant changes and abrupt shifts in several provinces of South Africa. Linear regression (LR) was used to predict future crime trends in South Africa for 2016–2022 based on the 2005–2015 crime statistics. The results showed that crime has been increasing in South Africa, with certain provinces such as the Western Cape, Gauteng, and KwaZulu-Natal identified as crime hotspots. Future studies on crime should focus on a single province to gain insight into the dominant crimes and hotspots within that province, with a view to developing highly specific crime-reduction interventions.
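A small sketch of the CUSUM-with-bootstrap procedure the abstract describes (in the style of Taylor's changepoint method): compute the CUSUM range of the series, then estimate a confidence level as the share of random reorderings whose CUSUM range stays below the observed one. The yearly series and resample count below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
# Made-up yearly crime counts with a level shift after year 6
series = np.concatenate([rng.normal(100, 5, 6), rng.normal(130, 5, 5)])

def cusum_range(x):
    s = np.cumsum(x - x.mean())   # CUSUM of deviations from the overall mean
    return s.max() - s.min()      # range: large when a level shift is present

observed = cusum_range(series)
# Bootstrap: random reorderings mimic "no change"; confidence is the share
# of reorderings whose CUSUM range is smaller than the observed one
boot = np.array([cusum_range(rng.permutation(series)) for _ in range(1000)])
confidence = np.mean(boot < observed)

s = np.cumsum(series - series.mean())
print(f"confidence a change occurred: {confidence:.0%}")
print("change estimated after point:", np.argmax(np.abs(s)) + 1)
```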

