SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

2018 ◽  
Author(s):  
Julie E. Kiang ◽  
Kate Flynn ◽  
Tong Zhai ◽  
Paul Hummel ◽  
Gregory Granato

Water ◽  
2019 ◽  
Vol 11 (4) ◽  
pp. 707 ◽  
Author(s):  
Shawn Dawley ◽  
Yong Zhang ◽  
Xiaoting Liu ◽  
Peng Jiang ◽  
Geoffrey Tick ◽  
...  

Hydrological extremes in the water cycle can significantly affect surface water engineering design, and represent the high-impact response of surface water and groundwater systems to climate change. Statistical analysis of these extreme events provides a convenient way to interpret the nature of, and interaction between, components of the water cycle. This study applies three probability density functions (PDFs), the Gumbel, stable, and stretched Gaussian distributions, to capture the distribution of extremes and the full time series of storm properties (storm duration, intensity, total precipitation, and inter-storm period), stream discharge, lake stage, and groundwater head values observed in the Lake Tuscaloosa watershed, Alabama, USA. To quantify the potentially non-stationary statistics of hydrological extremes, the time-scale local Hurst exponent (TSLHE) was also calculated for the time series data recording both the surface and subsurface hydrological processes. First, results showed that storm duration was the storm property most closely related to groundwater recharge, while intensity also had a close relationship with recharge. These relationships were likely due to the effects of oversaturation and overland flow in storms with extreme total precipitation. Second, the surface water and groundwater series were persistent according to the TSLHE values, because they are relatively slowly evolving systems, while storm properties were anti-persistent since they evolve rapidly in time. Third, the stretched Gaussian distribution was the most effective PDF for capturing the distribution of surface and subsurface hydrological extremes, since this distribution can capture the broad transition from a Gaussian distribution to a power-law one.
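The kind of distribution comparison described above can be sketched in a few lines. This is a minimal illustration with synthetic data, not the study's actual method: SciPy has no stretched Gaussian, so the generalized normal distribution (`scipy.stats.gennorm`) stands in for it, and candidate fits are compared by AIC.

```python
import numpy as np
from scipy import stats

# Synthetic "annual maximum discharge" series; real values would come from
# watershed gauge records. Gumbel is the generating model here by construction.
rng = np.random.default_rng(0)
annual_max = stats.gumbel_r.rvs(loc=10.0, scale=2.0, size=500, random_state=rng)

def fit_aic(dist, data):
    """Fit a SciPy distribution by maximum likelihood and return (AIC, params)."""
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik, params

aic_gumbel, gumbel_params = fit_aic(stats.gumbel_r, annual_max)
aic_gennorm, _ = fit_aic(stats.gennorm, annual_max)  # stretched-Gaussian stand-in

best = "gumbel" if aic_gumbel < aic_gennorm else "gennorm"
print(best, round(gumbel_params[0], 1))
```

On skewed extreme-value data the two-parameter Gumbel fit should beat the symmetric three-parameter generalized normal; on the heavy-tailed subsurface series discussed above, the ordering could reverse.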


2015 ◽  
Vol 51 (1) ◽  
pp. 198-212 ◽  
Author(s):  
Dylan J. Irvine ◽  
Roger H. Cranswick ◽  
Craig T. Simmons ◽  
Margaret A. Shanafield ◽  
Laura K. Lautz

Buildings ◽  
2021 ◽  
Vol 11 (1) ◽  
pp. 21
Author(s):  
Thomas Danel ◽  
Zoubeir Lafhaj ◽  
Anand Puppala ◽  
Sophie Lienard ◽  
Philippe Richard

This article proposes a methodology to measure the productivity of a construction site through the analysis of tower crane data. These data were obtained from a data logger that records a time series of spatial and load data from the lifting machine during the structural phase of a construction project. The first step was data collection, followed by preparation, which consisted of formatting and cleaning the dataset. Then, a visualization step identified which data were the most meaningful for the practitioners. From that, the activity of the tower crane was measured by extracting effective lifting operations, essentially using the load signal. This sampling technique allows statistical analysis of the duration, load, and curvilinear distance of every extracted lifting operation. The resulting statistical distributions and indicators were finally used to compare construction site productivity.
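Extracting lifting operations from a load signal can be sketched as a simple thresholding segmentation: a lift is a maximal run of samples where the recorded load stays above a minimum. The threshold value, sampling rate, and toy signal below are hypothetical, not from the article.

```python
import numpy as np

def extract_lifts(load, fs=1.0, threshold=100.0):
    """Segment a crane load signal into lifting operations.

    A lift is a maximal run of samples with load above `threshold`.
    Returns (start_idx, end_idx, duration_s, max_load) per operation.
    """
    above = load > threshold
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1   # rising edges
    ends = np.flatnonzero(edges == -1) + 1    # falling edges
    if above[0]:                              # signal starts mid-lift
        starts = np.r_[0, starts]
    if above[-1]:                             # signal ends mid-lift
        ends = np.r_[ends, len(load)]
    return [(s, e, (e - s) / fs, load[s:e].max()) for s, e in zip(starts, ends)]

# Toy load signal (kg) sampled at 1 Hz, containing two lifts
sig = np.array([0, 0, 150, 200, 180, 0, 0, 300, 0], dtype=float)
ops = extract_lifts(sig, fs=1.0, threshold=100.0)
print(ops)  # two operations: (2, 5, 3.0, 200.0) and (7, 8, 1.0, 300.0)
```

The per-operation durations and peak loads produced this way are exactly the quantities whose distributions the article uses as productivity indicators.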


2021 ◽  
Vol 13 (14) ◽  
pp. 2675
Author(s):  
Stefan Mayr ◽  
Igor Klein ◽  
Martin Rutzinger ◽  
Claudia Kuenzer

Fresh water is a vital natural resource. Earth observation time-series are well suited to monitor corresponding surface dynamics. The DLR-DFD Global WaterPack (GWP) provides daily information on globally distributed inland surface water based on MODIS (Moderate Resolution Imaging Spectroradiometer) images at 250 m spatial resolution. Operating on this spatiotemporal level comes with the drawback of moderate spatial resolution; only coarse pixel-based surface water quantification is possible. To enhance the quantitative capabilities of this dataset, we systematically access subpixel information on fractional water coverage. For this, a linear mixture model is employed, using classification probability and pure pixel reference information. Classification probability is derived from relative datapoint (pixel) locations in feature space. Pure water and non-water reference pixels are located by combining spatial and temporal information inherent to the time-series. Subsequently, the model is evaluated for different input sets to determine the optimal configuration for global processing and pixel coverage types. The performance of resulting water fraction estimates is evaluated on the pixel level in 32 regions of interest across the globe, by comparison to higher resolution reference data (Sentinel-2, Landsat 8). Results show that water fraction information is able to improve the product’s performance regarding mixed water/non-water pixels by an average of 11.6% (RMSE). With a Nash-Sutcliffe efficiency of 0.61, the model shows good overall performance. The approach enables the systematic provision of water fraction estimates on a global and daily scale, using only the reflectance and temporal information contained in the input time-series.
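A two-endmember linear mixture model of the kind used above can be sketched as follows. This is a simplified illustration, not the GWP implementation: it unmixes a single band from hypothetical pure-water and pure-land reference reflectances, whereas the product derives its endmembers and classification probabilities from the time series itself.

```python
import numpy as np

def water_fraction(pixel_refl, water_refl, land_refl):
    """Two-endmember linear unmixing for fractional water cover.

    Assumes each mixed pixel's reflectance is a linear combination of a
    pure-water and a pure-land endmember; the fraction is clipped to [0, 1].
    """
    f = (pixel_refl - land_refl) / (water_refl - land_refl)
    return np.clip(f, 0.0, 1.0)

# Hypothetical red-band reflectances: water is dark, land is bright
frac = water_fraction(np.array([0.05, 0.15, 0.25]),
                      water_refl=0.05, land_refl=0.25)
print(frac)  # pure water -> 1.0, half-mixed -> 0.5, pure land -> 0.0
```

Locating reliable pure-pixel endmembers is the hard part, which is why the product combines spatial and temporal information from the daily time series to find them.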


2020 ◽  
Vol 8 (6) ◽  
pp. 4590-4596

Monitoring a high-throughput distributed system can be done through statistical analysis of the historical time series of instrumentation data. A data pipeline is a set of data-processing elements connected in series, where the output of one element is the input of the next. Network and cloud data centers generate a large amount of data every second; these data can be gathered as time-series data. A time series is a sequence of measurements taken at successive, equally spaced points in time; that is, the values of specific metrics sampled at a fixed interval. Such time-series data can be gathered from system metrics like CPU, memory, and disk utilization. The TICK and ELK stacks are collections of open-source tools built to make collection, storage, graphing, and alerting on time-series data extremely easy. Telegraf serves as the data collector; the time-series database InfluxDB and Elasticsearch are used for storing and analyzing the data; Grafana and Kibana are used for plotting and visualization. Watchman is utilized for alert refinement: once system metric usage exceeds the specified threshold, an alert is generated and sent to Telegram.
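The threshold-alerting step at the end of this pipeline can be sketched in miniature. This is an illustrative stand-in for the Telegraf-to-alerting flow described above, not its actual configuration; the window size, threshold, and sample values are hypothetical.

```python
from statistics import mean

def check_alerts(samples, threshold=80.0, window=3):
    """Flag points where the rolling mean of a metric exceeds a threshold.

    Mimics, in miniature, the collect -> aggregate -> alert flow: samples
    (e.g. CPU %) are smoothed over a window and compared against a limit.
    Returns (sample_index, rolling_mean) for each alert.
    """
    alerts = []
    for i in range(window, len(samples) + 1):
        avg = mean(samples[i - window:i])
        if avg > threshold:
            alerts.append((i - 1, avg))
    return alerts

cpu = [40, 55, 70, 90, 95, 92, 60]   # hypothetical CPU utilization samples
alerts = check_alerts(cpu)
print(alerts)  # first alert fires at index 4, when the 3-sample mean hits 85
```

In the real stack this comparison runs continuously against the database, and each returned alert would be forwarded to a notification channel such as Telegram.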


Fractals ◽  
1995 ◽  
Vol 03 (04) ◽  
pp. 839-847 ◽  
Author(s):  
A. VESPIGNANI ◽  
A. PETRI ◽  
A. ALIPPI ◽  
G. PAPARO ◽  
M. COSTANTINI

Relaxation processes taking place after microfracturing of laboratory samples give rise to ultrasonic acoustic emission signals. Statistical analysis of the resulting time series has revealed many features which are characteristic of critical phenomena. In particular, the autocorrelation functions obey a power-law behavior, implying a power spectrum of the kind 1/f. The amplitude distribution N(V) of such signals also follows a power law, and the obtained exponents are consistent with those found in other experiments: N(V) dV ≃ V^(−γ) dV, with γ = 1.7 ± 0.2. We also analyzed the distribution N(τ) of the delay time τ between two consecutive acoustic emission events. We found that an N(τ) distribution rather close to a power law constitutes a common feature of all the recorded signals. These experimental results can be considered striking evidence for a critical dynamics underlying the microfracturing processes.
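Estimating a power-law exponent such as γ above is commonly done with the maximum-likelihood (Hill) estimator rather than a log-log regression. The sketch below checks the estimator on synthetic data generated with γ = 1.7, the value reported above; it is an illustration of the technique, not the authors' analysis.

```python
import numpy as np

def hill_exponent(amplitudes, vmin):
    """Maximum-likelihood (Hill) estimate of gamma in N(V) ~ V^(-gamma),
    valid for a continuous power law over V >= vmin."""
    v = np.asarray(amplitudes, dtype=float)
    v = v[v >= vmin]
    return 1.0 + len(v) / np.sum(np.log(v / vmin))

# Synthetic power-law amplitudes with gamma = 1.7, via inverse-CDF sampling:
# F(v) = 1 - (v/vmin)^-(gamma-1)  =>  v = vmin * (1-u)^(-1/(gamma-1))
rng = np.random.default_rng(1)
u = rng.random(50_000)
v = 1.0 * (1.0 - u) ** (-1.0 / (1.7 - 1.0))

gamma_hat = hill_exponent(v, vmin=1.0)
print(round(gamma_hat, 2))
```

The estimator's standard error scales as (γ − 1)/√n, so with 50,000 samples the recovered exponent sits well inside the ±0.2 uncertainty quoted for the experimental data.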


SIAM Review ◽  
1976 ◽  
Vol 18 (2) ◽  
pp. 313-314
Author(s):  
Andrey Feuerverger
