A Data-Based Framework for Identifying a Source Location of a Contaminant Spill in a River System with Random Measurement Errors

Sensors ◽  
2019 ◽  
Vol 19 (15) ◽  
pp. 3378 ◽  
Author(s):  
Jun Hyeong Kim ◽  
Mi Lim Lee ◽  
Chuljin Park

This study addresses the problem of identifying the source location of a contaminant spill in a river system when a sensor network returns observations containing random measurement errors. To solve this problem, we propose a new framework comprising three main steps: (i) spill detection, (ii) data preprocessing, and (iii) source identification. Specifically, we apply a statistical process control chart to detect a contaminant spill from the error-contaminated observations while keeping the false-alarm rate at or below a user-specified value. After a spill is detected, we fit a nonlinear regression model to estimate the breakthrough curve of the observations and derive a characteristic vector of the estimated curve. Using this characteristic vector as an input, a random forest model is constructed for the sensor that raises the first alarm. The model outputs values between 0 and 1 representing the possibility that each candidate location is the true spill source; these possibility values allow users to identify strong candidate locations for the spill. The accuracy of our framework is tested on part of the Altamaha River system in Georgia, USA.
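The first two steps of the framework can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the Shewhart-style alarm rule, the moment-matching fit of the breakthrough curve, the feature choices, and all numeric values are assumptions (the paper's step (iii), the random forest over candidate locations, is omitted here for brevity).

```python
import numpy as np

def detect_spill(readings, mu0, sigma0, k=4.0):
    """Step (i): control-chart detection -- raise an alarm at the first
    reading exceeding mu0 + k*sigma0; k is chosen to bound the
    false-alarm rate (a simple Shewhart-style rule, assumed here)."""
    for t, x in enumerate(readings):
        if x > mu0 + k * sigma0:
            return t
    return None

def characteristic_vector(times, conc):
    """Step (ii): summarize the estimated breakthrough curve by moment
    matching and return (arrival time, peak height, spread) as an
    assumed characteristic vector."""
    w = np.clip(conc, 0.0, None)           # ignore negative noise
    total = w.sum()
    t_peak = (times * w).sum() / total      # concentration centroid
    spread = np.sqrt(((times - t_peak) ** 2 * w).sum() / total)
    return np.array([t_peak, w.max(), spread])

# Simulated sensor series: background noise plus a Gaussian-shaped
# breakthrough curve peaking at t = 60 (synthetic data, for illustration).
rng = np.random.default_rng(0)
times = np.arange(100.0)
signal = 5.0 * np.exp(-((times - 60.0) ** 2) / (2 * 8.0 ** 2))
readings = signal + rng.normal(0.0, 0.2, size=100)

alarm_t = detect_spill(readings, mu0=0.0, sigma0=0.2)
features = characteristic_vector(times, readings)
print(alarm_t, features.round(2))
```

In the full framework, the characteristic vector would then be fed to a random forest trained on simulated spills at each candidate location, whose per-class scores play the role of the 0-to-1 possibility values.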

1982 ◽  
Vol 26 ◽  
pp. 11-24 ◽  
Author(s):  
Allan Brown

Different procedures used in precision measurements of lattice parameters are, strictly, only valid if they can be shown to give results that are mutually reproducible. For this purpose reproducibility is defined in terms of the lattice parameters a° and standard deviations σ(a°) obtained for X-ray specimens of one or more reference materials. The requirement is that all systematic errors should be minimized to a level below that of the random measurement errors. Where these have a Gaussian distribution, the significance of the difference, Δa°, between two a° measurements can then be tested by evaluating the ratio K = |Δa°|/σ(Δa°). Thus, if K < 2 the difference, Δa°, cannot be distinguished from the effects of random measurement errors. This condition should be met for specimens of the same sample if reproducibility is good. For K ≥ 3 the value of Δa° is then taken to reflect real differences in the crystalline lattice of two X-ray specimens of a given compound. A basis is thus created for the study of solid solubility and for the precise characterization of crystalline compounds.
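A small worked example of the significance test sketched above. The combined-standard-deviation form of the statistic and all numeric values are assumptions chosen for illustration, not data from the paper.

```python
import math

def k_statistic(a1, s1, a2, s2):
    """Assumed form of the test statistic: the lattice-parameter
    difference divided by the combined standard deviation of the
    two measurements."""
    return abs(a1 - a2) / math.sqrt(s1 ** 2 + s2 ** 2)

# Two measurements of the same reference specimen (hypothetical values,
# in angstroms): the difference should be within random error (K < 2).
k_same = k_statistic(4.04945, 0.00004, 4.04952, 0.00004)

# Two specimens with a real lattice difference, e.g. from solid
# solubility: the difference should be significant (K >= 3).
k_diff = k_statistic(4.04945, 0.00004, 4.04975, 0.00004)

print(round(k_same, 2), round(k_diff, 2))
```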


Author(s):  
Gregory B. Baecher ◽  
Mark B. Jaksa ◽  
Peter I. Brooker ◽  
William S. Kaggwa

Author(s):  
Lynne H. Irwin ◽  
Cheryl A. Richter

In 1988 the Strategic Highway Research Program purchased four falling weight deflectometers (FWDs). During the acceptance testing it became evident that an improved procedure for calibration was needed to determine whether the specifications for the precision and the accuracy of the sensors were achieved. The authors were responsible for developing the procedure. This paper reports on the steps taken during the development of the calibration protocol. The reasons underlying the equipment and the procedures chosen are discussed. The sources of error in FWD measurements are identified, and ways that have been used to reduce those errors are reported. The goal was to reduce the systematic (bias) error to less than 0.3% through calibration. This level of error ensures that the random measurement error is larger than the systematic error for all pavement deflections less than 600 μm [24 mils (1 mil = 0.001 in.)]. Experience has shown that most highway pavements deflect less than 600 μm. The effects of FWD measurement errors on backcalculated pavement moduli are briefly reviewed. Verification of the protocol by several means showed that the calibration goal was achieved. Subsequent experience with the calibration protocol has shown that it has been effective and that it ensures high quality in the FWD data.
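The arithmetic behind the 600 μm threshold can be checked directly. The 2 μm figure for random measurement error below is an assumed sensor repeatability used only for illustration; the abstract itself states only the 0.3% bias goal.

```python
# Back-of-envelope check of the calibration goal: with systematic (bias)
# error held to 0.3%, the bias grows with deflection and only reaches
# ~1.8 um at a 600 um deflection.
bias_fraction = 0.003        # 0.3% systematic-error goal from calibration
random_error_um = 2.0        # assumed random measurement error, in um

def bias_um(deflection_um):
    """Systematic error, in micrometers, at a given deflection."""
    return bias_fraction * deflection_um

# For deflections below 600 um, the bias stays under the random error,
# so random error dominates, as the protocol intends.
print(bias_um(600.0))
```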

