Methodologies for compiling national inventories of contaminated sites and conducting preliminary site screening

2013 ◽  
Vol 16 (1) ◽  
pp. 24-35

Procedures for compiling a national inventory of contaminated sites must take into account the technical state of the art in subsurface contamination and restoration, the national and supranational regulatory environment, and the national administrative infrastructure. Within this framework, this paper proposes a methodology for building a national inventory of potentially contaminated sites, based on activities of environmental relevance to the subsurface, i.e. soil and groundwater. As a next step, a screening system was developed, capable of estimating the pollution potential of each site for varying amounts of available site-specific data. Depending on the nature of the site data (actual or estimated) and the screening outcome, a site can be (i) delisted, (ii) assigned to an inactive list of potentially contaminated sites, (iii) recommended for further desktop study and a site visit, or (iv) recommended for both further study and in situ sampling. The advantage of the proposed approach is the identification of potentially contaminated sites on the basis of financial records linking activities with enterprises, which are more readily accessible than environmental records. The feasibility of transitioning from activities to sites has been demonstrated elsewhere. The present paper describes how data gaps are addressed by the site screening methodology, with the aid of an application to a randomly selected real site in Greece.
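The four screening outcomes can be expressed as a simple decision rule. Below is a minimal Python sketch, assuming an illustrative pollution-potential score in [0, 1] and hypothetical thresholds; the paper's actual screening criteria are not reproduced here:

```python
from enum import Enum

class Outcome(Enum):
    DELIST = "delisted"
    INACTIVE = "inactive list of potentially contaminated sites"
    DESKTOP = "further desktop study and site visit"
    SAMPLING = "further study and in situ sampling"

def screen_site(score: float, data_is_actual: bool,
                low: float = 0.2, high: float = 0.6) -> Outcome:
    """Assign one of the four screening outcomes.

    score          -- estimated pollution potential in [0, 1] (illustrative)
    data_is_actual -- True if site-specific measurements exist,
                      False if the score rests on estimated data
    """
    if score < low:
        # Low potential: delist when backed by actual data, otherwise
        # park the site on the inactive list.
        return Outcome.DELIST if data_is_actual else Outcome.INACTIVE
    if score < high or data_is_actual:
        # Intermediate potential, or high potential already supported
        # by measurements: a desktop study and site visit suffice.
        return Outcome.DESKTOP
    # High estimated potential with no actual data: sample on site.
    return Outcome.SAMPLING
```

The thresholds and the rule structure are assumptions for illustration; in the paper the outcome depends on the screening result and on whether the underlying data are actual or estimated.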

Author(s):  
Giovanni Pietro Beretta

The remediation of contaminated sites has been addressed in Italy and elsewhere in the world through a series of works made possible by the availability of specific technologies for recovering soil and groundwater quality, acting in accordance with the principle of sustainability. A framework of rules (target values and types of intervention) and a summary of the quality of soil and groundwater at Italian contaminated sites must be mentioned first. The design of remediation measures was also enabled by improvements in site characterization, with specific equipment designed, for example, to identify the stratigraphy of the contaminants, detect the presence of volatile compounds, and sample water representative of groundwater quality. The text describes some interventions involving physical and hydraulic barriers, which entail substantial capital and O&M costs as well as the consumption of natural resources. Important in situ interventions are then considered, which resulted in a reduction in concentration and a significant recovery of the pollutant mass. The evolution of the residual concentration in the groundwater must be assessed by monitoring natural attenuation. Despite recovery of the pollutant mass of up to 90-99%, the cleanup values established by national legislation (expected concentrations on the order of μg/L) have not been achieved. It can be stated that the scientific community is considering the new paradigm expressed by the "order of magnitude of the flow of pollutant mass" to replace the old paradigm consisting of the "limit value of final concentrations".


Water ◽  
2021 ◽  
Vol 13 (5) ◽  
pp. 738
Author(s):  
Nicola Rossi ◽  
Mario Bačić ◽  
Meho Saša Kovačević ◽  
Lovorka Librić

The design code Eurocode 7 relies on semi-probabilistic calculation procedures, utilizing soil parameters obtained from in situ and laboratory tests or by means of transformation models. To reach a prescribed safety margin, the inherent variability of soil parameters is accounted for by applying partial factors either to the soil parameters directly or to the resistance. However, considering the several sources of geotechnical uncertainty, including inherent soil variability, measurement error, and transformation uncertainty, full probabilistic analyses should be implemented to directly account for site-specific variability. This paper presents a procedure for developing fragility curves for levee slope stability and piping, failure mechanisms that lead to larger breaches, in which the direct influence of flood event intensity on the probability of failure is calculated. A range of fragility curve sets is presented, considering the variability of levee material properties and varying durations of the flood event, thus providing crucial insight into the vulnerability of a levee exposed to rising water levels. The procedure is applied to the River Drava levee, a site that has shown a continuous trend of increasing water levels in recent years.
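The general recipe behind a fragility curve can be illustrated with a short Monte Carlo sketch: sample the uncertain material property, evaluate a stability criterion at each water level, and record the failure fraction. The toy stability model and the lognormal friction-angle distribution below are assumptions for illustration, not the levee model used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def factor_of_safety(phi_deg, water_level):
    # Toy stability model: resistance from the friction angle,
    # hydraulic demand growing with the flood level (illustrative only).
    return np.tan(np.radians(phi_deg)) / (0.3 + 0.1 * water_level)

def fragility_curve(water_levels, n_samples=10_000,
                    phi_mean=30.0, phi_cov=0.10):
    """P(failure) vs. water level, treating the friction angle as
    lognormally distributed to represent inherent soil variability."""
    sigma = np.sqrt(np.log(1.0 + phi_cov**2))
    mu = np.log(phi_mean) - 0.5 * sigma**2
    phi = rng.lognormal(mu, sigma, n_samples)
    # Failure whenever the factor of safety drops below 1.
    return [float(np.mean(factor_of_safety(phi, h) < 1.0))
            for h in water_levels]

levels = [0.0, 1.0, 2.0, 3.0, 4.0]
pf = fragility_curve(levels)
```

Repeating the computation for different flood durations (which would alter the demand term) yields the sets of fragility curves described in the abstract.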


2021 ◽  
Vol 13 (10) ◽  
pp. 1985
Author(s):  
Emre Özdemir ◽  
Fabio Remondino ◽  
Alessandro Golkar

With recent advances in technology, deep learning is being applied to more and more tasks. In particular, point cloud processing and classification have been studied for some time, with various methods developed. Some of the available classification approaches are based on a specific data source, like LiDAR, while others focus on specific scenarios, like indoor scenes. A general major issue is computational efficiency (in terms of power consumption, memory requirements, and training/inference time). In this study, we propose an efficient framework (named TONIC) that can work with any kind of aerial data source (LiDAR or photogrammetry) and does not require high computational power, while achieving accuracy on par with current state-of-the-art methods. We also test our framework for its generalization ability, showing its capability to learn from one dataset and predict on unseen aerial scenarios.


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1091
Author(s):  
Izaak Van Crombrugge ◽  
Rudi Penne ◽  
Steve Vanlanduit

Knowledge of precise camera poses is vital for multi-camera setups. Camera intrinsics can be obtained for each camera separately in lab conditions. For fixed multi-camera setups, the extrinsic calibration can only be done in situ. Usually, some markers are used, like checkerboards, requiring some level of overlap between cameras. In this work, we propose a method for cases with little or no overlap. Laser lines are projected on a plane (e.g., floor or wall) using a laser line projector. The pose of the plane and cameras is then optimized using bundle adjustment to match the lines seen by the cameras. To find the extrinsic calibration, only a partial overlap between the laser lines and the field of view of the cameras is needed. Real-world experiments were conducted both with and without overlapping fields of view, resulting in rotation errors below 0.5°. We show that the accuracy is comparable to other state-of-the-art methods while offering a more practical procedure. The method can also be used in large-scale applications and can be fully automated.
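The core idea, optimizing a camera pose so that its observations of the projected laser lines land on those lines, can be sketched in a 2D analogue. The paper's method is a full 3D bundle adjustment over plane and camera poses; the line parameterization and synthetic data below are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import least_squares

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Two non-parallel laser lines on the floor plane, each given by a
# unit normal n and offset d: points x on the line satisfy n.x = d.
lines = [(np.array([0.0, 1.0]), 0.0),   # the line y = 0
         (np.array([1.0, 0.0]), 2.0)]   # the line x = 2

# Ground-truth camera pose (rotation angle and translation) to recover.
theta_true, t_true = 0.3, np.array([0.5, -1.0])

# Simulated observations: points sampled along each line, expressed
# in the camera frame via p_cam = R p_world + t.
obs = []
for n, d in lines:
    tangent = np.array([-n[1], n[0]])
    pts = d * n + np.outer(np.linspace(-1, 1, 20), tangent)
    obs.append((rot(theta_true) @ pts.T).T + t_true)

def residuals(params):
    theta, tx, ty = params
    r = []
    for (n, d), pts_cam in zip(lines, obs):
        # Map observations back to the world frame with the candidate
        # pose, then measure the signed distance to the known line.
        pts_world = (rot(theta).T @ (pts_cam - [tx, ty]).T).T
        r.extend(pts_world @ n - d)
    return r

sol = least_squares(residuals, x0=[0.0, 0.0, 0.0])
```

A single line would leave the pose under-determined (translation along the line is unobservable), which is one reason several non-parallel laser lines are projected in practice.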


2021 ◽  
Vol 13 (14) ◽  
pp. 2848
Author(s):  
Hao Sun ◽  
Qian Xu

Obtaining large-scale, long-term, spatially continuous soil moisture (SM) data is crucial for climate change studies, hydrology, and water resource management. ESA CCI SM is such a large-scale, long-term SM dataset (now spanning more than 40 years). However, it contains data gaps, especially over China, due to limitations in the remote sensing of SM such as complex topography, human-induced radio frequency interference (RFI), and vegetation disturbance. These gaps prevent the CCI SM data from achieving spatial continuity, which motivates the study of gap-filling methods. To develop suitable methods for filling the gaps of CCI SM over the whole of China, this study compared typical Machine Learning (ML) methods, namely the Random Forest (RF), Feedforward Neural Network (FNN), and Generalized Linear Model (GLM) methods, with a geostatistical method, Ordinary Kriging (OK). More than 30 years of passive–active combined CCI SM from 1982 to 2018 and other biophysical variables, such as the Normalized Difference Vegetation Index (NDVI), precipitation, air temperature, a Digital Elevation Model (DEM), soil type, and in situ SM from the International Soil Moisture Network (ISMN), were utilized. Results indicated that: 1) data gaps in CCI SM are frequent over China, found not only in cold seasons and regions but also in warm ones; the ratio of gap pixels to all pixels can exceed 80%, with an average of around 40%. 2) ML methods can fill all the gaps in CCI SM; among them, RF performed best in fitting the relationship between CCI SM and the biophysical variables. 3) Over simulated gap areas, RF performed comparably to OK, and both greatly outperformed the FNN and GLM methods. 4) Over in situ SM networks, RF achieved better performance than OK. 5) Various gap-filling strategies were also explored; the strategy that performed best builds a monthly model with one RF simulating the monthly average SM and another RF simulating the monthly SM disturbance. This strategy, combined with an ML method such as RF, is suggested for filling the gaps of CCI SM in China.
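The two-model strategy, one random forest for the monthly average SM and a second for the disturbance around it, can be sketched with scikit-learn. The predictors and targets below are synthetic stand-ins for the NDVI, precipitation, temperature, and DEM features and the CCI SM target; in the actual study the monthly mean would come from the observed climatology:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for the biophysical predictors and the SM target.
n = 2000
X = rng.uniform(size=(n, 4))
sm_monthly_mean = 0.1 + 0.3 * X[:, 0] + 0.1 * X[:, 1]
sm_disturbance = 0.05 * (X[:, 2] - 0.5) + rng.normal(0.0, 0.01, n)
sm = sm_monthly_mean + sm_disturbance

# One RF models the monthly average SM; a second RF models the
# monthly disturbance (anomaly) left over after the mean component.
rf_mean = RandomForestRegressor(n_estimators=100, random_state=0)
rf_anom = RandomForestRegressor(n_estimators=100, random_state=0)
rf_mean.fit(X, sm_monthly_mean)
rf_anom.fit(X, sm - rf_mean.predict(X))

def fill_gap(X_gap):
    """Predicted SM at gap pixels = mean component + disturbance."""
    return rf_mean.predict(X_gap) + rf_anom.predict(X_gap)

pred = fill_gap(X[:100])
```

Splitting the signal this way lets the mean model capture the slowly varying seasonal component while the second model absorbs the shorter-term variability.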


2020 ◽  
Vol 40 (04) ◽  
pp. 524-535
Author(s):  
Dmitry Y. Nechipurenko ◽  
Aleksey M. Shibeko ◽  
Anastasia N. Sveshnikova ◽  
Mikhail A. Panteleev

Computational physiology, i.e., the reproduction of physiological (and, by extension, pathophysiological) processes in silico, could be considered one of the major goals of computational biology. One might use computers to simulate molecular interactions, enzyme kinetics, gene expression, or whole networks of biochemical reactions, but it is the (patho)physiological meaning that is usually the real goal of the research, even when a single enzyme is its subject. Although the exponential rise in the use of computational and mathematical models in the field of hemostasis and thrombosis began in the 1980s (first for blood coagulation, then for platelet adhesion, and finally for platelet signal transduction), the majority of their successful applications still focus on simulating elements of the hemostatic system rather than the total (patho)physiological response in situ. Here we discuss the state of the art, the progress toward efficient “virtual thrombus formation,” and what one can already get from the existing models.

