Estimation of uncertainty in long term combined sewer sediment behaviour predictions, a UK case study

2008 ◽  
Vol 57 (9) ◽  
pp. 1405-1411 ◽  
Author(s):  
A. N. A. Schellart ◽  
F. A. Buijs ◽  
S. J. Tait ◽  
R. M. Ashley

There are regulatory-driven requirements for UK water companies to reduce the number of properties at risk of sewer flooding. One potential cause of sewer flooding is the presence of persistent sediment deposits, a common problem in many combined sewers. Although the regulation is risk-based, there is a gap in current knowledge on how risk assessment is affected by uncertainty in the prediction of sewer solids behaviour. This paper describes a UK case study exploring the possibility of estimating uncertainty in sewer sediment deposit level predictions, using Monte Carlo simulations combined with a response database.
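The Monte Carlo/response-database idea can be sketched as follows. The grid, parameter distributions, and toy depth relation below are illustrative assumptions for demonstration only, not the study's actual hydraulic model or data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical response database: precomputed sediment depth (mm) over a
# grid of (bed shear factor, sediment supply factor) combinations.
shear_grid = np.linspace(0.5, 1.5, 11)
supply_grid = np.linspace(0.5, 1.5, 11)
response = 20.0 * supply_grid[None, :] / shear_grid[:, None]  # toy relation

def lookup(shear, supply):
    """Nearest-neighbour lookup in the precomputed response database."""
    i = np.abs(shear_grid - shear).argmin()
    j = np.abs(supply_grid - supply).argmin()
    return response[i, j]

# Monte Carlo: propagate input uncertainty through the database instead of
# re-running an expensive simulator for every sample.
n = 10_000
shear_samples = rng.normal(1.0, 0.1, n).clip(0.5, 1.5)
supply_samples = rng.normal(1.0, 0.15, n).clip(0.5, 1.5)
depths = np.array([lookup(s, q) for s, q in zip(shear_samples, supply_samples)])

lo, hi = np.percentile(depths, [5, 95])
print(f"median depth {np.median(depths):.1f} mm, 90% interval [{lo:.1f}, {hi:.1f}] mm")
```

The point of the response database is that each Monte Carlo sample costs only a table lookup, so thousands of samples remain tractable even when the underlying sewer model is expensive.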

2021 ◽  
Author(s):  
Francesca Pianosi ◽  
Andres Penuela-Fernandez ◽  
Christopher Hutton

Proper consideration of uncertainty has become a cornerstone of model-informed planning of water resource systems. In the UK Government’s 2020 Water Resources Planning Guidelines, the word “uncertainty” appears 48 times in 82 pages. This emphasis on uncertainty aligns with the increasing adoption by UK water companies of a “risk-based” approach to their long-term decision-making, in order to handle uncertainties in supply-demand estimation, climate change, population growth, etc. The term “risk-based” covers a range of methods, such as “info-gap”, “robust decision-making” or “system sensitivity analysis”, that come under different names but largely share a common rationale, essentially based on the use of Monte Carlo simulation. This shift in thinking from the previous, deterministic “worst-case” approach to a “risk-based” one is important and has the potential to significantly improve water resources planning practice. However, its implementation is hampered by a certain lack of clarity about the terminology in use and about the concrete differences (and similarities) among methods. On top of these difficulties, in the next planning cycle (2021-2026) two further step changes are being introduced: (1) water companies are requested to move from a cost-efficiency approach focused on achieving the supply-demand balance towards a fully multi-criteria approach that more explicitly encompasses other objectives, including environmental sustainability; (2) as a further way to handle long-term uncertainties, they are required to embrace an “adaptive planning” approach. These changes will introduce two new sets of uncertainties: around the robust quantification of criteria, particularly environmental ones, and around the attribution of weights to different criteria.
This urgently calls for structured approaches to quantify not only the uncertainty in model outputs, but also the sensitivity of those outputs to the different forms of uncertainty in the modelling chain that most strongly control the variability of the final outcome, the “best value” plan. Without this understanding of critical uncertainties, the risk is that huge efforts are invested in characterising and/or reducing uncertainties that later turn out to have little impact on the final outcome, or that water managers fall back on oversimplified representations of those uncertainties as a way to escape the huge modelling burden. In this work, we aim to start establishing a common rationale for “risk-based” methods within the context of a fully multi-criteria approach. We use a proof-of-concept example of a reservoir system in the South-West of England to demonstrate the use of global (i.e. Monte Carlo based) sensitivity analysis to simultaneously quantify output uncertainty and sensitivity, and to identify robust decisions. We also discuss the potential of this approach to inform the construction of a “decision tree” for adaptive planning.
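A minimal sketch of Monte Carlo based global sensitivity analysis, using a toy supply-demand surplus model in place of a real reservoir simulator; the input ranges, the output expression, and the use of squared correlations as a first-order importance proxy are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical uncertain inputs for a toy reservoir planning model.
demand_growth = rng.uniform(0.0, 0.02, n)   # annual demand growth rate
inflow_change = rng.uniform(-0.2, 0.1, n)   # relative change in inflows
env_flow = rng.uniform(0.05, 0.15, n)       # environmental flow requirement

# Toy output: supply-demand surplus after 25 years (arbitrary units).
surplus = (1 + inflow_change) - (1 + demand_growth) ** 25 * 0.8 - env_flow

# Uncertainty: summarise the output distribution.
p5, p50, p95 = np.percentile(surplus, [5, 50, 95])
print(f"surplus: median {p50:.3f}, 90% interval [{p5:.3f}, {p95:.3f}]")

# Sensitivity: squared Pearson correlations as a cheap first-order proxy
# for how much each input drives output variability.
inputs = {"demand_growth": demand_growth,
          "inflow_change": inflow_change,
          "env_flow": env_flow}
sens = {k: np.corrcoef(v, surplus)[0, 1] ** 2 for k, v in inputs.items()}
for k, s in sorted(sens.items(), key=lambda kv: -kv[1]):
    print(f"{k:>14}: {s:.2f}")
```

The same sample set yields both the output uncertainty bands and the sensitivity ranking, which is the "simultaneously quantify" property the abstract refers to; in practice variance-based indices (e.g. Sobol') would replace the correlation proxy.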


2019 ◽  
Vol 174 (1) ◽  
pp. 38-50 ◽  
Author(s):  
Patricia Ruiz ◽  
Claude Emond ◽  
Eva D McLanahan ◽  
Shivanjali Joshi-Barr ◽  
Moiz Mumtaz

Abstract Mixtures risk assessment needs an efficient integration of in vivo, in vitro, and in silico data with epidemiology and human studies data. This involves several approaches, some in current use and others under development. This work extends the Agency for Toxic Substances and Disease Registry physiologically based pharmacokinetic (PBPK) toolkit, available to risk assessors, to include a mixture PBPK model of benzene, toluene, ethylbenzene, and xylenes. The recoded model was evaluated and applied to exposure scenarios to evaluate the validity of dose additivity for mixtures. In the second part of this work, we studied toluene, ethylbenzene, and xylene (TEX)-gene-disease associations using the Comparative Toxicogenomics Database, pathway analysis, and published microarray data on human gene expression changes in blood samples after short- and long-term exposures. Collectively, this information was used to establish hypotheses on potential linkages between TEX exposures and human health. The results show that 236 expressed genes were common between the short- and long-term exposures. These genes could be central to the interconnecting biological pathways potentially stimulated by TEX exposure, likely related to respiratory and neurological diseases. Using publicly available data, we propose a conceptual framework to study pathway perturbations leading to toxicity of chemical mixtures. This proposed methodology lends mechanistic insight into the toxicity of mixtures and, when experimentally validated, will allow data gaps to be filled for mixtures’ toxicity assessment. This work proposes an approach that uses current knowledge and available multiple-stream data, applying computational methods to advance mixtures risk assessment.
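The dose-addition assumption the abstract tests is commonly summarised by a hazard index: the sum of each component's exposure divided by its reference level. The sketch below uses placeholder numbers, not regulatory values or the study's PBPK-derived doses:

```python
# Hazard-index check under the dose-addition assumption for a BTEX-like
# mixture. All concentrations are illustrative placeholders (mg/m3).
exposure = {"benzene": 0.003, "toluene": 0.8, "ethylbenzene": 0.2, "xylenes": 0.05}
reference = {"benzene": 0.03, "toluene": 5.0, "ethylbenzene": 1.0, "xylenes": 0.1}

# Each hazard quotient is exposure / reference; dose addition sums them.
hazard_quotients = {c: exposure[c] / reference[c] for c in exposure}
hazard_index = sum(hazard_quotients.values())
print(f"HI = {hazard_index:.2f}")  # an HI above 1 would flag potential concern
```

A PBPK model refines this picture by replacing external concentrations with internal (tissue or blood) doses, which is where departures from simple additivity can show up.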


2020 ◽  
Vol 6 (3) ◽  
pp. 196-209
Author(s):  
Christian N. Madu ◽  
Benjamin C. Ozumba ◽  
Chuhua Kuei ◽  
Ifeanyi E. Madu ◽  
Valentine E. Nnadi ◽  
...  

Objective: This paper uses the Analytic Hierarchy Process (AHP) to rank the main actions and their associated task areas outlined in the Hyogo Framework for Action (HFA) in the case of Nigeria. The focus is on three major challenges, namely (1) stakeholder inclusiveness, (2) capacity building and communication, and (3) local adaptation. Methods: The perceptions of a sample of 26 field disaster management experts on the HFA were collected and analyzed using AHP; Monte Carlo simulation was further applied to examine the robustness of the AHP assessments. Results: "Disaster Preparedness" is the most important expected goal, followed by "Risk Assessment and Early Warning," with priority indices of 0.258 and 0.219, respectively. "Local/City Governance," however, shows poor performance with a priority index of 0.085. Conclusion: The results are indicative of the perceptions of the performance levels attained and the areas that need improvement.
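The AHP-plus-Monte-Carlo pattern can be sketched as follows; the 3x3 comparison matrix, the row geometric-mean prioritisation, and the lognormal perturbation are illustrative choices, not the study's actual expert judgements or procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pairwise comparison matrix (Saaty scale) for three criteria.
A = np.array([[1.0,  2.0,  4.0],
              [0.5,  1.0,  3.0],
              [0.25, 1/3,  1.0]])

def ahp_priorities(M):
    """Priority vector via the row geometric-mean method."""
    g = np.prod(M, axis=1) ** (1.0 / M.shape[0])
    return g / g.sum()

w = ahp_priorities(A)

# Monte Carlo robustness: perturb the upper-triangle judgements and count
# how often the top-ranked criterion keeps its rank.
n, stable = 2000, 0
for _ in range(n):
    B = A.copy()
    for i in range(3):
        for j in range(i + 1, 3):
            B[i, j] = A[i, j] * rng.lognormal(0.0, 0.15)
            B[j, i] = 1.0 / B[i, j]  # preserve reciprocity
    if ahp_priorities(B).argmax() == w.argmax():
        stable += 1
print(f"priorities {np.round(w, 3)}, top rank stable in {stable/n:.0%} of runs")
```

A ranking that survives this kind of perturbation of the expert judgements is what "robustness of the AHP assessments" means in practice.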


Author(s):  
Gilbert Lim ◽  
Zhan Wei Lim ◽  
Dejiang Xu ◽  
Daniel S.W. Ting ◽  
Tien Yin Wong ◽  
...  

Ischemic stroke is a leading cause of death and long-term disability that is difficult to predict reliably. Retinal fundus photography has been proposed for stroke risk assessment, due to its non-invasiveness and the similarity between retinal and cerebral microcirculations, with past studies claiming a correlation between venular caliber and stroke risk. However, it may be that other retinal features are more appropriate. In this paper, extensive experiments with deep learning on six retinal datasets are described. Feature isolation involving segmented vascular tree images is applied to establish the effectiveness of vessel caliber and shape alone for stroke classification, and dataset ablation is applied to investigate model generalizability on unseen sources. The results suggest that vessel caliber and shape could be indicative of ischemic stroke, and source-specific features could influence model performance.
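The dataset-ablation protocol (train on all sources but one, test on the held-out source) can be illustrated without any deep learning machinery. The synthetic features, the per-source feature shift, and the nearest-centroid classifier below are stand-in assumptions, not the paper's models or retinal data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Three synthetic "sources", each with a small feature shift that mimics
# source-specific bias (camera, population, labelling protocol).
sources = ["A", "B", "C"]
data = {}
for k, s in enumerate(sources):
    shift = 0.3 * k
    x0 = rng.normal(0.0 + shift, 1.0, (200, 5))   # class 0 features
    x1 = rng.normal(1.0 + shift, 1.0, (200, 5))   # class 1 features
    data[s] = (np.vstack([x0, x1]), np.array([0] * 200 + [1] * 200))

def nearest_centroid(train_x, train_y, test_x):
    """Toy classifier: assign each point to the nearer class centroid."""
    c0 = train_x[train_y == 0].mean(axis=0)
    c1 = train_x[train_y == 1].mean(axis=0)
    d0 = ((test_x - c0) ** 2).sum(axis=1)
    d1 = ((test_x - c1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

# Dataset ablation: hold out each source in turn, train on the rest.
accs = {}
for held_out in sources:
    train = [s for s in sources if s != held_out]
    tx = np.vstack([data[s][0] for s in train])
    ty = np.concatenate([data[s][1] for s in train])
    hx, hy = data[held_out]
    accs[held_out] = (nearest_centroid(tx, ty, hx) == hy).mean()
print({s: round(a, 2) for s, a in accs.items()})
```

The gap between in-source and held-out-source accuracy is exactly the kind of evidence for source-specific features that the abstract describes.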


2021 ◽  
Author(s):  
M. Somasundaram ◽  
K.A. Mohamed Junaid ◽  
D. Sudha ◽  
Sabari L. Umamaheswari

Business leaders around the world are using emerging technologies to capitalize on data, to create business value, and to compete effectively in a digitally driven world. Among these uses, risk assessment, and risk management based on that assessment, is a process that can be carried out using available historical data and Data Analytics. Although it is being implemented in different business domains, it is at a nascent stage, and it is newer still in the area of Education. This paper describes such a process followed in an engineering college and the use of data for risk management. Based on the processes followed, student outcomes are seen to be improving in academic performance, placement, higher education, and entrepreneurship. This also provides a good process and framework for taking strategic initiatives that will give long-term benefits in areas such as research and outreach activities.


Author(s):  
Michel Grand Blanc ◽  
Andrea Carpignano ◽  
Sandra Dulla ◽  
Stefano Marolo

A new approach to assess the risk associated with jet-fire and pool-fire accidents on an offshore oil facility using cellular automata is presented. This model simulates many accident scenarios, and the related evacuation processes, adopting a Monte Carlo approach to evaluate an average consequence, and thus a more realistic value of the risk associated with these accidents. The results of this new method are discussed and compared with those obtained from a traditional approach on a real case study. The comparison shows that the new approach yields lower risk estimates while remaining conservative, and it can also supply further information useful in the design phase.
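The contrast between an averaged Monte Carlo consequence and a single worst-case consequence can be sketched as below; the fire-type mix, severity distributions, and accident frequency are illustrative assumptions, not the study's facility data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy Monte Carlo consequence model: each simulated accident draws a fire
# type and a severity; risk = frequency x consequence.
n = 20_000
freq_per_year = 1e-3  # assumed accident frequency

is_jetfire = rng.random(n) < 0.4  # assumed 40% jet fires, 60% pool fires
severity = np.where(is_jetfire,
                    rng.lognormal(mean=1.0, sigma=0.5, size=n),
                    rng.lognormal(mean=0.5, sigma=0.7, size=n))

# Averaging over scenarios vs. taking a near-worst-case percentile.
mean_consequence = severity.mean()
worst_case = np.percentile(severity, 99)
risk_mc = freq_per_year * mean_consequence
risk_worst = freq_per_year * worst_case
print(f"MC risk {risk_mc:.2e} vs worst-case risk {risk_worst:.2e}")
```

Averaging over many simulated scenarios is what drives the lower (yet still conservative, since low-probability severe outcomes remain in the sample) risk values the comparison reports.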

