Capturing variability in pavement performance models from sufficient time-series predictors: a case study of the New Brunswick road network

2011 ◽  
Vol 38 (2) ◽  
pp. 210-220 ◽  
Author(s):  
Luis Esteban Amador-Jiménez ◽  
Donath Mrawira

This paper proposes the use of multi-level Bayesian modeling for calibrating mechanistic model parameters from historical data while capturing reliability by estimating a desired confidence interval for the predictions. The model is capable of estimating the parameters from observed data and expert criteria even in cases of missing data points. This approach allows rapid generation of several deterioration models without the need to partition the data into pavement families. It estimates posterior distributions for model coefficients and predicts values of the response for unobserved levels of the causal factors. A case study from the New Brunswick Department of Transportation is used to calibrate a simplified mechanistic pavement roughness progression model based on 6-year international roughness index (IRI) observations. The model incorporates the effects of pavement structural capacity in terms of the deflection basin parameter (AREA) in place of the modified structural number, traffic loading (ESAL), and environmental factors. The results of the model showed that, as expected, chipseal roads have higher as-built roughness and deteriorate faster than asphalt roads. Sensitivity analysis of the deterministic part of the model (the mean predictions) showed that in New Brunswick, where traffic is relatively low, the environment is the most important factor.
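The multi-level model itself is not given in the abstract, but the core idea, sampling a posterior for a deterioration rate and reading off a credible interval, can be sketched on a toy linear IRI progression (all observations and constants below are hypothetical, not New Brunswick data):

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical 6-year IRI record (m/km) for a single road section.
years = list(range(7))
iri_obs = [1.20, 1.28, 1.33, 1.41, 1.47, 1.55, 1.60]

def log_likelihood(b, iri0=1.20, sigma=0.05):
    """Gaussian log-likelihood of a linear progression IRI(t) = iri0 + b*t."""
    return sum(-0.5 * ((y - (iri0 + b * t)) / sigma) ** 2
               for t, y in zip(years, iri_obs))

# Random-walk Metropolis over the deterioration rate b (flat prior assumed).
b, ll, samples = 0.05, log_likelihood(0.05), []
for _ in range(20000):
    cand = b + random.gauss(0.0, 0.01)
    ll_cand = log_likelihood(cand)
    if math.log(random.random()) < ll_cand - ll:
        b, ll = cand, ll_cand
    samples.append(b)

post = sorted(samples[5000:])  # discard burn-in
lo, hi = post[int(0.025 * len(post))], post[int(0.975 * len(post))]
print(f"posterior mean rate {statistics.mean(post):.3f} m/km/yr, "
      f"95% credible interval ({lo:.3f}, {hi:.3f})")
```

With a hierarchical extension, section-level rates would share a common prior, which is what removes the need to partition the network into pavement families.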

2011 ◽  
Vol 65 (4) ◽  
Author(s):  
Gheorghe Maria ◽  
Ionela Luta

Building a detailed kinetic model for drug release from various supports is a difficult task, especially when chemical reactions take place, accompanied by adsorption-desorption and diffusion steps. Often, semi-empirical release models derived from theoretical formulations of the transport process and system characteristics are employed. Their parameters have limited validity, as they depend on the support, the drug-ligand properties, and the release conditions. Nevertheless, they are often used for quick simulation and design of controlled-release drug delivery systems by correlating the model parameters with the system characteristics and release conditions. Detailed information allows elaboration of an extended mechanistic model; this paper presents the bias introduced into the predictions at various levels of model simplification. A case study of chemically activated ligand release in human plasma from a multivalent dendrimeric support is approached, pointing out the imprecision introduced by gradual simplification of an extended model as well as the low reliability of predictions obtained with various semi-empirical global models.
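Semi-empirical global release models of the kind compared here can be as simple as the classic Korsmeyer-Peppas power law, M_t/M_inf = k*t^n, fitted by linear regression in log-log space. A minimal sketch on hypothetical release data (not from the paper):

```python
import math

# Hypothetical fractional-release data: fraction released vs time (h).
t = [1, 2, 4, 8, 12, 24]
frac = [0.10, 0.14, 0.20, 0.28, 0.34, 0.48]

# Korsmeyer-Peppas: M_t/M_inf = k * t**n  ->  log f = log k + n log t.
x = [math.log(ti) for ti in t]
y = [math.log(fi) for fi in frac]
m = len(t)
xbar, ybar = sum(x) / m, sum(y) / m
n_exp = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
k = math.exp(ybar - n_exp * xbar)
print(f"release exponent n = {n_exp:.2f}, rate constant k = {k:.3f}")
```

An exponent n near 0.5 recovers Higuchi-type (Fickian) release for thin films; the fitted k and n are exactly the support- and condition-dependent parameters whose limited validity the abstract warns about.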


1999 ◽  
Vol 39 (10-11) ◽  
pp. 193-196
Author(s):  
J. Petersen ◽  
J. G. Petrie

The release of heavy metal species from deposits of solid waste materials originating from minerals processing operations poses a serious environmental risk should such species migrate beyond the boundaries of the deposit into the surrounding environment. Legislation increasingly places the liability for wastes with the operators of the process that generates them. The costs of long-term monitoring and clean-up following a potential critical leakage have to be factored into the overall project plan from the outset. Thus, assessment of the potential for a particular waste material to generate a harmful leachate is directly relevant to estimating the environmental risk associated with the planned disposal operation. A rigorous mechanistic model is proposed, which allows prediction of the time-dependent generation of a leachate from a solid mineral waste deposit. Model parameters are obtained from a suitably designed laboratory waste assessment methodology on a relatively small sample of the prospective waste material. The parameters are not specific to the laboratory environment in which they were obtained but are also valid for full-scale heap modelling. In this way the model, combined with the assessment methodology, becomes a powerful tool for meaningful assessment of the risks associated with solid waste disposal strategies.
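The paper's mechanistic model is not reproduced in the abstract; a deliberately minimal first-order depletion sketch shows the kind of time-dependent leachate prediction such a model produces (all rate constants and inventories below are assumed, not measured values):

```python
# Assumed first-order release kinetics: a toy stand-in for the full model.
k = 0.01   # 1/day, release rate constant (assumed)
M = 500.0  # g leachable metal per tonne of waste (assumed initial inventory)
Q = 2.0    # L percolate per tonne per day (assumed percolation rate)
dt = 1.0   # days

conc = []  # leachate concentration time series (g/L)
for day in range(365):
    released = k * M * dt  # mass mobilised this step (g/tonne)
    M -= released
    conc.append(released / (Q * dt))

print(f"peak concentration {conc[0]:.2f} g/L, after 1 year {conc[-1]:.4f} g/L")
```

The point of the paper's methodology is that constants like k and the leachable inventory come from a designed laboratory assessment of a small sample, yet remain valid at heap scale.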


Water ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 463
Author(s):  
Gopinathan R. Abhijith ◽  
Leonid Kadinski ◽  
Avi Ostfeld

The formation of bacterial regrowth and disinfection by-products is ubiquitous in chlorinated water distribution systems (WDSs) operated with organic loads. A generic, easy-to-use mechanistic model describing the fundamental processes governing the interrelationship between chlorine, total organic carbon (TOC), and bacteria to analyze the spatiotemporal water quality variations in WDSs was developed using EPANET-MSX. The representation of multispecies reactions was simplified to minimize the interdependent model parameters, and physicochemical/biological processes that cannot be experimentally determined were neglected. The model was applied to analyze the effects of source water characteristics and water residence time on controlling bacterial regrowth and trihalomethane (THM) formation in two well-tested systems under chlorinated and non-chlorinated conditions. The results established that a 100% increase in the free chlorine concentration and a 50% reduction in the TOC at the source produced a 5.87-log reduction in bacteriological activity at the expense of a 60% increase in THM formation. The sensitivity study showed the impact of the operating conditions and the network characteristics in determining parameter sensitivities to model outputs. The maximum specific growth rate constant for bulk-phase bacteria was found to be the most sensitive parameter for the predicted bacterial regrowth.
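A toy EPANET-MSX-style reaction scheme (reaction forms and every coefficient below are assumed for illustration, not the calibrated model) shows the chlorine-TOC-bacteria coupling the abstract describes, in a single well-mixed water parcel:

```python
def simulate(cl0, days=2.0, dt=0.001):
    """Euler integration of an assumed chlorine-TOC-bacteria scheme."""
    kb = 0.1      # L/mg/day, chlorine-TOC decay coefficient (assumed)
    mu_max = 2.0  # 1/day, max specific growth rate, bulk bacteria (assumed)
    ks = 0.5      # mg/L, half-saturation TOC (assumed)
    kin = 3.0     # L/mg/day, chlorine inactivation coefficient (assumed)
    yld = 0.1     # biomass yield per unit TOC consumed (assumed)
    cl, toc, x = cl0, 2.0, 1e-3  # chlorine, TOC, biomass (all mg/L)
    for _ in range(int(days / dt)):
        growth = mu_max * toc / (ks + toc) * x  # Monod growth on TOC
        cl += -kb * cl * toc * dt               # chlorine consumed by TOC
        toc += -(growth / yld) * dt             # substrate consumption
        x += (growth - kin * cl * x) * dt       # growth vs inactivation
    return x

print(f"final biomass with 1 mg/L Cl2: {simulate(1.0):.2e} mg/L")
print(f"final biomass without Cl2:     {simulate(0.0):.2e} mg/L")
```

The sketch omits THM formation; the trade-off the abstract quantifies would add a THM yield term proportional to the chlorine consumed by TOC.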


Materials ◽  
2021 ◽  
Vol 14 (11) ◽  
pp. 2752
Author(s):  
Benedikt Finke ◽  
Clara Sangrós Giménez ◽ 
Arno Kwade ◽  
Carsten Schilde

In this paper, a largely mechanistic model was developed to depict the rheological behaviour of nanoparticulate suspensions with solids contents of up to 20 wt.%, based on the increase in shear stress caused by surface interaction forces among particles. The rheological behaviour is connected to drag forces arising from altered particle movement with respect to the surrounding fluid. To represent this relationship and model the viscosity, a hybrid modelling approach was followed in which mechanistic relationships were paired with heuristic expressions. A genetic algorithm was utilized during model development, enabling the algorithm to choose among several hard-to-assess model options. By combining the newly developed model with existing models for the various physical phenomena affecting viscosity, the viscosity can be modelled over a broad range of solids contents, shear rates, temperatures and particle sizes. Due to its mechanistic nature, the model even allows extrapolation beyond the limits of the data points used for calibration, enabling prediction of the viscosity in that region. Only two parameters are required for this purpose. Experimental data for an epoxy resin filled with boehmite nanoparticles were used for calibration and comparison with modelled values.
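The hybrid model itself cannot be reconstructed from the abstract, but the classical Krieger-Dougherty relation is a standard, purely mechanistic baseline for the viscosity-versus-solids-content trend (parameter values below are the common textbook defaults, not the paper's calibration):

```python
def krieger_dougherty(phi, eta0=1.0, phi_max=0.64, intrinsic=2.5):
    """Relative suspension viscosity (Krieger-Dougherty relation).
    phi is the particle *volume* fraction, phi_max the maximum packing
    fraction, and `intrinsic` the intrinsic viscosity (2.5 for spheres)."""
    return eta0 * (1.0 - phi / phi_max) ** (-intrinsic * phi_max)

for phi in (0.0, 0.05, 0.10, 0.20):
    print(f"phi = {phi:.2f} -> relative viscosity {krieger_dougherty(phi):.2f}")
```

Converting the paper's wt.% solids contents to the volume fractions this relation expects requires the particle and matrix densities.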


Processes ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 1313
Author(s):  
Antonia Arroyo ◽  
Fabián Provoste ◽  
Montserrat Rodríguez ◽  
Ana L. Prieto

Polycyclic aromatic hydrocarbons (PAHs) are a family of organic compounds of widespread presence in the environment. They are recalcitrant, ubiquitous, prone to bioaccumulation, and potentially carcinogenic. Effluent from wastewater treatment plants (WWTPs) constitutes a major source of PAHs into water bodies, and their presence should be closely monitored, especially considering the increasing applications of potable and non-potable reuse of treated wastewater worldwide. Modeling the fate and distribution of PAHs in WWTPs is a valuable tool to overcome the complexity and cost of monitoring and quantifying PAHs. A mechanistic model was built to evaluate the fate of PAHs in both the water and sludge lines of a Chilean WWTP. Naphthalene and benzo(a)pyrene were used as models of low-MW and high-MW PAHs, respectively. As there were no reported experimental data available for the case study, the influent load was determined through a statistical approach based on values reported worldwide. For both naphthalene and benzo(a)pyrene, the predominant mechanism in the water line was sorption to sludge, while that in the sludge line was desorption. Compared to other studies in the literature, the model satisfactorily describes the mechanisms involved in the fate and distribution of PAHs in a conventional activated sludge WWTP. Even though there is evidence of the presence of PAHs in urban centers in Chile, local regulatory standards do not consider PAHs in the disposal of WWTP effluents. Monitoring of PAHs in both treated effluents and biosolids is imperative, especially considering that de facto reuse and soil amendment for agricultural activities are currently practiced downstream of the studied WWTP.
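The dominance of sorption for the high-molecular-weight compound follows directly from equilibrium partitioning. A sketch with assumed solid-water partition coefficients (the Kd values and solids concentration below are illustrative, not the paper's calibration):

```python
def sorbed_fraction(log_kd, solids_g_per_l):
    """Equilibrium fraction partitioned to solids: f = Kd*X / (1 + Kd*X),
    with partition coefficient Kd in L/g and solids concentration X in g/L."""
    kd = 10.0 ** log_kd
    return kd * solids_g_per_l / (1.0 + kd * solids_g_per_l)

X = 3.0  # g TSS per litre in the aeration basin (assumed)
f_nap = sorbed_fraction(-0.5, X)  # naphthalene: weak sorption (log Kd assumed)
f_bap = sorbed_fraction(1.5, X)   # benzo(a)pyrene: strong sorption (log Kd assumed)
print(f"sorbed fraction, naphthalene: {f_nap:.2f}; benzo(a)pyrene: {f_bap:.2f}")
```

In a full fate model this fraction feeds the sludge line, where the abstract reports desorption becoming the dominant mechanism.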


2008 ◽  
Vol 10 (2) ◽  
pp. 153-162 ◽  
Author(s):  
B. G. Ruessink

When a numerical model is to be used as a practical tool, its parameters should preferably be stable and consistent, that is, possess a small uncertainty and be time-invariant. Using data and predictions of alongshore mean currents flowing on a beach as a case study, this paper illustrates how parameter stability and consistency can be assessed using Markov chain Monte Carlo. Within a single calibration run, Markov chain Monte Carlo estimates the parameter posterior probability density function, its mode being the best-fit parameter set. Parameter stability is investigated by stepwise adding new data to a calibration run, while consistency is examined by calibrating the model on different datasets of equal length. The results for the present case study indicate that various tidal cycles with strong (say, >0.5 m/s) currents are required to obtain stable parameter estimates, and that the best-fit model parameters and the underlying posterior distribution are strongly time-varying. This inconsistent parameter behavior may reflect unresolved variability of the processes represented by the parameters, or may represent compensational behavior for temporal violations in specific model assumptions.
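The stability check the paper describes, stepwise adding data and watching the best-fit parameter settle, can be sketched with a synthetic linear current model (the true coefficient, forcing values, and noise level below are assumed):

```python
import random

random.seed(1)

# Synthetic observations: current v = a * forcing + noise, true a = 0.8.
forcing = [0.2 * i for i in range(1, 41)]
obs = [0.8 * f + random.gauss(0.0, 0.1) for f in forcing]

def fit_slope(n):
    """Least-squares slope through the origin on the first n observations."""
    xs, ys = forcing[:n], obs[:n]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Stepwise calibration: refit as data accumulate and watch the estimate settle.
estimates = [fit_slope(n) for n in range(5, 41, 5)]
print([round(e, 3) for e in estimates])
```

MCMC as used in the paper would replace each point fit with a posterior distribution, so stability and consistency are judged on the whole distribution rather than on a single best-fit value.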


2021 ◽  
pp. 1-14
Author(s):  
Zhenggang Wang ◽  
Jin Jin

Remote sensing image segmentation provides technical support for decision making in many areas of environmental resource management. However, the quality of remote sensing images obtained from different channels varies considerably, and manually labeling a massive amount of image data is expensive and inefficient. In this paper, we propose a point density force field clustering (PDFC) process. According to the spectral information of different ground objects, remote sensing superpixel points are divided into core and edge data points. Differences in the densities of core data points are used to form local peaks. The center of an initial cluster can be determined from the weighted density and position of its local peak. An iterative nebular clustering process is used to obtain the result, and a proposed new objective function optimizes the model parameters automatically to obtain the globally optimal clustering solution. The proposed algorithm can automatically cluster the areas of different ground objects in remote sensing images, and the resulting categories then only need to be labeled by humans.
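PDFC itself is not specified in enough detail here to reproduce, but its core ingredients, local point density and cluster centers picked from density peaks, can be sketched in the style of classic density-peak clustering on synthetic 2-D data (the blobs, cutoff radius, and center count below are illustrative assumptions):

```python
import math
import random

random.seed(2)

# Two synthetic 2-D blobs standing in for superpixel spectral features.
pts = ([(random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)) for _ in range(30)]
       + [(random.gauss(3.0, 0.3), random.gauss(3.0, 0.3)) for _ in range(30)])

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Local density: number of neighbours within a cutoff radius dc.
dc = 0.6
density = [sum(1 for q in pts if dist(p, q) < dc) - 1 for p in pts]

# Density-peak rule: delta(i) is the distance to the nearest point of
# higher density; centers combine high density with high delta.
rank = sorted(range(len(pts)), key=lambda i: density[i], reverse=True)
delta = [0.0] * len(pts)
delta[rank[0]] = max(dist(pts[rank[0]], q) for q in pts)
for pos in range(1, len(rank)):
    i = rank[pos]
    delta[i] = min(dist(pts[i], pts[j]) for j in rank[:pos])

score = [density[i] * delta[i] for i in range(len(pts))]
centers = sorted(range(len(pts)), key=lambda i: score[i], reverse=True)[:2]
# Simplified assignment: nearest center (full density-peak clustering chains
# each point to its nearest denser neighbour instead).
labels = [min(centers, key=lambda c: dist(p, pts[c])) for p in pts]
print("centers:", centers, "cluster sizes:", [labels.count(c) for c in centers])
```

The PDFC-specific pieces (force-field weighting of densities, the iterative nebular clustering step, and the objective function that tunes parameters automatically) would replace the fixed dc and the hand-picked number of centers above.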


2018 ◽  
Vol 41 (1) ◽  
pp. 125-144 ◽  
Author(s):  
Rebecca Campbell ◽  
Rachael Goodman-Williams ◽  
Hannah Feeney ◽  
Giannina Fehler-Cabral

The purpose of this study was to develop triangulation coding methods for a large-scale action research and evaluation project and to examine how practitioners and policy makers interpreted both convergent and divergent data. We created a color-coded system that evaluated the extent of triangulation across methodologies (qualitative and quantitative), data collection methods (observations, interviews, and archival records), and stakeholder groups (five distinct disciplines/organizations). Triangulation was assessed for both specific data points (e.g., a piece of historical/contextual information or qualitative theme) and substantive findings that emanated from further analysis of those data points (e.g., a statistical model or a mechanistic qualitative assertion that links themes). We present five case study examples that explore the complexities of interpreting triangulation data and determining whether data are deemed credible and actionable if not convergent.

