A complementary modelling approach to manage uncertainty of computationally expensive models

2007 ◽  
Vol 56 (8) ◽  
pp. 1-9 ◽  
Author(s):  
Z. Vojinovic

Models applied in the ‘water domain’ fall short of reality for many reasons. In this context, a systematic analysis of the uncertainties reflected in the model error can provide insight into the level of confidence in the model results and into how to approach the estimation of optimal model parameters. This paper discusses four commonly used approaches for estimating model parameters and suggests that an alternative, complementary modelling approach should be considered where traditional model calibration gives limited results, particularly for computationally expensive models. The approach treats uncertainty by modelling the total discrepancy between the model and the physical process, and combines the results of a physically based model and a Support Vector Machine model into the final solution.
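In outline, the complementary step amounts to training a data-driven model on the residuals of the physical model and adding the learned correction back. A minimal sketch with hypothetical data, using scikit-learn's `SVR` as a stand-in for the paper's Support Vector Machine model:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical data: inputs X, noisy observations y_obs, and the (imperfect)
# physically-based model's predictions y_phys for the same inputs.
X = rng.uniform(0, 10, size=(200, 2))
y_true = np.sin(X[:, 0]) + 0.1 * X[:, 1]
y_phys = np.sin(X[:, 0])                     # physical model misses one process
y_obs = y_true + rng.normal(0, 0.02, 200)    # noisy observations

# Train an SVM regressor on the model error (the data-driven complement).
error_model = SVR(kernel="rbf", C=10.0).fit(X, y_obs - y_phys)

# Final complementary prediction: physical model + learned error correction.
y_final = y_phys + error_model.predict(X)
print(np.mean((y_obs - y_phys) ** 2), np.mean((y_obs - y_final) ** 2))
```

The corrected prediction should show a smaller mean squared error than the physical model alone, since the SVM absorbs the systematic part of the discrepancy.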

Author(s):  
James W.T Yates ◽  
Michael J Chappell ◽  
Julian W Gardner

A novel physically based mathematical model of carbon black/polymer vapour sensors is described, whose parameters have physical meaning. The model has an analytical solution and so requires negligible computational power to analyse a sensor's response to a particular form of input. A further advantage of this modelling approach is that the environmental dependencies of sensor responses may be compensated for, helping in the design of better pattern-recognition algorithms for electronic nose systems. It also means that the underlying chemistry of the sensors may be decoupled from their physical, non-analyte-specific properties. Experimentally, three different conducting nanocomposite polymers, poly(styrene-co-butadiene), poly(ethylene-co-vinyl acetate) and poly(caprolactone), were tested. Each experiment consisted of separate exposures of the sensors to acetone and ethanol vapour in ambient air; a total of 336 such experiments were performed over a two-week period. The model was validated against these data and then fitted to the two vapour responses simultaneously, demonstrating its applicability to ‘real world’ systems. The temperature dependence of the model parameters was judged to be the most important factor and must be compensated for when applying this type of sensor in practice.
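The abstract does not reproduce the analytical solution itself. As a hedged stand-in, a first-order exponential step response illustrates how a low-parameter analytical form can be fitted cheaply, with each parameter carrying a physical reading; the functional form and values below are illustrative, not the authors' model:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical first-order response: a common analytical form for a
# sensor's step response to a vapour exposure (not the paper's model).
def step_response(t, amplitude, tau):
    return amplitude * (1.0 - np.exp(-t / tau))

t = np.linspace(0, 60, 121)                        # seconds
rng = np.random.default_rng(1)
resistance = step_response(t, 0.05, 8.0) + rng.normal(0, 1e-3, t.size)

# Because the model is analytical, fitting is cheap: two parameters,
# each interpretable (e.g. equilibrium swelling, diffusion time constant).
popt, _ = curve_fit(step_response, t, resistance, p0=(0.01, 1.0))
print(popt)  # ≈ [0.05, 8.0]
```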


Water ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 972
Author(s):  
Sotirios Moustakas ◽  
Patrick Willems

A variety of hydrological models is currently available. Many of those employ physically based formulations to account for the complexity and spatial heterogeneity of natural processes. In turn, they require a substantial amount of spatial data, which may not always be available at sufficient quality. Recently, a top-down approach for distributed rainfall-runoff modelling has been developed, which aims at combining accuracy and simplicity. Essentially, a distributed model with uniform model parameters (base model) is derived from a calibrated lumped conceptual model. Subsequently, selected parameters are disaggregated based on links with the available spatially variable catchment properties. The disaggregation concept is now adjusted to better account for non-linearities and extended to incorporate more model parameters (and, thus, larger catchment heterogeneity). The modelling approach is tested for a catchment including several flow gauging stations. The disaggregated model is shown to outperform the base model with respect to internal catchment dynamics, while performing similarly at the catchment outlet. Moreover, it manages to bridge on average 44% of the Nash–Sutcliffe efficiency difference between the base model and the lumped models calibrated for the internal gauging stations. Nevertheless, the aforementioned improvement is not necessarily sufficient for reliable model results.
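The bridging statistic can be made concrete. Given the Nash–Sutcliffe efficiency (NSE) of the base, disaggregated and lumped models at an internal gauging station, the bridged fraction is the improvement expressed relative to the base-to-lumped gap. The NSE values below are hypothetical, chosen to reproduce the 44% average:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    sim = np.asarray(sim, float)
    obs = np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical NSE scores at one internal gauging station.
nse_base = 0.55      # uniform-parameter (base) model
nse_disagg = 0.66    # model with disaggregated parameters
nse_lumped = 0.80    # lumped model calibrated at that station

# Fraction of the base-to-lumped gap bridged by disaggregation.
bridged = (nse_disagg - nse_base) / (nse_lumped - nse_base)
print(f"{bridged:.0%}")  # 44% on average in the paper
```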


2013 ◽  
Vol 753 ◽  
pp. 417-422 ◽  
Author(s):  
Kashif Rehman ◽  
Hatem S. Zurob

Microalloying additions are critical for grain size control during thermo-mechanical processing. The addition of niobium is known to delay the onset and growth of recrystallization. A physically-based model for the interaction of strain-induced precipitation, recovery and recrystallization is presented. A key feature of the model is the incorporation of the effect of precipitation on the nucleation of recrystallization. Quantitative agreement between the experimental measurements and the model predictions has also been demonstrated. The model offers valuable insight into the relative contributions of solute and precipitate Nb as well as the optimum conditions for strain accumulation.
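The abstract does not give the model equations. As an illustration of the kind of kinetics involved, here is the classical JMAK (Avrami) expression for recrystallized fraction, a textbook baseline that coupled models of this type extend with recovery and strain-induced precipitation effects; all parameter values are hypothetical:

```python
import numpy as np

def jmak_fraction(t, k, n):
    """Classical JMAK (Avrami) recrystallized fraction X(t) = 1 - exp(-k t^n).
    A textbook baseline, not the authors' coupled model: strain-induced Nb
    precipitation would retard recrystallization, e.g. by pinning boundaries
    and suppressing nucleation.
    """
    return 1.0 - np.exp(-k * np.power(t, n))

t = np.array([1.0, 10.0, 100.0])        # seconds (hypothetical)
print(jmak_fraction(t, k=0.01, n=1.5))  # fraction recrystallized
```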


2013 ◽  
Vol 59 (214) ◽  
pp. 327-344 ◽  
Author(s):  
Yves Lejeune ◽  
Jean-Maxime Bertrand ◽  
Patrick Wagnon ◽  
Samuel Morin

Abstract. Debris-covered glaciers respond to atmospheric conditions differently from debris-free glaciers, owing to the presence of debris at the surface during the ablation season and at the snow/ice interface during the accumulation season. Understanding the response of debris-covered glaciers to a variety of meteorological conditions in a physically sound manner is essential to quantify meltwater discharge and to predict their response to climate change. To tackle this issue, we developed the Crocus-DEB model as an adaptation of the detailed snowpack model Crocus, to simulate the energy and mass balance of debris-covered glaciers, including periods when the debris is covered by snow. Crocus-DEB was first evaluated with data from a field experiment at Col de Porte, France, in which artificial debris covered the snowpack; it performed very well in terms of conductive heat flux, both at the surface and at the interface between the debris and the underlying dense snow (taken as a surrogate for ice), with and without snow overlying the debris. The model was also evaluated using field data from the debris-covered Changri Nup glacier, Nepal, Himalaya. This paper introduces the design of the model, its performance and its ability to explore relationships between model parameters, meteorological conditions and the critical debris thickness.
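This is not the Crocus-DEB formulation itself, but the conductive heat flux it evaluates can be illustrated with Fourier's law across a debris layer; the function name and all values below are hypothetical:

```python
# A minimal steady-state sketch of heat conduction through a debris layer,
# one ingredient of a model like Crocus-DEB (names and values hypothetical).
def conductive_flux(t_surface, t_interface, thickness, conductivity=1.0):
    """Fourier's law: flux (W m-2) through debris of given thickness (m) and
    thermal conductivity (W m-1 K-1), positive when heat flows downward from
    the surface to the snow/ice interface below."""
    return conductivity * (t_surface - t_interface) / thickness

# Thicker debris transmits less of the surface heating to the ice below,
# which is why the critical debris thickness matters for sub-debris melt.
for d in (0.05, 0.10, 0.30):
    print(d, conductive_flux(t_surface=10.0, t_interface=0.0, thickness=d))
```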


Author(s):  
M. SRINIVASAN ◽  
A. KRISHNAN

The hot-spot temperature (HST) plays a major role in determining the insulation life of a transformer. Ambient temperature and environmental factors enter the top-oil temperature (TOT) computations of all transformer thermal models and thus affect insulation lifetime either directly or indirectly. Given the importance of the ambient temperature to a transformer's insulation life, this paper proposes a new semi-physically-based model for estimating the TOT. The winding hot-spot temperature can then be calculated as a function of the TOT, which is estimated from measured ambient temperature, wind velocity, solar heat radiation and transformer loading. The estimated TOT is compared with measured data from a distribution transformer in operation, and the proposed model has been validated using real data gathered from a 100 MVA power transformer. For a semi-physically-based model to be acceptable, it must be adequate, accurate and consistent. Model adequacy is assessed using the prediction R2 and plots of residuals against fitted values; model consistency using the variance inflation factor (VIF), which measures multicollinearity, and the condition number; and model accuracy using the mean square error and the maximum and minimum errors of the semi-physically-based model parameters relative to those of the existing model.
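As an illustration of the consistency check, the VIF of each predictor can be computed from ordinary least-squares fits. The data below are synthetic stand-ins (ambient temperature, solar radiation, loading), not the paper's measurements:

```python
import numpy as np

def vif(X):
    """Variance inflation factors for the columns of design matrix X.
    VIF_j = 1 / (1 - R2_j), where R2_j comes from regressing column j on
    the remaining columns (an intercept is added internally)."""
    X = np.asarray(X, float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
ambient = rng.normal(25, 5, 500)                # hypothetical predictors
solar = 0.9 * ambient + rng.normal(0, 1, 500)   # strongly collinear with ambient
load = rng.normal(0.7, 0.1, 500)                # independent of the others
print(vif(np.column_stack([ambient, solar, load])))
```

A large VIF for the first two columns flags the multicollinearity between ambient temperature and solar radiation, while the independent loading column stays near 1.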


2013 ◽  
Vol 52 (7) ◽  
pp. 1645-1663 ◽  
Author(s):  
Pierre-Emmanuel Kirstetter ◽  
Hervé Andrieu ◽  
Brice Boudevillain ◽  
Guy Delrieu

Abstract. The vertical profile of reflectivity (VPR) must be identified to correct estimations of rainfall rates by radar for the nonuniform beam filling associated with the vertical variation of radar reflectivity. A method for identifying VPRs from volumetric radar data is presented that takes into account the radar sampling. Physically based constraints on the vertical structure of rainfall are introduced with simple VPR models within a rainfall classification procedure defining more homogeneous precipitation patterns. The model parameters are identified in the framework of an extended Kalman filter to ensure their temporal consistency. The method is assessed using the dataset from a volume-scanning strategy for radar quantitative precipitation estimation designed in 2002 for the Bollène radar (France). The physical consistency of the retrieved VPR is evaluated. Positive results are obtained insofar as the physically based identified VPR (i) presents physically consistent shapes and characteristics considering beam effects, (ii) shows improved robustness in the difficult radar measurement context of the Cévennes–Vivarais region, and (iii) provides consistent physical insight into the rain field.
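The abstract does not give the VPR models themselves, but the recursion that keeps their parameters temporally consistent is a standard extended Kalman filter predict/update cycle. A generic sketch (all names, and the scalar toy example, are illustrative rather than the paper's implementation):

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter.
    x, P: state estimate and covariance; z: new observation;
    f, h: (possibly nonlinear) process and measurement functions;
    F, H: their Jacobians at the current estimate; Q, R: noise covariances.
    A VPR method of this kind would carry profile-model parameters in x
    and update them between successive radar volume scans."""
    # Predict
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - h(x_pred)                    # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy scalar example: a random-walk parameter observed directly.
x, P = np.array([0.0]), np.array([[1.0]])
I = np.eye(1)
for z in (0.8, 1.1, 0.9):
    x, P = ekf_step(x, P, np.array([z]), lambda s: s, I, lambda s: s, I,
                    Q=0.01 * I, R=0.1 * I)
print(x, P)  # estimate drawn toward ~0.9, covariance shrinking
```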


Sensors ◽  
2021 ◽  
Vol 21 (20) ◽  
pp. 6819
Author(s):  
Andrzej Brandyk ◽  
Bartosz Szeląg ◽  
Adam Kiczko ◽  
Marcin Krukowski ◽  
Adam Kozioł ◽  
...  

Soil moisture content simulation models have continuously been an important research objective; in particular, comparisons of the performance of different model types deserve proper attention. The quality of selected physically based and statistical models was therefore analyzed using data from the Time Domain Reflectometry (TDR) technique. An E-Test measurement system was applied, with the reflectograms converted into soil volumetric moisture content by appropriate calibration equations. The gathered data were used to calibrate the physical model of Deardorff and to establish the parameters of support vector machine, multivariate adaptive regression spline, and boosted trees models. Generalized likelihood uncertainty estimation revealed the sensitivity of the individual model parameters. As expected, the statistical models achieved a simple structure but, unlike the physically based method, offered no direct physical interpretation of their parameters. The TDR technique proved useful for calibrating different soil moisture models, with satisfactory quality for their future use.
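As a sketch of how such statistical models are set up and compared, the snippet below fits two of them to synthetic stand-in data (not the paper's TDR measurements); scikit-learn's `SVR` and `GradientBoostingRegressor` are hypothetical choices for the support vector machine and boosted trees models:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical stand-in for TDR-derived training data: volumetric moisture
# driven by recent rainfall and evaporative demand (not the paper's dataset).
rain = rng.gamma(2.0, 2.0, 400)
evap = rng.uniform(0, 5, 400)
theta = 0.15 + 0.02 * rain - 0.01 * evap + rng.normal(0, 0.005, 400)

X = np.column_stack([rain, evap])
X_tr, X_te, y_tr, y_te = train_test_split(X, theta, random_state=0)

# Statistical models are simple to set up, but their fitted parameters carry
# no direct physical reading, unlike those of a Deardorff-type model.
scores = {}
for name, model in [("SVR", SVR(C=10.0, epsilon=0.001)),
                    ("boosted trees", GradientBoostingRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    scores[name] = model.score(X_te, y_te)
    print(name, scores[name])  # R2 on held-out data
```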


2019 ◽  
Author(s):  
Joseph John Pyne Simons ◽  
Ilya Farber

Not all transit users have the same preferences when making route decisions. Understanding the factors driving this heterogeneity enables better tailoring of policies, interventions, and messaging. However, existing methods for assessing these factors require extensive data collection. Here we present an alternative approach: an easily administered single-item measure of overall preference for speed versus comfort. Scores on the self-report item predict decisions in a choice task and account for a proportion of the differences in model parameters between people (n = 298). This single item can easily be included in existing travel surveys, and provides an efficient method both to anticipate the choices of users and to gain more general insight into their preferences.


2019 ◽  
Vol 19 (11) ◽  
pp. 2477-2495
Author(s):  
Ronda Strauch ◽  
Erkan Istanbulluoglu ◽  
Jon Riedel

Abstract. We developed a new approach for mapping landslide hazards by combining probabilities of landslide impacts derived from a data-driven statistical approach and a physically based model of shallow landsliding. Our statistical approach integrates the influence of seven site attributes (SAs) on observed landslides using a frequency ratio (FR) method. Influential attributes and resulting susceptibility maps depend on the observations of landslides considered: all types of landslides, debris avalanches only, or source areas of debris avalanches. These observational datasets reflect the detection of different landslide processes or components, which relate to different landslide-inducing factors. For each landslide dataset, a stability index (SI) is calculated as a multiplicative result of the frequency ratios for all attributes and is mapped across our study domain in the North Cascades National Park Complex (NOCA), Washington, USA. A continuous function is developed to relate local SI values to landslide probability based on a ratio of landslide and non-landslide grid cells. The empirical model probability derived from the debris avalanche source area dataset is combined probabilistically with a previously developed physically based probabilistic model. A two-dimensional binning method employs empirical and physically based probabilities as indices and calculates a joint probability of landsliding at the intersections of probability bins. A ratio of the joint probability and the physically based model bin probability is used as a weight to adjust the original physically based probability at each grid cell given empirical evidence. The resulting integrated probability of landslide initiation hazard includes mechanisms not captured by the infinite-slope stability model alone. Improvements in distinguishing potentially unstable areas with the proposed integrated model are statistically quantified. 
We provide multiple landslide hazard maps that land managers can use for planning and decision-making, as well as for educating the public about hazards from landslides in this remote high-relief terrain.
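The frequency ratio step can be sketched on a toy grid. Each site attribute is classified, the frequency ratio of each class is the fraction of landslide cells in that class divided by the fraction of all cells in it, and the stability index is the product of the per-attribute ratios; attribute names, class counts and probabilities below are all hypothetical:

```python
import numpy as np

def frequency_ratio(attr_class, landslide):
    """Frequency ratio per attribute class: (fraction of landslide cells in
    the class) / (fraction of all cells in the class). FR > 1 marks classes
    over-represented among observed landslides."""
    attr_class = np.asarray(attr_class)
    landslide = np.asarray(landslide, bool)
    fr = {}
    for c in np.unique(attr_class):
        in_c = attr_class == c
        frac_slides = landslide[in_c].sum() / landslide.sum()
        frac_cells = in_c.sum() / in_c.size
        fr[c] = frac_slides / frac_cells
    return fr

# Toy grid with two site attributes (say, slope class and lithology class);
# the stability index (SI) is the product of the per-attribute ratios.
rng = np.random.default_rng(0)
slope = rng.integers(0, 3, 10_000)
litho = rng.integers(0, 2, 10_000)
slides = rng.random(10_000) < 0.01 * (1 + slope)  # steeper -> more slides

fr_slope = frequency_ratio(slope, slides)
fr_litho = frequency_ratio(litho, slides)
si = np.array([fr_slope[s] * fr_litho[l] for s, l in zip(slope, litho)])
print(si.min(), si.max())
```

In this toy example the steepest slope class ends up with FR above 1 and the flattest below 1, so high-SI cells concentrate where landslides were observed, mirroring how the empirical probability map is built before it is merged with the physically based model.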

