Uncertainty Quantification in Modeling Metal Alloy Solidification

2017 ◽  
Vol 139 (8) ◽  
Author(s):  
Kyle Fezi ◽  
Matthew John M. Krane

Numerical simulations of metal alloy solidification are used to gain insight into physical phenomena that cannot be observed experimentally. These models produce results that are used to draw conclusions about a process or alloy and are often compared to experimental results. However, uncertainty in model inputs causes uncertainty in model results, which has the potential to significantly affect the conclusions drawn from their predictions. As a step toward understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis are performed on a transient model of the solidification of Al–4.5 wt % Cu in a rectangular cavity. The binary alloy considered has a columnar solidification morphology, and the model solves conservation equations for momentum, temperature, and species. UQ and sensitivity analysis are performed for the degree of macrosegregation and the solidification time. A Smolyak sparse grid algorithm is used to select input values for constructing a polynomial response surface fit to the model outputs. This polynomial is then used as a surrogate for the complete solidification model to determine the sensitivities and probability density functions (PDFs) of the model outputs. Uncertain model inputs of interest include the secondary dendrite arm spacing (SDAS), the heat transfer coefficient, and material properties. The most influential input parameter for predicting the macrosegregation level is the dendrite arm spacing, whose influence also depends strongly on the choice of permeability model. Additionally, the level of input uncertainty that can be tolerated while still producing accurate predictions depends on the model outputs of interest.
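The surrogate workflow described in this abstract — evaluate the model at selected design points, fit a polynomial response surface, then propagate input uncertainty through the cheap surrogate — can be sketched as follows. Everything here is a hypothetical stand-in: a toy quadratic in place of the coupled conservation equations, a random design in place of a true Smolyak sparse grid, and arbitrary input ranges.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the full solidification model: maps
# (SDAS, heat transfer coefficient) to a macrosegregation index.
# The real model solves coupled momentum/heat/species equations.
def model(sdas, htc):
    return 0.8 * sdas**2 + 0.3 * htc + 0.1 * sdas * htc

# Design points (a true Smolyak construction would place sparse
# tensor grids of quadrature nodes; here we use a random design).
X = rng.uniform(0.0, 1.0, size=(50, 2))
y = model(X[:, 0], X[:, 1])

# Quadratic polynomial response surface fit by least squares.
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2,
                            x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Use the cheap surrogate for UQ: propagate input PDFs through it
# with Monte Carlo to estimate the output distribution.
samples = rng.uniform(0.0, 1.0, size=(100_000, 2))
y_hat = features(samples) @ coef
print(f"surrogate output: mean={y_hat.mean():.3f}, std={y_hat.std():.3f}")
```

Because the toy model is itself quadratic, the least-squares fit here is essentially exact; for a real solidification code the surrogate error would also need to be quantified.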

Water ◽  
2020 ◽  
Vol 12 (2) ◽  
pp. 416 ◽
Author(s):  
Branwen Snelling ◽  
Stephen Neethling ◽  
Kevin Horsburgh ◽  
Gareth Collins ◽  
Matthew Piggott

Simulations of landslide-generated waves (LGWs) are prone to high levels of uncertainty. Here we present a probabilistic sensitivity analysis of an LGW model. The LGW model was realised through a smoothed particle hydrodynamics (SPH) simulator, which is capable of modelling fluids with complex rheologies and includes flexible boundary conditions. This LGW model has parameters defining the landslide, including its rheology, that contribute to uncertainty in the simulated wave characteristics. Given the computational expense of this simulator, we made use of the extensive uncertainty quantification functionality of the Dakota toolkit to train a Gaussian process emulator (GPE) on a dataset derived from SPH simulations. Using the emulator, we conducted a variance-based decomposition to quantify how much each input parameter to the SPH simulation contributed to the uncertainty in the simulated wave characteristics. Our results indicate that the landslide’s volume and initial submergence depth contribute the most to uncertainty in the wave characteristics, while the landslide rheological parameters have a much smaller influence. When estimated run-up is used as the indicator for LGW hazard, the slope angle of the shore being inundated is shown to be an additional influential parameter. This study facilitates probabilistic hazard analysis of LGWs, because it reveals which source characteristics contribute most to uncertainty in terms of how hazardous a wave will be, thereby allowing computational resources to be focused on better understanding that uncertainty.
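The emulator-plus-variance-decomposition approach can be sketched with a Gaussian process regressor and a pick-and-freeze (Saltelli-style) estimator of first-order Sobol' indices. The `sph_run` function below is a toy linear stand-in for one expensive SPH simulation, and the input ranges are arbitrary assumptions; the real study uses Dakota's UQ machinery rather than this hand-rolled estimator.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# Hypothetical stand-in for one SPH run: maps (landslide volume,
# initial submergence depth, a rheology parameter) to wave amplitude.
def sph_run(x):
    return 2.0 * x[:, 0] + 1.5 * x[:, 1] + 0.1 * x[:, 2]

# Small training set, as if each row were one expensive simulation.
X_train = rng.uniform(0.0, 1.0, size=(60, 3))
y_train = sph_run(X_train)

gpe = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                               normalize_y=True).fit(X_train, y_train)

# First-order Sobol' indices via pick-and-freeze, evaluated on the
# cheap emulator instead of the simulator.
n = 4000
A = rng.uniform(0.0, 1.0, size=(n, 3))
B = rng.uniform(0.0, 1.0, size=(n, 3))
fA, fB = gpe.predict(A), gpe.predict(B)
var = fA.var()
S = []
for i in range(3):
    ABi = A.copy()
    ABi[:, i] = B[:, i]  # vary only the i-th input between the two sets
    S.append(np.mean(fB * (gpe.predict(ABi) - fA)) / var)
print("first-order Sobol' indices:", S)
```

With the assumed coefficients, the first two inputs (volume and submergence depth) dominate the output variance, mirroring the ranking reported in the abstract.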


Author(s):  
Emmanuel Boafo ◽  
Emmanuel Numapau Gyamfi

Abstract Uncertainty and sensitivity analysis methods are often used in severe accident analysis to validate the complex physical models employed in the system codes that simulate such scenarios. This is necessitated by the large uncertainties associated with the physical models and boundary conditions used to simulate severe accident scenarios. The input parameters are sampled within defined ranges, based on assigned probability distribution functions (PDFs), for the required number of code runs/realizations using stochastic sampling techniques. Input parameter selection is based on each parameter's importance to the key figure of merit (FOM), as determined by a phenomena identification and ranking table (PIRT). Sensitivity analysis investigates the contribution of each uncertain input parameter to the uncertainty of the selected FOM. In this study, the integrated severe accident analysis code MELCOR was coupled with DAKOTA, an optimization and uncertainty quantification tool, to investigate the effect of input parameter uncertainty on hydrogen generation. The methodology was applied to the Fukushima Daiichi Unit 1 NPP accident scenario, which was modelled in another study. The results show approximately 22.46% uncertainty in the amount of hydrogen generated as estimated by a single MELCOR run, given the uncertainty in the selected input parameters. The sensitivity analysis results also reveal that the MELCOR input parameters COR_SC 1141 (melt flow rate per unit width at breakthrough candling), COR_ZP (porosity of fuel debris beds), and COR_EDR (characteristic debris size in the core region) contributed most significantly to the uncertainty in hydrogen generation.
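The sampling step described above — draw each uncertain input from its assigned PDF, evaluate the code for each realization, and measure the spread of the output — can be sketched in a few lines. The `hydrogen_mass` function, its coefficients, and the input distributions below are all hypothetical stand-ins for the MELCOR model and its inputs, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical surrogate for hydrogen generation as a function of
# three MELCOR-style inputs (the real study runs MELCOR via DAKOTA).
def hydrogen_mass(melt_rate, porosity, debris_size):
    return 300.0 + 80.0 * melt_rate - 50.0 * porosity + 20.0 * debris_size

n = 10_000
# Sample each uncertain input from its assigned PDF
# (distributions and ranges are illustrative only).
melt_rate = rng.normal(1.0, 0.2, n)              # melt flow rate (normalized)
porosity = rng.uniform(0.3, 0.5, n)              # debris bed porosity
debris_size = rng.triangular(0.5, 1.0, 1.5, n)   # characteristic debris size

h2 = hydrogen_mass(melt_rate, porosity, debris_size)

# Relative spread of the output, analogous in spirit to the
# ~22% uncertainty figure reported in the study.
rel_uncertainty = h2.std() / h2.mean()
print(f"relative uncertainty in H2 mass: {100 * rel_uncertainty:.1f}%")
```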


Algorithms ◽  
2020 ◽  
Vol 13 (7) ◽  
pp. 162
Author(s):  
Marion Gödel ◽  
Rainer Fischer ◽  
Gerta Köster

Microscopic crowd simulation can help to enhance the safety of pedestrians in situations that range from museum visits to music festivals. To obtain a useful prediction, the input parameters must be chosen carefully. In many cases, a lack of knowledge or limited measurement accuracy adds uncertainty to the input. In addition, for meaningful parameter studies, we first need to identify the most influential parameters of our parametric computer models. The field of uncertainty quantification offers standardized and fully automated methods that we believe to be beneficial for pedestrian dynamics. In addition, many methods come at a comparatively low cost, even for computationally expensive problems. This allows for their application to larger scenarios. We aim to identify and adapt fitting methods to microscopic crowd simulation in order to explore their potential in pedestrian dynamics. In this work, we first perform a variance-based sensitivity analysis using Sobol’ indices and then cross-check the results with a derivative-based measure, the activity scores. We apply both methods to a typical scenario in crowd simulation, a bottleneck. Because constrictions can lead to high crowd densities and delays in evacuations, several experiments and simulation studies have been conducted for this setting. We show qualitative agreement between the results of both methods. Additionally, we identify a one-dimensional subspace in the input parameter space and discuss its impact on the simulation. Moreover, we analyze and interpret the sensitivity indices with respect to the bottleneck scenario.
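The derivative-based measure mentioned above, activity scores, comes from active subspace analysis: estimate C = E[∇f ∇fᵀ], eigendecompose it, and form αᵢ = Σⱼ λⱼ wᵢⱼ². The output function and its gradient below are hypothetical stand-ins for the crowd-simulation model; only the construction itself follows the standard recipe.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-in for a crowd-simulation output (e.g. flow
# through a bottleneck) as a function of three model parameters.
def f(x):
    return np.sin(2.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.05 * x[:, 2]

def grad_f(x):
    g = np.empty_like(x)
    g[:, 0] = 2.0 * np.cos(2.0 * x[:, 0])
    g[:, 1] = x[:, 1]
    g[:, 2] = 0.05
    return g

# Monte Carlo estimate of C = E[grad f grad f^T].
X = rng.uniform(-1.0, 1.0, size=(5000, 3))
G = grad_f(X)
C = G.T @ G / len(X)

# Eigendecomposition of C; a dominant leading eigenvalue indicates a
# low-dimensional active subspace, like the one-dimensional subspace
# discussed in the abstract.
eigvals, eigvecs = np.linalg.eigh(C)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # descending order

# Activity scores: alpha_i = sum_j lambda_j * w_ij^2.
alpha = (eigvecs ** 2) @ eigvals
print("activity scores:", alpha)
```

For this toy function the first parameter dominates the scores, and most of the gradient variation is captured by a single direction, so the "active subspace" here is effectively one-dimensional.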


2021 ◽  
Author(s):  
Michael Prime ◽  
Gavin Jones ◽  
Vicente Romero ◽  
Justin Winokur ◽  
Benjamin Schroeder

2017 ◽  
Vol 140 ◽  
pp. 06002 ◽  
Author(s):  
Antonio Olmedilla ◽  
Miha Založnik ◽  
Hervé Combeau
