Statistical Methodology for a Quantified Validation of Sodium Fast Reactor Simulation Tools

Author(s):  
N. Marie ◽  
A. Marrel ◽  
K. Herbreteau

Abstract This paper presents a statistical methodology for a quantified validation of the OCARINa simulation tool, which models unprotected transient overpower (UTOP) accidents. This validation on CABRI experiments is based on a best-estimate plus uncertainties (BEPU) approach. To achieve this, a general methodology based on recent statistical techniques is developed. In particular, a method for handling multivariate data is applied to visualize the simulator outputs and compare them with the experiments. Still for validation purposes, a probabilistic indicator is proposed to quantify the degree of agreement between the OCARINa simulator and the experiments, taking into account both the experimental uncertainties and those on the OCARINa inputs. Going beyond a qualitative validation, this work is of great interest for the verification, validation, and uncertainty quantification (VVUQ) and evaluation model development and assessment process (EMDAP) approaches, which lead to the qualification of scientific calculation tools. Finally, for an in-depth analysis of the influence of the uncertain parameters, a sensitivity analysis based on recent dependence measures is also performed. The usefulness of the statistical methodology is demonstrated on the CABRI-E7 and CABRI-E12 tests. For each case, the BEPU propagation study is carried out by performing 1000 Monte Carlo simulations with the OCARINa tool, with nine uncertain input parameters. The validation indicators provide a quantitative conclusion on the validation of the OCARINa tool for both transients and highlight the future efforts needed to strengthen the validation demonstration of safety tools. The sensitivity analysis improves the understanding of the OCARINa tool and of the underlying UTOP scenario.
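
As a rough illustration of the BEPU propagation step described above, the following Python sketch samples nine uncertain inputs, pushes them through a hypothetical stand-in for OCARINa (the real code is not publicly available), and computes a simple probabilistic agreement indicator; the distributions, the `run_ocarina` placeholder, and all numerical values are assumptions for illustration only.

```python
# Minimal BEPU-style sketch: Monte Carlo propagation of nine uncertain inputs
# through a stand-in simulator, plus a simple probabilistic agreement indicator.
# `run_ocarina` is a hypothetical placeholder -- OCARINa itself is not public.
import numpy as np

rng = np.random.default_rng(42)
N = 1000  # number of Monte Carlo simulations, as in the paper

# Nine uncertain inputs, generically modeled here as independent normals
# (the actual test-specific distributions are assumptions).
inputs = rng.normal(loc=1.0, scale=0.05, size=(N, 9))

def run_ocarina(x):
    """Hypothetical stand-in for one OCARINa UTOP transient calculation.
    Returns a scalar output of interest (e.g., a peak temperature)."""
    return 1200.0 + 150.0 * (x - 1.0).sum()

outputs = np.array([run_ocarina(x) for x in inputs])

# Experimental value and its uncertainty (illustrative numbers only).
y_exp, sigma_exp = 1210.0, 20.0

# Agreement indicator: fraction of simulations falling within the
# 95% experimental uncertainty band.
in_band = np.abs(outputs - y_exp) <= 1.96 * sigma_exp
print(f"agreement indicator: {in_band.mean():.3f}")
```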

Author(s):  
Jun Liao ◽  
Scott E. Fortune

The systems computer code is a key part of the evaluation model for the safety analysis of nuclear reactors. The systems code utilizes a set of governing equations, simplified from the fundamental Navier-Stokes equations, together with closure models to describe the transport of mass, momentum, and energy of single-phase or multiphase fluid throughout the reactor coolant system. Following the Evaluation Model Development and Assessment Process (EMDAP), an assessment matrix is established in which Separate Effects Tests and Integral Effects Tests are selected based on a phenomena identification and ranking table (PIRT). The purpose of the assessment matrix is to validate the systems code against the phenomena that are important for the safety analysis. The code biases and uncertainties are established, and the effect of scale can then be determined. The assessment matrices of three major systems codes for reactor safety analysis, RELAP5/MOD3, TRACE Ver. 5.0, and WCOBRA/TRAC-TF2, are reviewed and compared in this study from the perspective of loss-of-coolant accident (LOCA) safety analysis. The scenarios are divided into small-break LOCA and large-break LOCA. The phenomena bases of the separate effects tests in these assessment matrices are discussed following each code's PIRT. The comparison demonstrates the capability of each systems code.
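
To make the notion of "simplified governing equations" concrete, a typical systems code solves area-averaged one-dimensional balance laws rather than the full Navier-Stokes equations; a minimal single-phase sketch is given below (actual codes such as RELAP5 and TRACE use two-fluid forms with interfacial mass, momentum, and energy exchange terms).

```latex
% Illustrative one-dimensional, area-averaged single-phase balance equations;
% not the specific equation set of any one systems code.
\begin{align}
  \frac{\partial \rho}{\partial t}
    + \frac{1}{A}\frac{\partial (\rho u A)}{\partial x} &= 0
    && \text{(mass)} \\
  \frac{\partial (\rho u)}{\partial t}
    + \frac{1}{A}\frac{\partial (\rho u^2 A)}{\partial x}
    &= -\frac{\partial p}{\partial x} - \rho g \sin\theta
    - \frac{f}{D_h}\frac{\rho u \lvert u \rvert}{2}
    && \text{(momentum)} \\
  \frac{\partial (\rho E)}{\partial t}
    + \frac{1}{A}\frac{\partial \big((\rho E + p)\, u A\big)}{\partial x}
    &= q''' && \text{(energy)}
\end{align}
```

Here $A$ is the flow area, $f$ a wall friction factor, $D_h$ the hydraulic diameter, $E$ the specific total energy, and $q'''$ a volumetric heat source; the closure models referred to in the abstract supply quantities such as $f$ and the interphase exchange terms.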


Author(s):  
Linyu Lin ◽  
Nam Dinh

Abstract In nuclear engineering, modeling and simulations (M&Ss) are widely applied to support risk-informed safety analysis. Since nuclear safety analysis has important implications, a convincing validation process is needed to assess simulation adequacy, i.e., the degree to which M&S tools can adequately represent the system quantities of interest. However, due to data gaps, validation becomes a decision-making process under uncertainties. Expert knowledge and judgments are required to collect, choose, characterize, and integrate evidence toward the final adequacy decision. Yet in validation frameworks such as CSAU (code scaling, applicability, and uncertainty; NUREG/CR-5249) and EMDAP (evaluation model development and assessment process; Regulatory Guide 1.203), this decision-making process is largely implicit and obscure. When scenarios are complex, knowledge biases and unreliable judgments can be overlooked, which could increase the uncertainty in the simulation adequacy result and the corresponding risks. Therefore, a framework is required to formalize the decision-making process for simulation adequacy in a practical, transparent, and consistent manner. This paper suggests such a framework, "predictive capability maturity quantification using Bayesian network (PCMQBN)," for quantifying simulation adequacy based on information collected from validation activities. A case study is prepared for evaluating the adequacy of a smoothed particle hydrodynamics (SPH) simulation in predicting the hydrodynamic forces on static structures during an external flooding scenario. Compared to a qualitative and implicit adequacy assessment, PCMQBN is able to improve confidence in the simulation adequacy result and to reduce the expected loss in the risk-informed safety analysis.
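
The core mechanics behind a framework like PCMQBN can be sketched with a toy Bayesian update: evidence from validation activities shifts the belief in a binary "simulation adequate" node. The node names, prior, and likelihoods below are illustrative assumptions, not the network published with PCMQBN.

```python
# Tiny two-evidence Bayesian update toward a binary adequacy node.
# Structure, names, and all probabilities are illustrative assumptions.
p_adequate = 0.5  # prior belief that the simulation is adequate

# Likelihood pairs: (P(evidence passes | adequate),
#                    P(evidence passes | not adequate))
# for two hypothetical validation activities.
likelihoods = {
    "verification_passed": (0.95, 0.40),
    "benchmark_agreement": (0.85, 0.25),
}

posterior = p_adequate
for name, (p_pass_a, p_pass_not_a) in likelihoods.items():
    # Bayes' rule, assuming conditionally independent evidence given adequacy.
    num = p_pass_a * posterior
    den = num + p_pass_not_a * (1.0 - posterior)
    posterior = num / den
    print(f"after {name}: P(adequate | evidence) = {posterior:.3f}")
```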


2013 ◽  
Vol 756-759 ◽  
pp. 715-719
Author(s):  
Huan Cheng Zhang ◽  
Ya Feng Yang ◽  
Feng Li ◽  
Li Nan Shi

In colleges, the performance evaluation system is directly related to the harmonious development of the school. Because the factors in such an evaluation system are fuzzy, this paper uses a fuzzy comprehensive evaluation model. Since that model alone is too subjective, this paper combines it with a neural network and the data envelopment analysis method, which ensures that the fuzzy comprehensive evaluation model is reasonable and scientific and serves both school development and teachers' own interests. The resulting performance assessment process not only combines qualitative and quantitative analysis but also reflects the achievements of teachers fairly and reasonably; the method is easy to use, widely applicable, and works well in practice.
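
The fuzzy comprehensive evaluation step can be sketched in a few lines: a weight vector over evaluation factors is composed with a membership matrix over grades. The factors, weights, and membership degrees below are illustrative assumptions, since the paper's actual values are not given here.

```python
# Minimal fuzzy comprehensive evaluation sketch (all values illustrative).
import numpy as np

# Factor weights, e.g., for teaching, research, and service.
weights = np.array([0.5, 0.3, 0.2])

# Membership matrix R: rows = factors, columns = grades
# ("excellent", "good", "fair", "poor"); each row sums to 1.
R = np.array([
    [0.4, 0.4, 0.1, 0.1],
    [0.2, 0.5, 0.2, 0.1],
    [0.3, 0.3, 0.3, 0.1],
])

# Weighted-average composition operator: B = W . R
B = weights @ R
grades = ["excellent", "good", "fair", "poor"]
print(dict(zip(grades, np.round(B, 3))))
print("overall grade:", grades[int(np.argmax(B))])
```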


2003 ◽  
Vol 53 (4) ◽  
pp. 478-488 ◽  
Author(s):  
Joseph R.V. Flora ◽  
Richard A. Hargis ◽  
William J. O’Dowd ◽  
Henry W. Pennline ◽  
Radisav D. Vidic

2021 ◽  
Author(s):  
Hyeyoung Koh ◽  
Hannah Beth Blum

This study presents a machine learning-based approach to sensitivity analysis that examines how parameters affect a given structural response while accounting for uncertainty. Reliability-based sensitivity analysis involves repeated evaluations of the performance function incorporating uncertainties to estimate the influence of a model parameter, which can lead to prohibitive computational costs. This challenge is exacerbated for large-scale engineering problems, which often carry a large number of uncertain parameters. The proposed approach is based on feature selection algorithms that rank feature importance and remove redundant predictors during model development, improving model generality and training performance by focusing only on the significant features. The approach allows sensitivity analysis of structural systems to be performed by providing feature rankings at reduced computational effort. It is demonstrated on two designs of a two-bay, two-story planar steel frame with different failure modes: inelastic instability of a single member and progressive yielding. The feature variables in the data are uncertainties, including material yield strength, Young's modulus, frame sway imperfection, and residual stress. The Monte Carlo sampling method is used to generate random realizations of the frames from published distributions of the feature parameters, and the response variable is the frame ultimate strength obtained from finite element analyses. Decision trees are trained to identify important features, and feature rankings are derived by four feature selection techniques: impurity-based importance, permutation importance, SHAP, and Spearman's correlation. The predictive performance of the model built on the important features is discussed using the Matthews correlation coefficient, an evaluation metric suited to imbalanced datasets. Finally, the results are compared with those from reliability-based sensitivity analysis on the same example frames to show the validity of the feature selection approach. As the proposed machine learning-based approach produces the same results as the reliability-based sensitivity analysis with improved computational efficiency and accuracy, it could be extended to other structural systems.
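
The feature-ranking workflow can be sketched on synthetic data (the frame finite element analyses are not reproducible here): train a tree ensemble, score it with the Matthews correlation coefficient, and compare impurity-based importance, permutation importance, and Spearman's rank correlation; SHAP is omitted because it requires the third-party `shap` package. The limit state below is a hypothetical stand-in for the frame ultimate strength check.

```python
# Feature-ranking sketch on synthetic data standing in for the frame study.
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
names = ["yield_strength", "youngs_modulus", "sway_imperfection", "residual_stress"]
X = rng.normal(size=(2000, 4))
# Hypothetical limit state: failure driven mostly by the first two features.
y = (1.5 * X[:, 0] + 0.8 * X[:, 1] + 0.1 * X[:, 2]
     + rng.normal(scale=0.3, size=2000)) < -1.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("MCC:", matthews_corrcoef(y_te, clf.predict(X_te)))
print("impurity:", dict(zip(names, np.round(clf.feature_importances_, 3))))
perm = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)
print("permutation:", dict(zip(names, np.round(perm.importances_mean, 3))))
rho, _ = spearmanr(X, y)  # last row/column of rho corresponds to y
print("spearman:", dict(zip(names, np.round(rho[:-1, -1], 3))))
```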


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Hua Yang ◽  
Huiying Wei ◽  
Xiang He ◽  
Yue Yan ◽  
Xiaoju Liu

With the rapid development of e-commerce technology, cross-channel consumption has become the mainstream mode for contemporary consumers. However, cross-channel consumption suffers from several problems, such as inconsistent information and service between online and offline channels and a lack of fluency in channel switching, which adversely affect the user experience. The question thus arises as to what factors influence user experience and how to build a scientific and effective evaluation index system. Different from previous studies based on sellers, this paper uses grounded theory to analyze and summarize an evaluation index system for user experience under cross-channel consumption from the perspective of consumers. We summarized and refined four first-level indexes, "online platform attribute, offline entity attribute, channel switching attribute, and individual demand," and 13 second-level indexes, "platform operation, platform information, platform service, platform promotion, product quality, service quality, environment quality, channel consistency, channel switching cost, channel switching fluency, psychological expectation, personal interests, and individual needs." Then, we used a BP neural network to build the evaluation model and trained and tested its performance on the sample. The results show that the evaluation model has good generalization ability and can effectively evaluate user experience under cross-channel consumption. Finally, implications and limitations are also discussed. This study helps to enrich the theoretical research on user experience and consumer behavior. It also provides a targeted basis for the in-depth analysis of cross-channel consumption behavior, the establishment of a user experience evaluation index system, and the improvement of user experience and multichannel management in physical stores.
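
A minimal sketch of such a BP (backpropagation) neural network evaluation model, mapping the 13 second-level index scores to an overall user-experience score, might look as follows; the data are synthetic stand-ins, since the paper's survey sample is not available.

```python
# BP neural network sketch: 13 second-level index scores -> overall score.
# All data are synthetic; the true index-to-score mapping is an assumption.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_samples, n_indexes = 500, 13  # 13 second-level indexes
X = rng.uniform(0.0, 1.0, size=(n_samples, n_indexes))

# Hypothetical ground truth: a weighted sum of index scores plus noise.
w = rng.dirichlet(np.ones(n_indexes))
y = X @ w + rng.normal(scale=0.02, size=n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1)
model.fit(X_tr, y_tr)
print(f"R^2 on held-out sample: {model.score(X_te, y_te):.3f}")
```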


2010 ◽  
Vol 2 (2) ◽  
pp. 38-51 ◽  
Author(s):  
Marc Halbrügge

Keep it simple - A case study of model development in the context of the Dynamic Stocks and Flows (DSF) task

This paper describes the creation of a cognitive model submitted to the ‘Dynamic Stocks and Flows’ (DSF) modeling challenge. This challenge aims at comparing computational cognitive models of human behavior during an open-ended control task. Participants in the modeling competition were provided with a simulation environment and training data for benchmarking their models, while the actual specification of the competition task was withheld. To meet this challenge, the cognitive model described here was designed and optimized for generalizability. Only two simple assumptions about human problem solving were used to explain the empirical findings in the training data. In-depth analysis of the data set prior to the development of the model led to the dismissal of correlations and other parametric statistics as goodness-of-fit indicators. A new statistical measure based on rank orders and sequence matching techniques is proposed instead. This measure, when applied to the human sample, also identifies clusters of subjects who use different strategies for the task. The acceptability of the fits achieved by the model is verified using permutation tests.
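
One plausible sketch of a rank-order and sequence-matching fit measure with a permutation test is shown below; the specific combination used (Spearman's rho plus a difflib similarity ratio) is an assumed stand-in, as the paper's exact measure is not reproduced here.

```python
# Sketch: combined rank-order / sequence-matching fit with a permutation test.
# The 50/50 combination of Spearman's rho and SequenceMatcher.ratio() is an
# illustrative assumption, not the paper's published measure.
from difflib import SequenceMatcher

import numpy as np
from scipy.stats import spearmanr

def fit_measure(model_seq, human_seq):
    """Average of rank-order agreement and sequence similarity."""
    rho, _ = spearmanr(model_seq, human_seq)
    sim = SequenceMatcher(None, model_seq, human_seq).ratio()
    return 0.5 * (rho + sim)

rng = np.random.default_rng(7)
human = list(rng.integers(0, 5, size=40))           # hypothetical action codes
model = [(a + rng.integers(0, 2)) % 5 for a in human]  # a noisy copy

observed = fit_measure(model, human)
# Permutation test: how often does a shuffled model sequence fit this well?
null = [fit_measure(list(rng.permutation(model)), human) for _ in range(999)]
p = (1 + sum(n >= observed for n in null)) / (1 + len(null))
print(f"fit = {observed:.3f}, permutation p = {p:.3f}")
```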


2010 ◽  
Vol 4 (2) ◽  
Author(s):  
Dietmar Winzker ◽  
Leon Pretorius

This paper elucidates the history, the design philosophy of innovation, and the transformation of an old process technology into a breakthrough, evidence-based therapy with international medical acceptance and verified effectiveness, as well as the strategic business model employed. Pulsed electromagnetic field therapy (PEMFT) was not medically accepted and was, until recently, professionally in disrepute. Revisiting the technology in light of the partially inconsistent yet positive anecdotal results gave rise to in-depth analysis and scientific research conducted by independent institutions, which resulted in the identification of key physiological parameters that could in turn be related to a significant improvement of pathologies. By applying and promoting a systems approach, as practiced by engineers involved in complex multidisciplinary projects for many years, a different perspective on the innovative development of PEMF therapy was established. The innovative process-based therapy, working mainly at the cellular and self-regulation level, was a paradigmatic departure from the indication-based approach of pharmaceutical therapy. Over the past 10 years, exceptional breakthroughs of this non-symptom-based therapy have been documented through clinical trials, scientific medical investigations, and the publication of relevant literature. Turning the old and insufficiently understood technology into an innovative and significant scientific breakthrough technology requires a paradigm shift analogous to working in a different culture. It is surmised that this paradigm shift will strongly influence medical schools and practitioners over the next 5–10 years. The authors, as “outsiders” to the medical discipline, bring an engineering perspective to bear on the development of innovative but system-integrated medical devices, which can promote the medical device industry and bring systems engineering approaches into the realm of medical technology and therapy. Both authors have presented a number of papers at international conferences, individually and in partnership, on the topics of strategic business leadership and business transformation, systems thinking, and holistic management model development for high-technology companies.

