Bayesian Uncertainty Integration for Model Calibration, Validation, and Prediction
This paper proposes a comprehensive approach to prediction under uncertainty, demonstrated on the Sandia National Laboratories verification and validation challenge problem. In this problem, legacy data and experimental measurements of differing fidelity and complexity (e.g., coupon tests, material and fluid characterizations, and full system tests/measurements) form a hierarchy of information in which fewer observations are available at higher levels of system complexity. A Bayesian methodology is applied to incorporate information from all levels of the hierarchy and to propagate the impact of sparse data into the prediction uncertainty for the system of interest. Because the separation of aleatory and epistemic uncertainty sources is a pervasive issue in calibration and validation, maintaining this separation so that these activities are performed correctly is the primary focus of this paper. Toward this goal, a Johnson distribution family approach to calibration is proposed that keeps epistemic and aleatory uncertainty separated in the posterior parameter distributions. The model reliability metric approach to validation is then applied, and a novel method for handling combined aleatory and epistemic uncertainty is introduced. The quality of the validation assessment is used to modify the parameter uncertainty and add conservatism to the prediction of interest. Finally, this prediction and its associated uncertainty are used to assess system-level reliability, a prediction goal for the challenge problem.
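To make the separation of uncertainty types concrete, the following is a minimal sketch (not the paper's implementation) of nested, or double-loop, sampling with a Johnson SU family: the outer loop draws epistemic realizations of the distribution's parameters, and the inner loop draws the aleatory variable for fixed parameters. All parameter intervals and sample sizes here are hypothetical placeholders.

```python
# Sketch (assumed, not from the paper): double-loop sampling that keeps
# epistemic uncertainty (in the Johnson distribution's parameters) separate
# from aleatory uncertainty (the Johnson-distributed quantity itself).
import math
import random


def johnson_su_sample(rng, gamma, delta, xi, lam):
    """One aleatory draw from a Johnson SU distribution:
    X = xi + lam * sinh((Z - gamma) / delta), with Z ~ N(0, 1)."""
    z = rng.gauss(0.0, 1.0)
    return xi + lam * math.sinh((z - gamma) / delta)


def double_loop(n_epistemic=200, n_aleatory=1000, seed=0):
    """Outer loop: epistemic realizations of the Johnson SU parameters
    (hypothetical uniform intervals stand in for a posterior).
    Inner loop: aleatory draws for fixed parameters.
    Returns one summary (here, the sample mean) per epistemic draw, so the
    spread across entries reflects epistemic uncertainty only."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_epistemic):
        gamma = rng.uniform(-0.5, 0.5)  # shape (location of skew)
        delta = rng.uniform(1.0, 2.0)   # shape (tail weight)
        xi = rng.uniform(-0.1, 0.1)     # location
        lam = rng.uniform(0.8, 1.2)     # scale
        draws = [johnson_su_sample(rng, gamma, delta, xi, lam)
                 for _ in range(n_aleatory)]
        means.append(sum(draws) / n_aleatory)
    return means


if __name__ == "__main__":
    means = double_loop()
    print(f"epistemic spread of the mean: [{min(means):.3f}, {max(means):.3f}]")
```

In this structure, each outer iteration produces a full aleatory distribution, so the ensemble of distributions (a "horsetail" of CDFs) exposes the epistemic spread that a single pooled sample would hide.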