Sensitivity analysis of the Chaohu Lake eutrophication model with a new index based on the Morris method

2017 ◽  
Vol 18 (4) ◽  
pp. 1375-1387 ◽  
Author(s):  
Yulin Wang ◽  
Zulin Hua ◽  
Liang Wang

Abstract Chaohu Lake is a large shallow lake in eastern China, and few eutrophication model studies have been conducted there. We present practical sensitivity indices based on the Morris method to compare the sensitivity of a parameter group with respect to one model output with that of one parameter with respect to multiple model outputs. The new sensitivity indices were employed to measure the parameter sensitivity of the Chaohu Lake eutrophication model. The results of the sensitivity analysis demonstrate that the most sensitive parameters for cyanobacteria biomass, NH4, NO3, and PO4 were BMR, KDN, Nitm, and KRP, and the most sensitive parameter groups were the algae-related, nitrogen-related, and phosphorus-related groups, which all directly participate in the respective cycles. Furthermore, Nitm, KRP, KDN, KHP, BMR, KTB, KTHDR, and KTCOD were the most important parameters for the Chaohu Lake eutrophication model. Water environment characteristics, such as the cyanobacteria life stage in the simulated period, significantly affected parameter sensitivity. A power-law relationship between the new sensitivity index and the standard deviation of model variables in the Chaohu Lake model was also determined. This finding allows the interactions between parameters to be estimated from their sensitivity indices. The results provide a basis for further improvement of the Chaohu Lake eutrophication model.
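The Morris method underlying these indices ranks parameters by their elementary effects: one-at-a-time finite differences averaged over random trajectories through the parameter space. A minimal sketch of the standard screening step on the unit hypercube (not the authors' implementation; the model `f`, the number of trajectories, and the four-level grid are illustrative assumptions):

```python
import numpy as np

def morris_elementary_effects(f, n_params, n_traj=50, levels=4, seed=0):
    """Morris screening on the unit hypercube: returns mu* (mean absolute
    elementary effect) and sigma (std of elementary effects) per parameter."""
    rng = np.random.default_rng(seed)
    delta = levels / (2.0 * (levels - 1))            # standard Morris step
    ee = np.zeros((n_traj, n_params))
    for t in range(n_traj):
        # base point on the lower half of the grid, so x + delta stays in [0, 1]
        x = rng.integers(0, levels // 2, n_params) / (levels - 1)
        y = f(x)
        for i in rng.permutation(n_params):          # one-at-a-time moves
            x_new = x.copy()
            x_new[i] += delta
            y_new = f(x_new)
            ee[t, i] = (y_new - y) / delta           # elementary effect
            x, y = x_new, y_new
    return np.abs(ee).mean(axis=0), ee.std(axis=0, ddof=1)
```

For a linear test function such as f(x) = 3x0 + 0.1x1, the mean absolute elementary effects recover the coefficients exactly and the standard deviations vanish, since there are no interactions.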

2020 ◽  
Author(s):  
Monica Riva ◽  
Aronne Dell'Oca ◽  
Alberto Guadagnini

Modern models of environmental and industrial systems have reached a relatively high level of complexity. This complexity can hamper an unambiguous understanding of how a model functions, i.e., how it drives relationships and dependencies among inputs and outputs of interest. Sensitivity analysis tools can be employed to examine this issue.

Global sensitivity analysis (GSA) approaches rest on the evaluation of sensitivity across the entire support within which system model parameters are supposed to vary. In this broad context, the definition of a sensitivity metric must be linked to the nature of the question(s) the GSA is meant to address. These include, for example: (i) which are the most important model parameters with respect to given model output(s)?; (ii) could we set some parameter(s) at prescribed value(s) (thus assisting model calibration) without significantly affecting model results?; (iii) at which space/time locations can one expect the highest sensitivity of model output(s) to model parameters, and knowledge of which parameter(s) would be most beneficial for model calibration?

The variance-based Sobol' indices (e.g., Sobol, 2001) are among the most widespread GSA metrics, quantifying the average reduction in the variance of a model output stemming from knowledge of an input. Amongst other techniques, Dell'Oca et al. (2017) proposed a moment-based GSA approach which enables one to quantify the influence of uncertain model parameters on the (statistical) moments of a target model output.

Here, we embed in these sensitivity indices the effect of uncertainties both in the system model conceptualization and in the ensuing model parameters. The study is grounded on the observation that physical processes and the natural systems within which they take place are complex, rendering target state variables amenable to multiple interpretations and mathematical descriptions. As such, predictions and uncertainty analyses based on a single model formulation can result in statistical bias and possible misrepresentation of the total uncertainty, thus justifying the assessment of multiple system conceptualizations. We then introduce copula-based sensitivity metrics which characterize, for each value the prescribed model output can take under a governing model, both the global (with respect to the input) value of the sensitivity and the degree of variability of the sensitivity across the whole range of input values. In this sense, the approach is global with respect to model input(s) and local with respect to model output, enabling one to discriminate the relevance of an input across the entire range of values of the modeling goal of interest. The methodology is demonstrated in the context of flow and reactive transport scenarios.

References

Sobol, I. M., 2001. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates. Math. Comput. Sim., 55, 271-280.

Dell'Oca, A., Riva, M., Guadagnini, A., 2017. Moment-based metrics for global sensitivity analysis of hydrological systems. Hydrol. Earth Syst. Sci., 21, 6219-6234.
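As an illustration of the moment-based viewpoint, the first metric of Dell'Oca et al. (2017) can be read as the average relative shift of the output mean when a single input is conditioned to a fixed value. A brute-force double-loop Monte Carlo sketch on the unit hypercube (the toy model and sample sizes are illustrative assumptions, not the authors' code):

```python
import numpy as np

def moment_based_sensitivity(f, n_params, n_cond=200, n_mc=2000, seed=0):
    """Average relative shift of the output mean when one input is fixed:
    AMA_i = E_xi[ |E[Y] - E[Y | X_i = xi]| ] / |E[Y]|, estimated by
    double-loop Monte Carlo on the unit hypercube."""
    rng = np.random.default_rng(seed)
    mu = f(rng.random((n_mc, n_params))).mean()      # unconditional mean
    ama = np.empty(n_params)
    for i in range(n_params):
        shifts = []
        for xi in rng.random(n_cond):                # conditioning values
            X = rng.random((n_mc, n_params))
            X[:, i] = xi                             # fix parameter i
            shifts.append(abs(mu - f(X).mean()))     # shift of the mean
        ama[i] = np.mean(shifts) / abs(mu)
    return ama
```

Analogous metrics can be built for higher moments (variance, skewness) by replacing the mean in the inner loop; dedicated estimators in the literature avoid the double loop, which is used here only for transparency.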


Agriculture ◽  
2019 ◽  
Vol 9 (2) ◽  
pp. 37 ◽  
Author(s):  
Xenia Specka ◽  
Claas Nendel ◽  
Ralf Wieland

Sensitivity analysis (SA) is often applied to evaluate the behavior of ecological models, in which integrated soil and crop processes vary over time. In this study, the time dependence of the parameter sensitivity of a process-based agro-ecosystem model was analyzed for various sites and model outputs. We applied the Morris screening and extended FAST methods, calculating daily sensitivity measures. By analyzing the daily elementary effects of the Morris method, we were able to identify more sensitive parameters than with the original approach. The temporal extension of the extended FAST method revealed changes in parameter sensitivity during the simulation time. In addition to the dynamic parameter sensitivity, we observed different relationships between parameter sensitivity and simulation time. The temporal SA performed in this study improves our understanding of the investigated model's behavior and demonstrates the importance of analyzing the sensitivity of ecological models over the entire simulation time.
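The idea of time-resolved sensitivity can be illustrated with a deliberately simple dynamic model: a one-at-a-time perturbation yields one sensitivity curve per parameter, so the ranking can be compared day by day. The logistic "crop" model, step size, and scaling below are illustrative assumptions, not the agro-ecosystem model of the study:

```python
import numpy as np

def daily_sensitivity(model, p0, rel_step=0.01):
    """Time-resolved one-at-a-time sensitivity: scaled finite-difference
    effect |dY(t)/dp_i| * p_i, one curve per parameter."""
    y0 = model(p0)                                   # nominal run, shape (n_days,)
    S = np.zeros((len(p0), y0.size))
    for i, p in enumerate(p0):
        pp = p0.copy()
        pp[i] = p * (1.0 + rel_step)                 # perturb one parameter
        S[i] = np.abs(model(pp) - y0) / (p * rel_step) * p
    return S

def crop(p, days=100):
    """Toy stand-in for a crop model: logistic biomass growth with
    rate r and capacity K (illustrative only)."""
    r, K = p
    t = np.arange(days)
    return K / (1.0 + (K - 1.0) * np.exp(-r * t))
```

In this toy case the growth rate dominates early in the season while the capacity parameter dominates near maturity, mirroring the kind of sensitivity crossover the daily measures are designed to reveal.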


2012 ◽  
Vol 44 (2) ◽  
pp. 334-350 ◽  
Author(s):  
Fiachra O'Loughlin ◽  
Michael Bruen ◽  
Thorsten Wagener

Although ongoing technological advances have alleviated data restrictions and most of the computational barriers to distributed modelling, lumped, parsimonious, conceptual rainfall-runoff models are still widely used for flood forecasting. However, both the optimum parameter values and the fluxes of water through individual model components change significantly with the time-step used. Such models should therefore be used with caution in applications, such as hydrograph separation or water quality studies, that require the fluxes through individual flow routes within the model or that try to relate parameters to physical features of the catchment. To demonstrate this time-scale limitation, a parameter sensitivity analysis was performed on the lumped conceptual Soil Moisture Accounting and Routing with Groundwater component (SMARG) model for a 182 km² rural catchment in Ireland for a number of time-steps, flow regimes and evaluation metrics. A global sensitivity analysis method (Higher Dimensional Model Representation) showed that sensitivity indices vary greatly with time-step and evaluation metric. The sensitivity of parameters also varied between flow regimes. Certain parameters' sensitivities remain fairly constant across both flow regimes and time-steps, while others are very much regime- or time-step-dependent. Because of this strong dependence on time-step, care should be taken in using internal information from conceptual models.


2019 ◽  
Vol 2019 ◽  
pp. 1-19 ◽  
Author(s):  
Xiao Wei ◽  
Haichao Chang ◽  
Baiwei Feng ◽  
Zuyuan Liu

To truly reflect ship performance under the influence of uncertainties, uncertainty-based design optimization (UDO) for ships, which fully considers various uncertainties in the early stage of design, has gradually received more and more attention. However, it also brings high-dimensionality problems, which may result in inefficient and impractical optimization. Sensitivity analysis (SA) is a feasible way to alleviate this problem: it can qualitatively or quantitatively evaluate the influence of model input uncertainty on the model output, so that uninfluential uncertain variables can be identified and removed to achieve dimension reduction. In this paper, polynomial chaos expansions (PCE), with their low computational cost, are chosen to obtain Sobol' global sensitivity indices directly from the polynomial coefficients; that is, once the polynomial of the output variable is established, computing the sensitivity indices is merely postprocessing of the polynomial coefficients. Moreover, to further reduce the cost of solving for the PCE coefficients, an improved probabilistic collocation method (IPCM), based on the linear-independence principle and the properties of orthogonal polynomials, is proposed to reduce the number of sample points. Finally, the proposed method is applied to the UDO of a bulk carrier preliminary design to ensure the robustness and reliability of the ship.
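The coefficient-postprocessing idea can be sketched for two uniform inputs with a tensor-product Legendre basis: once the expansion Y ≈ Σ c_ab Ψ_a(X1)Ψ_b(X2) is fitted, the total variance is the sum of squared coefficients (excluding the constant term) and each first-order Sobol' index is a partial sum. The least-squares fit below stands in for the paper's IPCM, and the degree and sample size are illustrative assumptions:

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_orthonormal(xi, deg):
    """Orthonormal Legendre polynomials under the uniform density on [-1, 1]."""
    return np.stack([np.sqrt(2 * k + 1) * legendre.legval(xi, [0] * k + [1])
                     for k in range(deg + 1)], axis=-1)

def pce_sobol(f, deg=3, n=2000, seed=0):
    """Fit a two-input tensor-product PCE by least squares, then read the
    first-order Sobol' indices directly off the squared coefficients."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, (n, 2))
    P1 = legendre_orthonormal(X[:, 0], deg)              # (n, deg + 1)
    P2 = legendre_orthonormal(X[:, 1], deg)
    A = np.einsum('na,nb->nab', P1, P2).reshape(n, -1)   # tensor basis
    c = np.linalg.lstsq(A, f(X), rcond=None)[0].reshape(deg + 1, deg + 1)
    var = (c ** 2).sum() - c[0, 0] ** 2                  # total variance
    S1 = (c[1:, 0] ** 2).sum() / var                     # terms in X1 only
    S2 = (c[0, 1:] ** 2).sum() / var                     # terms in X2 only
    return S1, S2
```

The sensitivity computation is indeed pure postprocessing here: no additional model runs are needed once the coefficients `c` are in hand, which is the efficiency argument made in the abstract.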


2020 ◽  
Author(s):  
Sabine M. Spiessl ◽  
Sergei Kucherenko

Probabilistic methods of higher-order sensitivity analysis provide a possibility for identifying parameter interactions by means of sensitivity indices. A better understanding of parameter interactions may help to quantify the uncertainties of repository models, which can behave in a highly nonlinear, non-monotonic or even discontinuous manner. Sensitivity indices can be estimated efficiently by the Random-Sampling High Dimensional Model Representation (RS-HDMR) metamodeling approach. This approach is based on truncating the ANOVA-HDMR expansion at second order, with the retained terms approximated by orthonormal polynomials. By design, the sensitivity index of total order (SIT) in this method is approximated as the sum of the index of first order (SI1) plus all corresponding indices of second order (SI2) for the considered parameter. RS-HDMR belongs to a wider class of methods known as polynomial chaos expansion (PCE). PCE methods are based on Wiener's homogeneous chaos theory, published in 1938, and are widely used in metamodeling. Usually only a few terms are relevant in the PCE structure; the Bayesian Sparse PCE method (BSPCE) exploits this sparsity. Using BSPCE, SI1 and SIT can be estimated. In this work we used the SobolGSA software [1], which contains both the RS-HDMR and BSPCE methods.

We have analysed the sensitivities of a model for a generic LILW repository in a salt mine using both the RS-HDMR and the BSPCE approach. The model includes a barrier in the near field that is chemically dissolved (corroded) over time by magnesium-containing brine, resulting in a sudden, significant change of the model behaviour and usually a rise of the radiation exposure. We investigated the model with two sets of input parameters: one with 6 parameters and one with 5 additional ones (the LILW6 and LILW11 models, respectively). For the time-dependent analysis, 31 time points were used.

The SI1 indices calculated with both approaches agree well with those obtained from the well-established and reliable first-order algorithm EASI [2] in most investigations. The SIT indices obtained from the BSPCE method seem to increase with the number of simulations used to build the metamodel. The SIT time curves obtained from the RS-HDMR approach with an optimal choice of the polynomial coefficients agree well with those from the BSPCE approach only for relatively low numbers of simulations. As, in contrast to RS-HDMR, the BSPCE approach takes account of all orders of interaction, this may hint at the existence of third- or higher-order effects.

Acknowledgements

The work was financed by the German Federal Ministry for Economic Affairs and Energy (BMWi). We would also like to thank Dirk-A. Becker for his constructive feedback.

References

[1] S. M. Spiessl, S. Kucherenko, D.-A. Becker, O. Zaccheus, Higher-order sensitivity analysis of a final repository model with discontinuous behaviour. Reliability Engineering and System Safety, doi: https://doi.org/10.1016/j.ress.2018.12.004, (2018).

[2] E. Plischke, An effective algorithm for computing global sensitivity indices (EASI). Reliability Engineering and System Safety, 95: 354–360, (2010).
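The gap between SIT and SI1 is precisely what signals interactions. A pick-freeze Monte Carlo sketch of both indices (Saltelli-type SI1, Jansen-type SIT; the toy model and sample size are illustrative, and this direct sampling route is an alternative to the RS-HDMR/BSPCE metamodels used in the study):

```python
import numpy as np

def sobol_indices(f, n_params, n=200_000, seed=0):
    """Pick-freeze Monte Carlo estimates of first-order (SI1, Saltelli 2010)
    and total-order (SIT, Jansen 1999) Sobol' indices on the unit hypercube."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, n_params))
    B = rng.random((n, n_params))
    yA, yB = f(A), f(B)
    var = np.concatenate([yA, yB]).var()
    si1 = np.empty(n_params)
    sit = np.empty(n_params)
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                          # swap in column i from B
        yABi = f(ABi)
        si1[i] = np.mean(yB * (yABi - yA)) / var     # first-order index
        sit[i] = 0.5 * np.mean((yA - yABi) ** 2) / var   # total-order index
    return si1, sit
```

For Y = X1 + X2 + 4·X1·X2 on the unit square, SI1 ≈ 0.466 and SIT ≈ 0.534 per input; the excess of SIT over SI1 is the second-order interaction share, which is exactly the quantity the SIT = SI1 + ΣSI2 truncation of RS-HDMR approximates.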


2014 ◽  
Vol 7 (3) ◽  
pp. 3867-3888 ◽  
Author(s):  
M. Liu ◽  
B. He ◽  
A. Lü ◽  
L. Zhou ◽  
J. Wu

Abstract. Parameter sensitivity analysis is a crucial step in effective model calibration. It quantitatively apportions the variation of model output to different sources of variation and identifies how "sensitive" a model is to changes in the values of its parameters. By calibrating the parameters to which model outputs are most sensitive, parameter estimation becomes more efficient. Due to uncertainties associated with yield estimates in a regional assessment, field-based models that perform well at the field scale are not accurate enough at the regional scale. Conducting parameter sensitivity analysis at the regional scale and analyzing the differences in parameter sensitivity between stations would make model calibration and validation in different sub-regions more efficient and would benefit application of the model at the regional scale. By simulating 2000 × 22 samples for 10 stations in the Huanghuaihai Plain, this study found that TB (optimal temperature), HI (normal harvest index), WA (potential radiation use efficiency), BN2 (normal fraction of N in crop biomass at mid-season) and RWPC1 (fraction of root weight at emergence) are more sensitive than the other parameters. Parameters that determine nutrient supply and LAI development have higher global sensitivity indices than first-order indices. For spatial application, soil diversity is crucial, because soil is responsible for the differences in crop parameter sensitivity indices between sites.


Geosciences ◽  
2019 ◽  
Vol 9 (5) ◽  
pp. 220
Author(s):  
Khalid Oubennaceur ◽  
Karem Chokmani ◽  
Miroslav Nastev ◽  
Yves Gauthier ◽  
Jimmy Poulin ◽  
...  

A new method for the sensitivity analysis of water depths is presented, based on a two-dimensional hydraulic model, as a convenient and cost-effective alternative to Monte Carlo simulations. The method involves perturbation of the probability distribution of the input variables. A relative sensitivity index is calculated for each variable using Gauss quadrature sampling, thus limiting the number of runs of the hydraulic model. The variable associated with the highest variation of the expected water depths is considered the most influential. The proposed method proved particularly efficient, requiring less information to describe model inputs and fewer model executions to calculate the sensitivity index. It was tested over a 45 km long reach of the Richelieu River, Canada. A 2D hydraulic model was used to solve the shallow water equations (SWE). Three input variables were considered: flow rate, Manning's coefficient, and the topography of a shoal within the considered reach. Four flow scenarios were simulated with discharge rates of 759, 824, 936, and 1113 m³/s. The results show that the predicted water depths were most sensitive to the topography of the shoal, whereas the sensitivity indices of Manning's coefficient and the flow rate were comparatively lower. These results support the construction of better hydraulic models that take the sensitivity of their inputs into account.
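The Gauss-quadrature sampling idea can be sketched as follows: each input is perturbed, one at a time, at the nodes of a Gauss-Hermite rule (assuming Gaussian input distributions), and the quadrature-weighted spread of the output around the nominal run is normalized into a relative index. The aggregation into a single index below is an illustrative assumption, not the paper's exact formulation:

```python
import numpy as np

def quadrature_sensitivity(model, means, stds, n_nodes=5):
    """Relative sensitivity index per input: perturb one Gaussian input at a
    time at Gauss-Hermite nodes and measure the quadrature-weighted spread
    of the output around the nominal run."""
    # probabilists' Gauss-Hermite rule: nodes/weights for the standard normal
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    weights = weights / weights.sum()                # probability weights
    y0 = model(np.asarray(means, dtype=float))       # nominal run
    effect = np.empty(len(means))
    for i in range(len(means)):
        x = np.asarray(means, dtype=float)
        vals = []
        for z in nodes:                              # n_nodes runs per input
            x[i] = means[i] + stds[i] * z
            vals.append(model(x.copy()))
        effect[i] = np.sqrt(np.sum(weights * (np.asarray(vals) - y0) ** 2))
    return effect / effect.sum()                     # normalize to an index
```

With a 5-node rule, only five model runs per input (plus one nominal run) are needed, which is the cost advantage over Monte Carlo that the abstract emphasizes.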

