SBpipe: a collection of pipelines for automating repetitive simulation and analysis tasks

2017 ◽  
Author(s):  
Piero Dalle Pezze ◽  
Nicolas Le Novère

Abstract
Background: The rapid growth of the number of mathematical models in Systems Biology has fostered the development of many tools to simulate and analyse them. The reliability and precision of these tasks often depend on multiple repetitions, and they can be optimised if executed as pipelines. In addition, new formal analyses can be performed on these repeat sequences, revealing important insights about the accuracy of model predictions.
Results: Here we introduce SBpipe, an open source software tool for automating repetitive tasks in model building and simulation. Using basic configuration files, SBpipe builds a sequence of repeated model simulations or parameter estimations, performs analyses from this generated sequence, and finally generates a LaTeX/PDF report. The parameter estimation pipeline offers analyses of parameter profile likelihood and parameter correlation using samples from the computed estimates. Specific pipelines for scanning one or two model parameters at the same time are also provided. Pipelines can run on multicore computers, Sun Grid Engine (SGE), or Load Sharing Facility (LSF) clusters, speeding up the processes of model building and simulation. SBpipe can execute models implemented in COPASI, in Python, or coded in any other programming language using Python as a wrapper module. Future support for other software simulators can be added dynamically without affecting the current implementation.
Conclusions: SBpipe allows users to automatically repeat the tasks of model simulation and parameter estimation, and to extract robustness information from these repeat sequences in a solid and consistent manner, facilitating model development and analysis. The source code and documentation of this project are freely available at https://pdp10.github.io/sbpipe/.
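The repeat-and-analyse pattern described above can be sketched in a few lines of Python. The `simulate` function below is a hypothetical stand-in for a single model run (in SBpipe the actual simulator, e.g. COPASI or a Python model, would be invoked there); it is not SBpipe's API, just an illustration of repeating simulations and summarising the resulting ensemble.

```python
import random
import statistics

def simulate(seed):
    """Hypothetical stand-in for one stochastic model run; in SBpipe
    the actual simulator (e.g. COPASI or a Python model) would be
    invoked here instead of this toy time course."""
    rng = random.Random(seed)
    return [rng.gauss(1.0, 0.1) for _ in range(10)]   # one time course

def repeat_simulations(n_runs):
    """Repeat the simulation n_runs times and summarise the ensemble,
    mirroring the repeat-and-analyse pattern that SBpipe automates."""
    runs = [simulate(seed) for seed in range(n_runs)]
    means = [statistics.mean(col) for col in zip(*runs)]    # per time point
    stdevs = [statistics.stdev(col) for col in zip(*runs)]
    return means, stdevs

means, stdevs = repeat_simulations(50)
```

In SBpipe itself the number of repeats, the simulator, and the report options are set in a configuration file rather than in code, and the summary statistics feed the generated LaTeX/PDF report.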

Author(s):  
Marvin Zaluski ◽  
Sylvain Létourneau ◽  
Jeff Bird ◽  
Chunsheng Yang

The CF-18 aircraft is a complex system for which a variety of data are systematically being recorded: operational flight data from sensors and Built-In Test Equipment (BITE), and maintenance activities recorded by personnel. These data resources are stored and used within the operating organization, but new analytical and statistical techniques and tools are being developed that could be applied to these data to benefit the organization. This paper investigates the utility of readily available CF-18 data to develop data mining-based models for prognostics and health management (PHM) systems. We introduce a generic data mining methodology developed to build prognostic models from operational and maintenance data, and elaborate on challenges specific to the use of CF-18 data from the Canadian Forces. We focus on a number of key data mining tasks, including data gathering, information fusion, data pre-processing, model building, and evaluation. The solutions developed to address these tasks are described. A software tool developed to automate the model development process is also presented. Finally, the paper discusses preliminary results on the creation of models to predict F404 No. 4 Bearing and MFC (Main Fuel Control) failures on the CF-18.


Author(s):  
Marvin Zaluski ◽  
Sylvain Létourneau ◽  
Jeff Bird ◽  
Chunsheng Yang

The CF-18 (CF denotes Canadian Forces) aircraft is a complex system for which a variety of data are systematically being recorded: flight data from sensors, built-in test equipment data, and maintenance data. Without proper analytical and statistical tools, these data resources are of limited use to the operating organization. Focusing on data mining-based modeling, this paper investigates the use of readily available CF-18 data to support the development of prognostics and health management systems. A generic data mining methodology has been developed to build prognostic models from operational and maintenance data. This paper introduces the methodology and elaborates on challenges specific to the use of CF-18 data from the Canadian Forces. A number of key data mining tasks are examined including data gathering, information fusion, data preprocessing, model building, and model evaluation. The solutions developed to address these tasks are described. A software tool developed to automate the model development process is also presented. Finally, this paper discusses preliminary results on the creation of models to predict F404 no. 4 bearing and main fuel control failures on the CF-18.
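The model-building step of such a data-mining methodology can be illustrated with a small, self-contained sketch: extract window features from sensor time series and fit a classifier that flags an upcoming failure. Everything here (the vibration feature, the labels, the logistic model trained by gradient descent) is an illustrative assumption, not the CF-18 pipeline or its data.

```python
import math
import random

rng = random.Random(7)

# Synthetic flights: a vibration-like feature drifts upward before a
# component failure (purely illustrative, not CF-18 data)
def make_flight(failing):
    base = 1.8 if failing else 1.0
    return [base + rng.gauss(0, 0.2) for _ in range(20)]

flights = [(make_flight(f), 1 if f else 0)
           for f in [True] * 50 + [False] * 50]

# Feature extraction over the window: mean and max
def features(window):
    return [sum(window) / len(window), max(window)]

X = [features(w) for w, _ in flights]
y = [label for _, label in flights]

# Logistic regression fitted by plain stochastic gradient descent
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(500):
    for xi, yi in zip(X, y):
        p = 1 / (1 + math.exp(-(w[0] * xi[0] + w[1] * xi[1] + b)))
        g = p - yi                      # gradient of the log-loss
        w[0] -= lr * g * xi[0]
        w[1] -= lr * g * xi[1]
        b -= lr * g

correct = sum((1 / (1 + math.exp(-(w[0] * a + w[1] * m + b))) > 0.5) == bool(yi)
              for (a, m), yi in zip(X, y))
accuracy = correct / len(y)
```

In a real PHM setting the evaluation step would of course use held-out flights and cost-sensitive metrics rather than training-set accuracy.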


PLoS ONE ◽  
2021 ◽  
Vol 16 (8) ◽  
pp. e0256227
Author(s):  
Rajnesh Lal ◽  
Weidong Huang ◽  
Zhenquan Li

Since the novel coronavirus (COVID-19) outbreak in China, and owing to the open accessibility of COVID-19 data, several researchers and modellers have revisited the classical epidemiological models to evaluate their practical applicability. While mathematical compartmental models can predict the dynamics of various contagious viruses, their efficiency depends on the model parameters. Recently, several parameter estimation methods have been proposed for different models. In this study, we evaluated the performance of the Ensemble Kalman filter (EnKF) in the estimation of time-varying model parameters with synthetic data and the real COVID-19 data of Hubei province, China. Contrary to previous works, the current study examines the effect of damping factors on an augmented EnKF. An augmented EnKF algorithm is provided, and we show how the filter performs in estimating models using uncertain observational (reported) data. The results confirm that the augmented-EnKF approach can provide reliable model parameter estimates. Additionally, the good fit between the model simulation and the reported COVID-19 data confirms the possibility of using the augmented-EnKF approach for reliable model parameter estimation.
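The core idea of an augmented EnKF, appending the unknown parameter to the state vector so the filter estimates both jointly, can be sketched on a toy logistic-growth model. This is a minimal stochastic (perturbed-observation) EnKF under illustrative assumptions; it is not the paper's epidemiological model or its damping-factor variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy truth: logistic growth governed by one parameter a (illustrative)
a_true, capacity, x0 = 0.3, 100.0, 10.0
n_steps, obs_sd = 40, 2.0

def step(x, a):
    return x + a * x * (1 - x / capacity)

truth = [x0]
for _ in range(n_steps):
    truth.append(step(truth[-1], a_true))
obs = np.array(truth[1:]) + rng.normal(0, obs_sd, n_steps)

# Augmented ensemble: each member carries (state x, parameter a)
n_ens = 200
ens = np.column_stack([
    np.full(n_ens, x0),
    rng.uniform(0.1, 0.9, n_ens),      # vague prior on the parameter
])

for k in range(n_steps):
    # Forecast: propagate states; the parameter evolves as a random walk
    ens[:, 0] = step(ens[:, 0], ens[:, 1])
    ens[:, 1] += rng.normal(0, 0.005, n_ens)
    # Analysis with perturbed observations; only x is observed
    y = ens[:, 0]
    var_y = y.var(ddof=1) + obs_sd ** 2
    gain = np.array([np.cov(ens[:, i], y)[0, 1] for i in range(2)]) / var_y
    innov = obs[k] + rng.normal(0, obs_sd, n_ens) - y
    ens += np.outer(innov, gain)

a_est = ens[:, 1].mean()
```

The random-walk step on the parameter is what lets the augmented filter track time-varying parameters; a damping factor, as studied in the paper, would further inflate or relax the ensemble spread between updates.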


2019 ◽  
Vol 15 (2) ◽  
pp. 277
Author(s):  
Sudiono Sudiono ◽  
S. H. Sutjahyo ◽  
P. Hidayat ◽  
R. Kurniawan

The purpose of this study was to develop a dynamic model of sustainable farming based on an integrated pest management system in upland vegetable crops in Tanggamus Regency, Lampung Province. Dynamic system analysis was conducted with Powersim, following the stages of model development: needs analysis, problem formulation, system identification, model simulation, and model testing. The dynamic model simulations gave the following results. Under the current scenario (without intervention), farmer households numbered 104,929 in 2017 and increase to 128,613 by 2030, with farmers' income at the end of the simulation period reaching Rp 434,526,807 from a land area of 4,029 ha. Under the pessimistic scenario, farmer households numbered 100,753 in 2017 and increase to 116,252 by 2030, with income of Rp 470,170,405 from a land area of 4,243 ha. Under the optimistic scenario, farmer households numbered 100,111 in 2017 and increase to 107,892 by 2030, with total farmers' income of Rp 508,916,172 from an area of 4,464 ha.
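A stock-flow model of this kind can be miniaturised to a single stock (farmer households) growing at a scenario-dependent net rate. In the sketch below the annual rates are back-solved from the 2017 and 2030 household numbers reported above; everything else (exponential growth, the 13-year horizon handling) is an illustrative simplification of the Powersim model.

```python
# Minimal stock-flow sketch in the spirit of a system-dynamics model:
# one stock (households) compounding at an implied annual net rate.

def simulate(h_start, h_end, years=13):
    """Back-solve the annual growth rate from the two endpoints,
    then run the stock forward year by year."""
    rate = (h_end / h_start) ** (1 / years) - 1
    h, series = h_start, [h_start]
    for _ in range(years):
        h *= 1 + rate
        series.append(h)
    return rate, series

# 2017 and 2030 household counts from the study's three scenarios
scenarios = {
    "current":     (104_929, 128_613),
    "pessimistic": (100_753, 116_252),
    "optimistic":  (100_111, 107_892),
}

results = {}
for name, (h2017, h2030) in scenarios.items():
    rate, series = simulate(h2017, h2030)
    results[name] = (rate, series[-1])
```

The implied rates make the scenario logic visible: the optimistic (intervention) scenario corresponds to the slowest household growth, consistent with its higher per-household income on a larger cultivated area.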


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 387
Author(s):  
Yiting Liang ◽  
Yuanhua Zhang ◽  
Yonggang Li

A mechanistic kinetic model of cobalt–hydrogen electrochemical competition for the cobalt removal process in zinc hydrometallurgy is proposed. In addition, to overcome the parameter estimation difficulties arising from the model nonlinearities and the lack of information on the possible value ranges of the parameters to be estimated, a constrained guided parameter estimation scheme was derived based on the model equations and experimental data. The proposed model and parameter estimation scheme have three advantages: (i) the model reflects for the first time the mechanism of the electrochemical competition between cobalt and hydrogen ions in the cobalt removal process in zinc hydrometallurgy; (ii) the proposed constrained parameter estimation scheme does not depend on information about the possible value ranges of the parameters to be estimated; (iii) the constraint conditions provided in the scheme directly link the experimental phenomenon metrics to the model parameters, thereby providing deeper insights into the model parameters for model users. Numerical experiments showed that the proposed constrained parameter estimation algorithm significantly improved the estimation efficiency. Meanwhile, the proposed cobalt–hydrogen electrochemical competition model allowed accurate simulation of the impact of hydrogen ions on the cobalt removal rate, as well as simulation of the trend of hydrogen ion concentration, which would be helpful for the actual cobalt removal process in zinc hydrometallurgy.
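The idea of letting an experimental metric constrain the parameter search, rather than guessing value ranges, can be illustrated on a toy first-order removal model. Here an observed half-life window is converted into a box on the rate constant, and the least-squares fit is confined to that box; the kinetics, numbers, and dense-scan search are all illustrative assumptions, not the paper's electrochemical model or its algorithm.

```python
import math

# Synthetic concentration data from first-order removal,
# c(t) = c0 * exp(-k t); k_true is chosen for illustration only.
k_true, c0 = 0.03, 100.0
times = list(range(0, 121, 10))          # minutes
data = [c0 * math.exp(-k_true * t) for t in times]

# Constraint derived from an experimental metric: the observed
# half-life lies between 20 and 30 min, which boxes k directly.
k_lo, k_hi = math.log(2) / 30, math.log(2) / 20

def sse(k):
    """Sum of squared errors between model and data for a given k."""
    return sum((c0 * math.exp(-k * t) - d) ** 2 for t, d in zip(times, data))

# Dense scan over the constrained interval (a stand-in for the
# guided estimation scheme; any bounded optimiser would do here)
candidates = [k_lo + i * (k_hi - k_lo) / 1000 for i in range(1001)]
k_est = min(candidates, key=sse)
```

The point of the construction is that the search never leaves the physically admissible interval, so no prior knowledge of k's possible range is needed beyond the measured half-life.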


2017 ◽  
Vol 65 (4) ◽  
pp. 479-488 ◽  
Author(s):  
A. Boboń ◽  
A. Nocoń ◽  
S. Paszek ◽  
P. Pruski

Abstract
The paper presents a method for determining the electromagnetic parameters of different synchronous generator models based on dynamic waveforms measured at power rejection. Such a test can be performed safely under normal operating conditions of a generator working in a power plant. The investigated models comprised a generator model expressed by the reactances and time constants of the steady, transient, and subtransient states in the d and q axes, as well as circuit models (types (3,3) and (2,2)) expressed by the resistances and inductances of the stator, excitation, and equivalent rotor damping circuit windings. All these models approximately take into account the influence of magnetic core saturation. The least squares method was used for parameter estimation: the objective function, defined as the mean square error between the measured waveforms and the waveforms calculated from the mathematical models, was minimized. A method of determining the initial values of those state variables which also depend on the searched parameters is presented. To minimize the objective function, a gradient optimization algorithm finding local minima for a selected starting point was used. To get closer to the global minimum, the calculations were repeated many times, taking into account the inequality constraints for the searched parameters. The paper presents the parameter estimation results and a comparison of the waveforms measured and calculated based on the final parameters for 200 MW and 50 MW turbogenerators.
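The multi-start, box-constrained waveform-fitting scheme can be sketched on a toy problem: fit the damping and frequency of a damped oscillation (standing in for a post-rejection transient) by minimising the mean square error between "measured" and simulated waveforms. The model form, the stochastic local search used in place of a gradient optimiser, and all constants are illustrative assumptions.

```python
import math
import random

# "Measured" waveform: a damped oscillation with two parameters
# (damping alpha, angular frequency omega), illustrative only.
alpha_true, omega_true = 0.5, 3.0
ts = [i * 0.05 for i in range(100)]
measured = [math.exp(-alpha_true * t) * math.cos(omega_true * t) for t in ts]

def mse(p):
    """Mean square error between model and measured waveforms."""
    a, w = p
    return sum((math.exp(-a * t) * math.cos(w * t) - m) ** 2
               for t, m in zip(ts, measured)) / len(ts)

def local_search(p, bounds, iters=400, seed=0):
    """Crude stochastic descent standing in for the gradient optimiser;
    the box bounds play the role of the inequality constraints."""
    rng = random.Random(seed)
    best, best_f = list(p), mse(p)
    for i in range(iters):
        s = 0.2 * (1 - i / iters)                 # shrinking step size
        cand = [min(max(x + rng.uniform(-s, s), lo), hi)
                for x, (lo, hi) in zip(best, bounds)]
        f = mse(cand)
        if f < best_f:
            best, best_f = cand, f
    return best, best_f

# Repeat from many starting points to get closer to the global minimum
bounds = [(0.0, 2.0), (0.5, 10.0)]
starts = [[a, w] for a in (0.25, 1.0, 1.75) for w in range(1, 10)]
results = [local_search(s, bounds, seed=k) for k, s in enumerate(starts)]
(best_a, best_w), best_f = min(results, key=lambda r: r[1])
```

The multi-start loop matters because the objective is multimodal in the frequency parameter: a single descent from a poor starting point settles in the wrong oscillation basin.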


2015 ◽  
Vol 2015 ◽  
pp. 1-10 ◽  
Author(s):  
Xiao Zhang ◽  
Hongduo Zhao

The objective of this paper is to investigate the characterization of moisture diffusion inside early-age concrete slabs subjected to curing. Time-dependent relative humidity (RH) distributions of three mixture proportions subjected to three different curing methods (i.e., air curing, water curing, and membrane-forming compound curing) and a sealed condition were measured for 28 days. A one-dimensional nonlinear moisture diffusion partial differential equation (PDE) based on Fick's second law, which incorporates the effect of curing in the Dirichlet boundary condition using a curing factor, is developed to simulate the diffusion process. Model parameters are calibrated by a genetic algorithm (GA). Experimental results show that the rate of RH reduction inside concrete under air curing is greater than the rates under membrane-forming compound curing and water curing. The effect of the water-to-cement (w/c) ratio on self-desiccation is significant: a lower w/c ratio tends to result in a larger RH reduction. RH reduction considering the combined effects of diffusion and self-desiccation in early-age concrete is not sensitive to the w/c ratio, but is sensitive to the curing method. Comparison between model simulations and experimental results indicates that the improved model is able to reflect the effect of curing on moisture diffusion in early-age concrete slabs.
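A nonlinear Fick-type diffusion PDE with a curing-factor boundary condition can be sketched with an explicit finite-difference scheme. The diffusivity law D(H), the curing factor value, the ambient RH, and all grid constants below are illustrative assumptions, not the paper's calibrated model (which fits its parameters by GA).

```python
# Explicit finite-difference sketch of 1-D nonlinear moisture diffusion,
# dH/dt = d/dx( D(H) dH/dx ), with the surface Dirichlet value blended
# by a "curing factor" between ambient RH and the subsurface RH.

n, dx, dt = 21, 0.005, 50.0          # 10 cm slab, 21 nodes, 50 s steps
# Stability check: dt * D_max / dx^2 = 50 * 1e-9 / 2.5e-5 = 0.002 << 0.5

def D(H):                             # humidity-dependent diffusivity (toy)
    return 1e-9 * (0.2 + 0.8 * H ** 4)

H = [1.0] * n                         # initially saturated (RH = 1.0)
curing_factor = 0.6                   # 0 = sealed surface, 1 = fully exposed
H_ambient = 0.5

for _ in range(5000):                 # ~2.9 days of drying
    # Surface node: Dirichlet value interpolated by the curing factor
    H[0] = curing_factor * H_ambient + (1 - curing_factor) * H[1]
    H[-1] = H[-2]                     # sealed bottom: zero flux
    new = H[:]
    for i in range(1, n - 1):
        Dp = 0.5 * (D(H[i]) + D(H[i + 1]))    # face-averaged diffusivities
        Dm = 0.5 * (D(H[i]) + D(H[i - 1]))
        new[i] = H[i] + dt / dx ** 2 * (
            Dp * (H[i + 1] - H[i]) - Dm * (H[i] - H[i - 1]))
    H = new
```

Setting the curing factor to 0 recovers the sealed condition (only self-desiccation would then lower RH, which this sketch omits), while 1 imposes the ambient RH directly at the surface.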


Transport ◽  
2009 ◽  
Vol 24 (2) ◽  
pp. 135-142 ◽  
Author(s):  
Ali Payıdar Akgüngör ◽  
Erdem Doğan

This study proposes an Artificial Neural Network (ANN) model and a Genetic Algorithm (GA) model to estimate the number of accidents (A), fatalities (F) and injuries (I) in Ankara, Turkey, utilizing data obtained between 1986 and 2005. For model development, the number of vehicles (N), fatalities, injuries, accidents and population (P) were selected as model parameters. In the ANN model, the sigmoid and linear functions were used as activation functions with the feed-forward back-propagation algorithm. In the GA approach, two forms of genetic algorithm models, a linear and an exponential form of mathematical expression, were developed. The results of the GA model showed that the exponential form was suitable for estimating the number of accidents and fatalities, while the linear form was the most appropriate for predicting the number of injuries. The best-fit model with the lowest mean absolute error (MAE) between the observed and estimated values was selected for future estimations. The comparison of the model results indicated that the performance of the ANN model was better than that of the GA model. To investigate the performance of the ANN model for future estimations, a fifteen-year period from 2006 to 2020 with two possible scenarios was employed. In the first scenario, the annual average growth rates of the population and the number of vehicles are assumed to be 2.0% and 7.5%, respectively. In the second scenario, the average number of vehicles per capita is assumed to reach 0.60, which represents approximately a two-and-a-half-fold increase in fifteen years. The results obtained from both scenarios reveal the suitability of the current methods for road safety applications.
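Fitting an exponential model form by a genetic algorithm, as in the GA approach above, can be sketched end to end on synthetic data. The single-predictor model y = a·exp(b·x), the evolution-style operators, and all constants are illustrative assumptions, not the paper's exact expressions or data.

```python
import math
import random

rng = random.Random(42)

# Synthetic counts following an exponential trend in one predictor
# (a toy stand-in for accidents vs. number of vehicles)
a_true, b_true = 50.0, 0.3
xs = [i * 0.5 for i in range(20)]
ys = [a_true * math.exp(b_true * x) for x in xs]

def mae(p):
    """Mean absolute error of the exponential model y = a*exp(b*x)."""
    a, b = p
    return sum(abs(a * math.exp(b * x) - y) for x, y in zip(xs, ys)) / len(xs)

def ga(pop_size=60, gens=150):
    """Tiny GA: truncation selection, averaging crossover, Gaussian
    mutation -- enough to fit the two coefficients."""
    pop = [[rng.uniform(1, 100), rng.uniform(0, 1)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=mae)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = [(u + v) / 2 for u, v in zip(p1, p2)]   # crossover
            child[0] += rng.gauss(0, 2.0)                   # mutation
            child[1] += rng.gauss(0, 0.02)
            children.append(child)
        pop = survivors + children
    return min(pop, key=mae)

a_est, b_est = ga()
```

Using MAE as the fitness criterion here mirrors the paper's model-selection metric; a linear model form would drop the `exp` and evolve slope and intercept instead.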


2021 ◽  
Author(s):  
Christian Zeman ◽  
Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are constantly subject to change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about the Earth's atmosphere. Many of the changes in today's models relate to seemingly unsuspicious modifications, associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet verifying them is challenged by the chaotic nature of the atmosphere: any small change, even a rounding error, can have a big impact on individual simulations. Overall, this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess the effects of model changes on almost any output variable over time, and can be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology also shows a large sensitivity to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.
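The ensemble-plus-hypothesis-test idea can be illustrated with a two-sample Kolmogorov–Smirnov test, one of several tests such a methodology could use (the abstract does not fix a particular one). The ensembles below are synthetic Gaussians standing in for one output variable of a reference model and of a changed model; they are not COSMO output.

```python
import math
import random

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: maximum distance
    between the empirical CDFs of the two ensembles."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for v in sorted(set(a) | set(b)):
        fa = sum(x <= v for x in a) / len(a)
        fb = sum(x <= v for x in b) / len(b)
        d = max(d, abs(fa - fb))
    return d

def ks_reject(a, b, alpha=0.05):
    """Reject 'same distribution' at level alpha, using the standard
    large-sample approximation for the critical value."""
    n, m = len(a), len(b)
    c = math.sqrt(-0.5 * math.log(alpha / 2))
    return ks_statistic(a, b) > c * math.sqrt((n + m) / (n * m))

rng = random.Random(0)
# Reference ensemble vs. a bitwise-different but statistically
# equivalent one (e.g. after a compiler or hardware upgrade)
ref = [rng.gauss(280.0, 1.5) for _ in range(100)]
equiv = [rng.gauss(280.0, 1.5) for _ in range(100)]
# Ensemble from a genuinely changed model (shifted mean)
changed = [rng.gauss(281.5, 1.5) for _ in range(100)]

print(ks_reject(ref, equiv), ks_reject(ref, changed))
```

In practice the test would be applied per output variable and per lead time, with the rejection rate across the ensemble comparison serving as the verification signal.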

