Parameter Estimation of a Delay Time Model of Wearing Parts Based on Objective Data

2015 ◽  
Vol 2015 ◽  
pp. 1-8 ◽  
Author(s):  
Y. Tang ◽  
J. J. Jing ◽  
Y. Yang ◽  
C. Xie

The wearing parts of a system have a very high failure frequency, making it necessary to carry out continual functional inspections and maintenance to protect the system from unscheduled downtime. This allows a large amount of maintenance data to be collected. Taking the unique characteristics of the wearing parts into consideration, we establish their respective delay time models for ideal and nonideal inspection cases. The model parameters are estimated entirely from the collected maintenance data. A likelihood function of all renewal events is then derived from their occurrence probability functions, and the model parameters are calculated with the maximum likelihood method, which is solved by the CRM. Finally, using two wearing parts from the oil and gas drilling industry as examples, the filter element and the blowout preventer rubber core, the parameters of the distribution functions of the initial failure time and the delay time are estimated for each example, and their distribution functions are obtained. Such parameter estimation based on objective data will contribute to the optimization of a reasonable functional inspection interval and will also provide theoretical models to support the integrity management of equipment and systems.
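The abstract does not state which distribution families the paper fits to the initial failure time and the delay time. As an illustration only, the sketch below assumes both stages are exponential, in which case the maximum-likelihood rate estimate has a closed form (n divided by the sum of the observations). The `records` data are synthetic stand-ins for maintenance records, not the paper's data.

```python
import math
import random

def mle_exp_rate(samples):
    # Closed-form maximum-likelihood estimate of an exponential
    # rate parameter: lambda_hat = n / sum(x_i).
    return len(samples) / sum(samples)

def exp_loglik(rate, samples):
    # Log-likelihood of i.i.d. exponential observations.
    return len(samples) * math.log(rate) - rate * sum(samples)

random.seed(1)
# Synthetic "maintenance records": u is the time to the initial
# defect, h is the delay time from defect to functional failure.
records = [(random.expovariate(0.8), random.expovariate(2.0))
           for _ in range(4000)]
rate_u = mle_exp_rate([u for u, _ in records])
rate_h = mle_exp_rate([h for _, h in records])
```

With both stages estimated, the part's failure-time distribution is the convolution of the two fitted laws, which is the ingredient an inspection-interval optimization would consume.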

Biometrika ◽  
2019 ◽  
Author(s):  
S Yang ◽  
K Pieper ◽  
F Cools

Summary Structural failure time models are causal models for estimating the effect of time-varying treatments on a survival outcome. G-estimation and artificial censoring have been proposed for estimating the model parameters in the presence of time-dependent confounding and administrative censoring. However, most existing methods require manually pre-processing the data onto a regularly spaced grid, which may invalidate the subsequent causal analysis. Moreover, computation and inference are challenging due to the nonsmoothness of artificial censoring. We propose a class of continuous-time structural failure time models that respects the continuous-time nature of the underlying data processes. Under a martingale condition of no unmeasured confounding, we show that the model parameters are identifiable from a potentially infinite number of estimating equations. Using semiparametric efficiency theory, we derive the first semiparametric doubly robust estimators, which are consistent if the model for the treatment process or the failure time model, but not necessarily both, is correctly specified. Moreover, we propose using inverse probability of censoring weighting to deal with dependent censoring. In contrast to artificial censoring, our weighting strategy does not introduce nonsmoothness in estimation and ensures that resampling methods can be used for inference.
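The paper's doubly robust estimators are beyond an abstract-sized sketch, but the inverse-probability-of-censoring-weighting idea it invokes can be shown in miniature: estimate the censoring survival function S_C by Kaplan-Meier, treating censoring itself as the event, and up-weight each uncensored failure time by 1/S_C(t-). This is a simplified illustration of IPCW, not the authors' estimator.

```python
def censoring_survival(times, events):
    # Kaplan-Meier estimate of the censoring survival function S_C,
    # treating censoring (event == 0) as the event of interest.
    data = sorted(zip(times, events))
    def s_before(t):
        # S_C evaluated just before time t
        at_risk, s = len(data), 1.0
        for ti, ei in data:
            if ti >= t:
                break
            if ei == 0:            # a censoring "event"
                s *= (at_risk - 1) / at_risk
            at_risk -= 1
        return s
    return s_before

def ipcw_mean(times, events):
    # IPCW estimate of the mean failure time: each observed failure
    # is weighted by the inverse probability of being uncensored.
    s_before = censoring_survival(times, events)
    n = len(times)
    return sum(t / s_before(t)
               for t, e in zip(times, events) if e == 1) / n
```

With no censoring every weight is 1 and the estimator reduces to the ordinary sample mean, which makes the weighting easy to sanity-check.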


Biostatistics ◽  
2019 ◽  
Author(s):  
Aaron J Molstad ◽  
Li Hsu ◽  
Wei Sun

Summary Predicting the survival time of a cancer patient based on his/her genome-wide gene expression remains a challenging problem. For certain types of cancer, the effects of gene expression on survival are both weak and abundant, so identifying non-zero effects with reasonable accuracy is difficult. As an alternative to methods that use variable selection, we propose a Gaussian process accelerated failure time model to predict survival time using genome-wide or pathway-wide gene expression data. Using a Monte Carlo expectation–maximization algorithm, we jointly impute censored log-survival time and estimate model parameters. We demonstrate the performance of our method and its advantage over existing methods in both simulations and real data analysis. The real data that we analyze were collected from 513 patients with kidney renal clear cell carcinoma and include survival time, demographic/clinical variables, and expression of more than 20 000 genes. In addition to the right-censored survival time, our method can also accommodate left-censored or interval-censored outcomes; and it provides a natural way to combine multiple types of high-dimensional -omics data. An R package implementing our method is available in the Supplementary material available at Biostatistics online.
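A toy version of the censored-data EM idea underlying the paper: for a normal model of log-survival time with known scale, the conditional expectation of a right-censored observation is available in closed form (the truncated-normal mean), so the sketch below uses an analytic E-step as a deterministic stand-in for the paper's Monte Carlo E-step, and omits the Gaussian-process covariate model entirely.

```python
import math

def normal_pdf(x):
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def em_mean(observed, censored_at, sigma=1.0, iters=50):
    # EM for the mean of normal log-survival times under right
    # censoring: the E-step replaces each censored value by the
    # truncated-normal mean E[Z | Z > c], the M-step re-averages.
    vals = observed + censored_at
    mu = sum(vals) / len(vals)
    for _ in range(iters):
        imputed = []
        for c in censored_at:
            a = (c - mu) / sigma
            lam = normal_pdf(a) / max(1.0 - normal_cdf(a), 1e-12)
            imputed.append(mu + sigma * lam)   # E[Z | Z > c]
        mu = (sum(observed) + sum(imputed)) / len(vals)
    return mu
```

Because imputed values always exceed their censoring thresholds, the EM estimate sits above the naive mean that treats censoring times as exact deaths, which is the bias the imputation corrects.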


2020 ◽  
Vol 499 (4) ◽  
pp. 5257-5268 ◽  
Author(s):  
Marcos Pellejero-Ibañez ◽  
Raul E Angulo ◽  
Giovanni Aricó ◽  
Matteo Zennaro ◽  
Sergio Contreras ◽  
...  

ABSTRACT The interpretation of cosmological observables requires the use of increasingly sophisticated theoretical models. Since these models are becoming computationally very expensive and display non-trivial uncertainties, the use of standard Bayesian algorithms for cosmological inference, such as Markov chain Monte Carlo (MCMC), might become inadequate. Here, we propose a new approach to parameter estimation based on an iterative Gaussian emulation of the target likelihood function. This requires a minimal number of likelihood evaluations and naturally accommodates stochasticity in theoretical models. We apply the algorithm to estimate 9 parameters from the monopole and quadrupole of a mock power spectrum in redshift space. We obtain accurate posterior distribution functions with approximately 100 times fewer likelihood evaluations than an affine-invariant MCMC, roughly independent of the dimensionality of the problem. We anticipate that our parameter estimation algorithm will accelerate the adoption of more accurate theoretical models in data analysis, enabling more comprehensive exploitation of cosmological observables.
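A one-dimensional caricature of iterative likelihood emulation: fit a quadratic (a Gaussian, in likelihood space) through a handful of log-likelihood evaluations, evaluate the true log-likelihood once at the fitted peak, and repeat. The target functions below are made up, and this scheme is far cruder than a Gaussian-process emulator, but it shows how few expensive evaluations an emulator-driven search can need.

```python
def fit_quadratic(pts):
    # Exact quadratic a*x^2 + b*x + c through three (x, f) points
    # via Newton divided differences; returns (a, b).
    (x1, f1), (x2, f2), (x3, f3) = pts
    d1 = (f2 - f1) / (x2 - x1)
    d2 = (f3 - f2) / (x3 - x2)
    a = (d2 - d1) / (x3 - x1)
    b = d1 - a * (x1 + x2)
    return a, b

def emulate_peak(loglik, xs, iters=10):
    # Iteratively refine the quadratic emulator, evaluating the
    # true log-likelihood only at each fitted peak.
    pts = sorted((x, loglik(x)) for x in xs)
    for _ in range(iters):
        a, b = fit_quadratic(pts)
        if a >= 0:                       # no interior maximum
            break
        x_new = -b / (2.0 * a)
        if any(abs(x_new - x) < 1e-9 for x, _ in pts):
            break                        # converged
        pts.append((x_new, loglik(x_new)))
        # keep the three best points, ordered by x
        pts = sorted(sorted(pts, key=lambda p: -p[1])[:3])
    return max(pts, key=lambda p: p[1])[0]
```

For an exactly Gaussian likelihood the fitted peak is exact after one extra evaluation; for mild non-Gaussianity the loop converges in a handful of evaluations, versus hundreds of samples for an MCMC on the same problem.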


2021 ◽  
pp. 1-8
Author(s):  
Dali Chen ◽  
Xianglai Chen ◽  
Jingjing Wang ◽  
Zuxin Zhang ◽  
Yan Wang ◽  
...  

Abstract Thermal time models have been widely applied to predict temperature requirements for seed germination. Generally, a log-normal distribution for thermal time [θT(g)] is used in such models at suboptimal temperatures to examine the variation in time to germination arising from variation in θT(g) within a seed population. Recently, additional distribution functions have been used in thermal time models to predict seed germination dynamics. However, the most suitable distribution function to use in thermal time models, especially at suboptimal temperatures, has not been determined. Five distributions (log-normal, Gumbel, logistic, Weibull and log-logistic) were used in thermal time models over a range of temperatures to fit the germination data for 15 species. The results showed that a more flexible model with the log-logistic distribution, rather than the log-normal distribution, provided the best explanation of θT(g) variation in 13 species at suboptimal temperatures. Thus, at least at suboptimal temperatures, the log-logistic distribution is an appropriate candidate among the five distributions used in this study. Therefore, the distribution of θT(g) should be considered when using thermal time models to prevent large deviations; furthermore, an appropriate equation should be selected before using such a model to make predictions.
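As an illustration of comparing candidate distributions for germination times, the sketch below scores log-normal and log-logistic fits by log-likelihood. The log-normal fit is the closed-form MLE; the log-logistic fit is a simple median/moment approximation rather than a full MLE, and the sample is synthetic, so this is only the shape of such a comparison, not the paper's procedure.

```python
import math
import random

def fit_lognormal(ts):
    # Closed-form MLE: mean and (population) s.d. of the log times.
    logs = [math.log(t) for t in ts]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return mu, math.sqrt(var)

def lognormal_loglik(ts, mu, s):
    return sum(-math.log(t * s * math.sqrt(2.0 * math.pi))
               - (math.log(t) - mu) ** 2 / (2.0 * s * s) for t in ts)

def fit_loglogistic(ts):
    # Approximation: log T ~ logistic(mu, s), mu from the sample
    # median, s matched to the log variance (var = pi^2 s^2 / 3).
    logs = sorted(math.log(t) for t in ts)
    mu = logs[len(logs) // 2]
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return mu, math.sqrt(3.0 * var) / math.pi

def loglogistic_loglik(ts, mu, s):
    # Density of T when log T is logistic(mu, s).
    total = 0.0
    for t in ts:
        z = (math.log(t) - mu) / s
        total += -z - 2.0 * math.log(1.0 + math.exp(-z)) \
                 - math.log(s) - math.log(t)
    return total

random.seed(7)
# Synthetic germination times, log-normally distributed.
sample = [math.exp(random.gauss(1.0, 0.5)) for _ in range(2000)]
```

Comparing the two log-likelihood values on real germination data, temperature by temperature, is the model-selection step the study performs across its five candidate distributions.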


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 387
Author(s):  
Yiting Liang ◽  
Yuanhua Zhang ◽  
Yonggang Li

A mechanistic kinetic model of cobalt–hydrogen electrochemical competition for the cobalt removal process in zinc hydrometallurgy was proposed. In addition, to overcome the parameter estimation difficulties arising from the model nonlinearities and the lack of information on the possible value ranges of the parameters to be estimated, a constrained guided parameter estimation scheme was derived based on the model equations and experimental data. The proposed model and parameter estimation scheme have three advantages: (i) the model reflects for the first time the mechanism of the electrochemical competition between cobalt and hydrogen ions during cobalt removal in zinc hydrometallurgy; (ii) the proposed constrained parameter estimation scheme does not depend on information about the possible value ranges of the parameters to be estimated; (iii) the constraint conditions provided in the scheme directly link experimental phenomenon metrics to the model parameters, thereby providing deeper insight into the model parameters for model users. Numerical experiments showed that the proposed constrained parameter estimation algorithm significantly improved estimation efficiency. Meanwhile, the proposed cobalt–hydrogen electrochemical competition model allowed for accurate simulation of the impact of hydrogen ions on the cobalt removal rate, as well as simulation of the trend of hydrogen ion concentration, which would be helpful for the actual cobalt removal process in zinc hydrometallurgy.
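The abstract does not reproduce the kinetic model, so as a stand-in the sketch below fits a first-order removal rate k in C(t) = C0·exp(-k·t) by least squares, with the search constrained to bounds derived from an observed half-life window. Tying an experimental metric (the half-life) directly to a parameter bound is the flavor of the constrained scheme; all numbers here are synthetic.

```python
import math

def fit_rate(ts, cs, c0, t_half_min, t_half_max, n_grid=2000):
    # Constrained least squares for first-order decay: the observed
    # half-life window [t_half_min, t_half_max] bounds the rate via
    # k = ln 2 / t_half, replacing unknown a-priori parameter ranges.
    k_lo = math.log(2.0) / t_half_max
    k_hi = math.log(2.0) / t_half_min
    best_k, best_sse = k_lo, float("inf")
    for i in range(n_grid + 1):
        k = k_lo + (k_hi - k_lo) * i / n_grid
        sse = sum((c - c0 * math.exp(-k * t)) ** 2
                  for t, c in zip(ts, cs))
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

# Synthetic noiseless concentration decay with true k = 0.3
ts = [0.5 * i for i in range(20)]
cs = [10.0 * math.exp(-0.3 * t) for t in ts]
k_hat = fit_rate(ts, cs, 10.0, t_half_min=1.5, t_half_max=3.0)
```

Because the constraint already confines k to a narrow physically meaningful interval, even a plain grid search converges quickly, which mirrors the efficiency gain the paper reports for its constrained algorithm.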


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Moumita Chatterjee ◽  
Sugata Sen Roy

Abstract In this article, we model alternately occurring recurrent events and study the effects of covariates on each of the survival times. This is done through accelerated failure time models, where we use lagged event times to capture the dependence over both the cycles and the two events. However, since the errors of the two regression models are likely to be correlated, we assume a bivariate error distribution. Since most event time distributions do not readily extend to bivariate forms, we take recourse to copula functions to build up the bivariate distributions from the marginals. The model parameters are then estimated using the maximum likelihood method and the properties of the estimators are studied. Data on respiratory disease are used to illustrate the technique. A simulation study is also conducted to check for consistency.
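A minimal sketch of the copula construction step (not the paper's estimation procedure): a Gaussian copula couples two exponential marginals, standing in for the bivariate error distribution of the two accelerated failure time models. The correlation value and marginal rates below are arbitrary choices for illustration.

```python
import math
import random

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def sample_pair(rho, rate1, rate2, rng):
    # Gaussian copula: draw correlated standard normals, map them
    # through the normal CDF to dependent uniforms, then invert
    # each marginal CDF (here exponential) separately.
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    u1, u2 = normal_cdf(z1), normal_cdf(z2)
    t1 = -math.log(max(1.0 - u1, 1e-15)) / rate1   # Exp(rate1) marginal
    t2 = -math.log(max(1.0 - u2, 1e-15)) / rate2   # Exp(rate2) marginal
    return t1, t2

rng = random.Random(42)
pairs = [sample_pair(0.6, 1.0, 2.0, rng) for _ in range(20000)]
```

The appeal of the copula route, as the abstract notes, is exactly this separation: each marginal keeps its own familiar event-time form while the copula alone carries the dependence between the two events.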


2017 ◽  
Vol 65 (4) ◽  
pp. 479-488 ◽  
Author(s):  
A. Boboń ◽  
A. Nocoń ◽  
S. Paszek ◽  
P. Pruski

Abstract The paper presents a method for determining the electromagnetic parameters of different synchronous generator models based on dynamic waveforms measured at power rejection. Such a test can be performed safely under normal operating conditions of a generator working in a power plant. A generator model was investigated that is expressed by the reactances and time constants of the steady, transient, and subtransient states in the d and q axes, as well as circuit models (types (3,3) and (2,2)) expressed by the resistances and inductances of the stator, excitation, and equivalent rotor damping circuit windings. All these models approximately take into account the influence of magnetic core saturation. The least squares method was used for parameter estimation. The objective function, defined as the mean square error between the measured waveforms and the waveforms calculated from the mathematical models, was minimized. A method of determining the initial values of those state variables which also depend on the searched parameters is presented. To minimize the objective function, a gradient optimization algorithm finding local minima from a selected starting point was used. To get closer to the global minimum, the calculations were repeated many times, taking into account the inequality constraints on the searched parameters. The paper presents the parameter estimation results and a comparison of the waveforms measured and calculated from the final parameters for 200 MW and 50 MW turbogenerators.
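A heavily reduced analogue of that estimation loop: the "waveform" is a single decaying exponential with one unknown time constant, the objective is the mean square error against the measured samples, and a projected gradient descent (numerical gradient plus backtracking line search) is restarted from several points inside the inequality bounds, keeping the best local minimum. All signals are synthetic; the real generator models involve many coupled reactances and time constants.

```python
import math

T_TRUE = 2.0
ts = [0.1 * i for i in range(1, 60)]
measured = [math.exp(-t / T_TRUE) for t in ts]   # synthetic "waveform"

def mse(T):
    # Mean square error between measured and model waveforms.
    return sum((y - math.exp(-t / T)) ** 2
               for t, y in zip(ts, measured)) / len(ts)

def descend(T0, lo=0.1, hi=10.0, iters=200):
    # Projected gradient descent with a central-difference gradient
    # and backtracking line search; [lo, hi] are the inequality
    # constraints on the searched parameter.
    T = T0
    for _ in range(iters):
        h = 1e-6
        g = (mse(T + h) - mse(T - h)) / (2.0 * h)
        step = 1.0
        while step > 1e-12:
            T_new = min(max(T - step * g, lo), hi)
            if mse(T_new) < mse(T):
                break
            step /= 2.0
        else:
            break            # no descent step found: converged
        T = T_new
    return T

# Multi-start: repeat from several starting points, keep the best.
best_T = min((descend(T0) for T0 in (0.3, 1.0, 4.0)), key=mse)
```

The multi-start wrapper is what lets a purely local gradient method approach the global minimum, mirroring the repeated constrained runs described in the abstract.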

