Classical and Bayesian Inference for a Progressive First-Failure Censored Left-Truncated Normal Distribution

Symmetry ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 490
Author(s):  
Yuxin Cai ◽  
Wenhao Gui

Point and interval estimation are considered for a progressive first-failure censored left-truncated normal distribution in this paper. First, we derive the estimators of the parameters based on the maximum likelihood principle. Subsequently, we construct asymptotic confidence intervals based on these estimates and on the log-transformed estimates, using the asymptotic normality of maximum likelihood estimators. Bootstrap methods are also proposed for constructing confidence intervals. For Bayesian estimation, we implement the Lindley approximation to obtain the Bayes estimates under both symmetric and asymmetric loss functions. The importance sampling procedure is applied as well, and the highest posterior density (HPD) credible intervals are established within this procedure. The efficiencies of the classical and Bayesian inference methods are evaluated through extensive simulations. We conclude that the Bayes estimates given by the Lindley approximation under the Linex loss function are highly recommended, and that the HPD interval has the narrowest length among the proposed intervals. Finally, we analyze a real dataset on the tensile strength of 50 mm carbon fibers as an illustrative example.
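As a rough illustration of the first step, the sketch below fits a left-truncated normal by maximum likelihood and forms a Wald-type confidence interval from the optimizer's inverse Hessian. It assumes a complete (uncensored) sample rather than the progressive first-failure censored data treated in the paper, and the truncation point a and starting values are illustrative.

```python
# Minimal sketch: MLE and an asymptotic CI for a left-truncated normal,
# assuming a complete (uncensored) sample; the paper's progressive
# first-failure censoring would add censored-observation terms to the
# log-likelihood. Names (a, mu, sigma) are illustrative.
import numpy as np
from scipy import stats, optimize

a = 0.0                                   # known left-truncation point (assumption)
rng = np.random.default_rng(1)
x = stats.truncnorm.rvs((a - 2.5) / 1.0, np.inf, loc=2.5, scale=1.0,
                        size=100, random_state=rng)

def neg_loglik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)             # log-parameterization keeps sigma > 0
    # density of N(mu, sigma^2) left-truncated at a
    logpdf = stats.norm.logpdf(x, mu, sigma) - stats.norm.logsf(a, mu, sigma)
    return -np.sum(logpdf)

res = optimize.minimize(neg_loglik, x0=[np.mean(x), np.log(np.std(x))])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])

# Approximate observed-information inverse from the optimizer; its diagonal
# gives asymptotic variances for a Wald-type interval.
cov = res.hess_inv if isinstance(res.hess_inv, np.ndarray) else res.hess_inv.todense()
se_mu = np.sqrt(cov[0, 0])
ci_mu = (mu_hat - 1.96 * se_mu, mu_hat + 1.96 * se_mu)
print(mu_hat, sigma_hat, ci_mu)
```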

Mathematics ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 49
Author(s):  
Siqi Chen ◽  
Wenhao Gui

In practice, estimation of the unknown parameters of a truncated distribution from censored data has wide application. The truncated normal distribution is more suitable for fitting lifetime data than the normal distribution. This article makes statistical inferences about the parameters of the truncated normal distribution using adaptive progressive type II censored data. First, the estimates are obtained by the maximum likelihood method. The observed and expected Fisher information matrices are derived to establish the asymptotic confidence intervals. Second, Bayesian estimation under three loss functions is also studied. The point estimates are calculated by the Lindley approximation. The importance sampling technique is applied to obtain the Bayes estimates and to build the associated highest posterior density credible intervals. Bootstrap confidence intervals are constructed for comparison. Monte Carlo simulations and data analysis are employed to present the performance of the various methods. Finally, we obtain optimal censoring schemes under different criteria.
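The percentile bootstrap intervals used for comparison can be sketched generically. The snippet below assumes a complete sample and an arbitrary estimator, whereas the paper applies the idea to MLEs computed from adaptive progressive type II censored data; the data and estimator here are illustrative.

```python
# Minimal sketch of a percentile bootstrap confidence interval, assuming a
# complete sample and a generic estimator (illustrative, not the paper's setup).
import numpy as np

def percentile_bootstrap_ci(x, estimator, B=2000, alpha=0.05, seed=0):
    """Resample x with replacement B times and take empirical quantiles
    of the bootstrap estimates as the (1 - alpha) confidence interval."""
    rng = np.random.default_rng(seed)
    boot = np.array([estimator(rng.choice(x, size=len(x), replace=True))
                     for _ in range(B)])
    return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

# Usage: CI for the mean of some illustrative lifetimes.
x = np.random.default_rng(1).normal(2.5, 1.0, size=80)
print(percentile_bootstrap_ci(x, np.mean))
```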


2020 ◽  
Author(s):  
Justin R. Williams ◽  
Hyung-Woo Kim ◽  
Catherine M. Crespi

Background: When data are collected subject to a detection limit, observations below the detection limit may be considered censored. In addition, the domain of such observations may be restricted; for example, values may be required to be non-negative. Methods: We propose a method for estimating the population mean and variance from censored observations that accounts for a known domain restriction. The method finds maximum likelihood estimates assuming an underlying truncated normal distribution. Results: We show that our method, tcensReg, has lower bias, Type I error rates, and mean squared error than other methods commonly used for data with detection limits, such as Tobit regression and single imputation, under a range of simulation settings from mild to heavy censoring and truncation. We further demonstrate the consistency of the maximum likelihood estimators. We apply our method to analyze vision quality data collected from ophthalmology clinical trials comparing different types of intraocular lenses implanted during cataract surgery. All of the methods yield similar conclusions regarding non-inferiority, but estimates from the tcensReg method suggest that there may be greater mean differences and overall variability. Conclusions: In the presence of detection limits, our new method tcensReg provides a way to incorporate known domain restrictions on dependent variables that substantially improves inferences.
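A minimal sketch of the kind of likelihood such a method maximizes (not the tcensReg package itself) is given below. It assumes an intercept-only model, a latent outcome restricted to [0, inf), and a single known detection limit d; all names and values are illustrative.

```python
# Minimal sketch of a truncated-with-censoring likelihood (in the spirit of
# tcensReg, but not the package itself): the latent outcome is N(mu, sigma^2)
# restricted to [0, inf), and values below a detection limit d are only known
# to be censored. Intercept-only model; names are illustrative.
import numpy as np
from scipy import stats, optimize

d = 0.5                                          # detection limit (assumption)
rng = np.random.default_rng(2)
y_latent = stats.truncnorm.rvs((0 - 1.0) / 0.8, np.inf, loc=1.0, scale=0.8,
                               size=200, random_state=rng)
censored = y_latent < d
y_obs = y_latent[~censored]

def neg_loglik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    log_trunc = stats.norm.logsf(0.0, mu, sigma)              # log P(Y >= 0)
    # uncensored part: truncated-normal density
    ll_obs = stats.norm.logpdf(y_obs, mu, sigma) - log_trunc
    # censored part: P(0 <= Y < d) / P(Y >= 0)
    p_cens = stats.norm.cdf(d, mu, sigma) - stats.norm.cdf(0.0, mu, sigma)
    ll_cens = censored.sum() * (np.log(p_cens) - log_trunc)
    return -(ll_obs.sum() + ll_cens)

res = optimize.minimize(neg_loglik, x0=[y_obs.mean(), np.log(y_obs.std())])
print(res.x[0], np.exp(res.x[1]))                # estimated mu and sigma
```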


Author(s):  
Thanoon Y. Thanoon ◽  
Athar Talal Hamed ◽  
Robiah Adnan

The purpose of this paper is to develop a latent variable model with nonlinear terms in the covariates and latent variables. Mixed ordered categorical and dichotomous variables and covariates, with two different types of thresholds (equally and unequally spaced), are used in Bayesian multi-sample nonlinear latent variable models, and the Gibbs sampling method is applied for estimation and model comparison. A hidden continuous normal variable, modeled via a censored normal distribution or a truncated normal distribution with known parameters, is used to handle mixed ordered categorical and dichotomous data. A hidden continuous normal variable (a truncated normal distribution with known parameters) is likewise used to handle mixed ordered categorical and dichotomous data in the covariates. Statistical analysis, which involves the estimation of parameters, their standard deviations, and their highest posterior density intervals, is discussed. The proposed procedure is illustrated using psychological data, with the results obtained from the OpenBUGS program.
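One ingredient of such a Gibbs sampler can be sketched as follows: the hidden continuous value behind an ordered categorical observation is drawn from a normal distribution truncated to the interval defined by its category's thresholds. This is only an illustration with made-up thresholds, not the paper's full multi-sample nonlinear model or its OpenBUGS code.

```python
# Minimal sketch of the data-augmentation step for ordered categorical data:
# given thresholds and an observed category, the underlying "hidden" normal
# variable is drawn from the normal distribution truncated to the matching
# interval. Thresholds and parameter values are illustrative.
import numpy as np
from scipy import stats

thresholds = np.array([-np.inf, -1.0, 0.0, 1.0, np.inf])   # equally spaced example

def sample_latent(category, mean, sd, rng):
    """Draw the hidden continuous value for category k, i.e. from
    N(mean, sd^2) truncated to (thresholds[k], thresholds[k+1])."""
    lo, hi = thresholds[category], thresholds[category + 1]
    a, b = (lo - mean) / sd, (hi - mean) / sd                # standardized bounds
    return stats.truncnorm.rvs(a, b, loc=mean, scale=sd, random_state=rng)

rng = np.random.default_rng(0)
print(sample_latent(category=2, mean=0.3, sd=1.0, rng=rng))  # a value in (0, 1)
```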


Entropy ◽  
2021 ◽  
Vol 23 (2) ◽  
pp. 186
Author(s):  
Xinyi Zeng ◽  
Wenhao Gui

In this paper, the parameter estimation problem of a truncated normal distribution is discussed based on generalized progressive hybrid censored data. The maximum likelihood estimates of the unknown quantities are first derived through the Newton–Raphson algorithm and the expectation-maximization algorithm. Based on the asymptotic normality of the maximum likelihood estimators, we develop asymptotic confidence intervals. The percentile bootstrap method is also employed for the case of small sample sizes. Further, the Bayes estimates are evaluated under various loss functions, such as the squared error, general entropy, and Linex loss functions. The Tierney–Kadane approximation, as well as the importance sampling approach, is applied to obtain the Bayesian estimates under proper prior distributions. The associated Bayesian credible intervals are constructed as well. Extensive numerical simulations are implemented to compare the performance of the different estimation methods. Finally, a real example is analyzed to illustrate the inference approaches.
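Once posterior draws are available (for example from importance sampling), the Bayes estimates under the different loss functions and an HPD interval reduce to simple sample computations. The sketch below uses illustrative draws and the standard Linex and general entropy estimators, not the paper's actual posterior.

```python
# Minimal sketch: turning posterior draws (e.g., from importance sampling)
# into Bayes estimates under different loss functions and an HPD interval.
# The draws below are illustrative stand-ins, not the paper's posterior.
import numpy as np

rng = np.random.default_rng(3)
theta = rng.normal(2.0, 0.3, size=5000)        # stand-in posterior sample

# Squared error loss -> posterior mean
sel = theta.mean()
# Linex loss with shape c -> -(1/c) * log E[exp(-c * theta)]
c = 1.0
linex = -np.log(np.mean(np.exp(-c * theta))) / c
# General entropy loss with shape q -> (E[theta^(-q)])^(-1/q)
q = 0.5
gel = np.mean(theta ** (-q)) ** (-1.0 / q)

def hpd_interval(samples, cred=0.95):
    """Shortest interval containing a `cred` fraction of the sorted draws."""
    s = np.sort(samples)
    k = int(np.floor(cred * len(s)))
    widths = s[k:] - s[:len(s) - k]
    i = np.argmin(widths)
    return s[i], s[i + k]

print(sel, linex, gel, hpd_interval(theta))
```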


2017 ◽  
Vol 928 (10) ◽  
pp. 58-63 ◽  
Author(s):  
V.I. Salnikov

The initial subject of study is sums of measurement errors. It is assumed that the errors follow the normal law, but with a limitation on the value of the marginal error Δpred = 2m. It is known that to each number of terms ni there corresponds a confidence interval that covers the zero value of the sum. The paradox is that the probability of such an event is zero; therefore, it is impossible to determine the value of ni at which the sum becomes zero. The article proposes to consider instead the event that the sum of errors varies within the 2m limits with a confidence level of 0.954. Within the group, all the sums then have a limiting error. It is proposed to use these tolerances for discrepancies in geodesy instead of 2m·√(ni). It is suggested to introduce the concept of "the law of the truncated normal distribution with Δpred = 2m".
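A small simulation, shown below, illustrates the idea as stated: individual errors follow a normal law truncated at ±2m, and the empirical 0.954 tolerance for the sum of ni such errors can be compared with the classical 2m·√(ni). The values of m and ni are purely illustrative.

```python
# Illustrative simulation (not the article's derivation): errors follow a
# normal law truncated at +/- 2m; the empirical 0.954 tolerance for the sum
# of n such errors is compared with the classical 2m*sqrt(n).
import numpy as np
from scipy import stats

m, n, reps = 1.0, 10, 200_000                   # illustrative values
rng = np.random.default_rng(4)
errors = stats.truncnorm.rvs(-2.0, 2.0, loc=0.0, scale=m,
                             size=(reps, n), random_state=rng)
sums = errors.sum(axis=1)

tolerance_truncated = np.quantile(np.abs(sums), 0.954)   # empirical 0.954 limit
tolerance_classical = 2 * m * np.sqrt(n)
print(tolerance_truncated, tolerance_classical)
```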

