Statistical Aspects of Lifetime in the Presence of UV Radiation

2003 ◽  
Vol 125 (1) ◽  
pp. 1-3 ◽  
Author(s):  
Michael I. Zeifman ◽  
Dov Ingman

The reliability function of a component cannot be satisfactorily estimated from experiments, because: (i) accurate estimation of the lifetime distribution tails, which control the most important domain of high reliability, requires a very large sample and (ii) reliability tests under normal operational conditions are necessarily very lengthy. Hence the urgent need for a physical model of component lifetime statistics. The paper presents an application of a recently developed model for damage accumulation in polymeric materials to long-term constant-stress rupture experiments on Kevlar composite, which is widely used in fiber optics. The strong dependence of the experimentally observed distribution shape on the load applied to a component has previously been explained in the framework of two different damage mechanisms, kinetic crack growth and chemical deterioration; the resultant 3-parameter lifetime distribution was predicted to be essentially non-Weibull. The proposed model, based on a single micro-mechanical damage mechanism, leads to a 2-parameter Weibull lifetime distribution whose shape parameter depends on the applied load through a simple inverse power law. Both distribution models were fitted to experimental lifetime data at different stress levels and the corresponding goodness of fit was compared by the usual likelihood ratio test. The proposed model describes the experimental data better, especially in the most important domain of low stress and long (of the order of years) lifetime. The model is physically sound; it permits improved design of accelerated tests, more accurate interpretation of their results, and ultimately quantitative prediction of the reliability function of a loaded polymeric component.
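A minimal sketch of the lifetime model described above: a 2-parameter Weibull reliability function whose shape parameter follows an inverse power law in the load. The constants c and p (and the scale eta) are invented for illustration, not taken from the paper.

```python
import math

def weibull_reliability(t, eta, beta):
    """Two-parameter Weibull reliability function R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t / eta) ** beta))

def shape_from_load(load, c=2.0, p=0.5):
    """Hypothetical inverse power law for the shape parameter: beta = c * load**(-p)."""
    return c * load ** (-p)

# Reliability at t = 5 (arbitrary units) for two illustrative load levels
for load in (1.0, 4.0):
    beta = shape_from_load(load)
    r = weibull_reliability(5.0, eta=10.0, beta=beta)
    print(f"load={load}: beta={beta:.3f}, R(5)={r:.4f}")
```

Because the load enters only through the shape parameter, a family of stress levels yields Weibull fits with a common functional form, which is what makes the two models distinguishable by a likelihood ratio test.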

Mathematics ◽  
2020 ◽  
Vol 8 (8) ◽  
pp. 1366
Author(s):  
Da Hye Lee ◽  
In Hong Chang ◽  
Hoang Pham

Software reliability and quality are crucial in several fields. Related studies have focused on software reliability growth models (SRGMs). Herein, we propose a new SRGM that assumes interdependent software failures. We conduct experiments on real-world datasets to compare the goodness-of-fit of the proposed model with the results of previous nonhomogeneous Poisson process SRGMs using several evaluation criteria. In addition, we determine software reliability using Wald’s sequential probability ratio test (SPRT), which is more efficient than the classical hypothesis test (the latter requires substantially more data and time because the test is performed only after data collection is completed). The experimental results demonstrate the superiority of the proposed model and the effectiveness of the SPRT.
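The efficiency argument for Wald's SPRT can be made concrete with a minimal sketch on Bernoulli pass/fail data (the hypothesized rates p0, p1 and error levels below are illustrative, not the paper's):

```python
import math

def sprt_bernoulli(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 on a stream of 0/1 failures.
    Stops as soon as the cumulative log-likelihood ratio crosses a boundary."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(observations)

print(sprt_bernoulli([0] * 10, 0.5, 0.9))  # clean run: H0 accepted early
```

In contrast to a classical fixed-sample test, the decision here can come after only a handful of observations, which is the efficiency the abstract refers to.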


2021 ◽  
Vol 17 (1) ◽  
pp. 5-30
Author(s):  
S. A. Wani ◽  
S. Shafi

Abstract We obtain a new generalization of the Lindley-Quasi Xgamma distribution by adding a weight parameter through the weighting technique, and we demonstrate the flexibility of the proposed model. Expressions for reliability measures, order statistics, Bonferroni curves and indices, and Rényi entropy, along with some other important properties, are derived. The maximum likelihood method is used to estimate the unknown parameters of the proposed model. A simulation study is carried out to check the performance of the maximum likelihood estimates and for model comparison. The proposed model and its related models are fitted to real-life data sets, and goodness-of-fit measures (Kolmogorov statistic and p-value) and information criteria (AIC, BIC, AICC and HQIC) are computed in R to check the applicability of the proposed model in practice. The significance of the weight parameter is also tested by the likelihood ratio test, for both randomly generated and real-life data.
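The weighted family itself is specific to the paper, but the likelihood ratio test for the significance of an extra parameter can be sketched generically. Here a Weibull-vs-exponential nesting stands in for the weighted-vs-unweighted pair (the shape k = 1 plays the role of the null without the extra parameter), with a grid search standing in for a proper optimizer:

```python
import math
import random

def loglik_weibull(data, k, lam):
    """Weibull(k, lam) log-likelihood; k = 1 reduces to the exponential."""
    n = len(data)
    return (n * math.log(k) - n * k * math.log(lam)
            + (k - 1) * sum(math.log(x) for x in data)
            - sum((x / lam) ** k for x in data))

def profile_scale(data, k):
    """Closed-form MLE of the scale parameter for a fixed shape k."""
    return (sum(x ** k for x in data) / len(data)) ** (1.0 / k)

def lrt_extra_parameter(data, k_grid):
    """Likelihood ratio statistic for H0: k = 1 against k free on a grid."""
    l0 = loglik_weibull(data, 1.0, profile_scale(data, 1.0))
    l1 = max(loglik_weibull(data, k, profile_scale(data, k)) for k in k_grid)
    return 2.0 * (l1 - l0)

random.seed(1)
data = [random.expovariate(1.0) for _ in range(200)]
grid = [i / 20.0 for i in range(10, 41)]  # 0.50, 0.55, ..., 2.00 (includes 1.0)
stat = lrt_extra_parameter(data, grid)
print(f"LRT statistic = {stat:.3f} (chi-square(1) 5% cutoff = 3.841)")
```

Since the null value k = 1 lies in the grid, the statistic is non-negative by construction; it is referred to a chi-square(1) distribution in the usual way.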


1969 ◽  
Vol 62 (4_Suppla) ◽  
pp. S23-S35
Author(s):  
B.-A. Lamberg ◽  
O. P. Heinonen ◽  
K. Liewendahl ◽  
G. Kvist ◽  
M. Viherkoski ◽  
...  

ABSTRACT The distributions of 13 variables based on 10 laboratory tests measuring thyroid function were studied in euthyroid controls and in patients with toxic diffuse or toxic multinodular goitre. Density functions were fitted to the empirical data and the goodness of fit was evaluated with the χ2-test. In a few instances there was a significant difference, but the material available was in some respects too small to allow very accurate estimation. The normal limits for each variable were defined by the 2.5th and 97.5th percentiles. It appears that in some instances these limits are too rigorous from the practical point of view. It is emphasized that the crossing point of the functions for euthyroid controls and hyperthyroid patients may be a better limit to use. In a preliminary analysis of the diagnostic efficiency, the variables measuring total or free hormone concentration in the blood proved clearly superior to all other variables.
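A minimal sketch of the two cut-off notions discussed above, percentile-based normal limits and the crossing point of two fitted densities; Gaussian densities and synthetic data stand in for the laboratory variables, and all numbers are illustrative:

```python
import math
import random
import statistics

def normal_limits(values):
    """Empirical 2.5th and 97.5th percentiles, the 'normal limits' of a variable."""
    q = statistics.quantiles(values, n=40, method="inclusive")
    return q[0], q[-1]

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def crossing_point(mu1, s1, mu2, s2, lo, hi, tol=1e-9):
    """Bisection for the x where two fitted densities intersect, a candidate
    diagnostic cut-off between the control and patient groups."""
    f = lambda x: gauss_pdf(x, mu1, s1) - gauss_pdf(x, mu2, s2)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

random.seed(0)
controls = [random.gauss(100.0, 10.0) for _ in range(2000)]
lo_lim, hi_lim = normal_limits(controls)
cut = crossing_point(100.0, 10.0, 130.0, 10.0, 100.0, 130.0)
print(f"normal limits [{lo_lim:.1f}, {hi_lim:.1f}], crossing point {cut:.2f}")
```

For equal spreads the crossing point is simply the midpoint of the two means; with unequal spreads or priors it shifts, which is why it can be a less rigid limit than fixed percentiles.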


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Kai Xu ◽  
Yiwen Wang ◽  
Fang Wang ◽  
Yuxi Liao ◽  
Qiaosheng Zhang ◽  
...  

Sequential Monte Carlo estimation on point processes has been successfully applied to predict movement from neural activity. However, the method has some issues, such as a simplified tuning model and high computational complexity, which may degrade the decoding performance of motor brain-machine interfaces. In this paper, we adopt a general tuning model that takes recent ensemble activity into account. Goodness-of-fit analysis demonstrates that the proposed model predicts the neuronal response more accurately than one depending on kinematics alone. A new sequential Monte Carlo algorithm based on the proposed model is constructed. The algorithm significantly reduces the root mean square error of the decoding results, with a 23.6% decrease in position estimation. In addition, we accelerate decoding by implementing the proposed algorithm in a massively parallel manner on a GPU. The results demonstrate that spike trains can be decoded as a point process in real time even with 8000 particles or 300 neurons, over 10 times faster than the serial implementation. The main contribution of our work is to enable the sequential Monte Carlo algorithm with point-process observations to output the movement estimate much faster and more accurately.
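A toy one-dimensional sketch of sequential Monte Carlo decoding with a point-process observation model. The log-linear tuning curve, its constants, and the random-walk dynamics are illustrative assumptions, not the paper's model (which additionally conditions on recent ensemble activity):

```python
import math
import random

def decode(spike_counts, n_particles=500, dt=0.05, q=0.2, base=2.0, gain=1.5):
    """Minimal bootstrap particle filter with a point-process observation:
    count y ~ Poisson(lambda(x) * dt) with lambda(x) = exp(base + gain * x).
    The scalar state x stands in for the decoded movement variable."""
    random.seed(7)
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in spike_counts:
        # propagate with random-walk dynamics
        particles = [p + random.gauss(0.0, q) for p in particles]
        # Poisson point-process log-likelihood (y! and dt**y constants dropped)
        weights = [math.exp(y * (base + gain * p) - math.exp(base + gain * p) * dt)
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # multinomial resampling to avoid weight degeneracy
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

est = decode([0, 0, 0, 4, 4, 4])
print([round(e, 2) for e in est])
```

Each time bin does the same three steps (propagate, reweight by the point-process likelihood, resample), and each step is embarrassingly parallel across particles, which is what the GPU implementation exploits.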


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Dinesh Verma ◽  
Shishir Kumar

Nowadays, software developers face challenges in minimizing the number of defects introduced during software development. Using the defect density parameter, developers can identify possibilities for improvement in the product. Since the total number of defects depends on module size, the optimal module size that minimizes defect density needs to be calculated. In this paper, an improved model is formulated that captures the relationship between defect density and variable module size. This relationship can be used to optimize overall defect density through an effective distribution of module sizes. Three available data sets relevant to this aspect have been examined with the proposed model, taking distinct values of the variables and imposing constraints on the parameters. Curve fitting has been used to obtain the module size with minimum defect density, and goodness-of-fit measures have been computed to validate the proposed model on the data sets. The defect density can thus be optimized by an effective distribution of module sizes: larger modules can be broken into smaller ones, and smaller modules can be merged, to minimize the overall defect density.
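As a sketch of the curve-fitting step, assume a hypothetical two-term model D(s) = a/s + b*s (fixed per-module overhead plus size-driven complexity); this is not the paper's exact model, but it is linear in a and b, so the 2x2 normal equations solve the least-squares fit exactly, and the minimum-density size follows in closed form:

```python
import math

def fit_defect_density(sizes, densities):
    """Least-squares fit of the assumed model D(s) = a/s + b*s."""
    s11 = sum((1.0 / s) ** 2 for s in sizes)
    s12 = float(len(sizes))               # sum over modules of (1/s) * s
    s22 = sum(s * s for s in sizes)
    t1 = sum(d / s for s, d in zip(sizes, densities))
    t2 = sum(d * s for s, d in zip(sizes, densities))
    det = s11 * s22 - s12 * s12
    a = (t1 * s22 - t2 * s12) / det
    b = (s11 * t2 - s12 * t1) / det
    return a, b

def optimal_module_size(a, b):
    """dD/ds = -a/s**2 + b = 0  ->  s* = sqrt(a/b), the minimum-density size."""
    return math.sqrt(a / b)

# exact synthetic data generated from a = 8, b = 0.5, so the optimum is 4
sizes = [1.0, 2.0, 4.0, 8.0, 16.0]
densities = [8.0 / s + 0.5 * s for s in sizes]
a, b = fit_defect_density(sizes, densities)
print(f"a = {a:.3f}, b = {b:.3f}, optimal size = {optimal_module_size(a, b):.3f}")
```

The closed-form optimum makes the splitting/merging recommendation concrete: modules well above s* are candidates for splitting, modules well below it for merging.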


2013 ◽  
Vol 333-335 ◽  
pp. 787-790
Author(s):  
Shu Qian He ◽  
Zheng Jie Deng ◽  
Chun Shi

Rate estimation is useful for many H.264/AVC applications, including rate-distortion optimization (RDO) for fast mode decision and precise rate control. In this paper, we propose a new header rate prediction model and an adaptive algorithm that provide a more accurate estimate of the total number of coding bits for rate control than previously proposed methods. The header bit rate is modeled by a linear combination of the number of mode blocks and the sum of the absolute values of all motion vectors for each block. Based on the proposed model, a header rate estimation function is also proposed to give more accurate rate-distortion rate control. The proposed schemes achieve better rate-distortion and rate-control results than previously proposed approaches.


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Huibing Hao ◽  
Chun Su

A novel reliability assessment method for a degrading product with two dependent performance characteristics (PCs) is proposed, in contrast with existing work that utilizes only one-dimensional degradation data. In this model, the dependence between the two PCs is described by the Frank copula function, and each PC is governed by a random-effects nonlinear diffusion process in which the random effects capture unit-to-unit differences. Since the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example of an LED lamp is given to demonstrate the usefulness and validity of the proposed model and method. Numerical results show that the random-effects nonlinear diffusion model is very useful, as confirmed by checking the goodness of fit on the real data, and that ignoring the dependence between PCs may lead to different reliability conclusions.
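The Frank copula has a closed form, so the dependence structure described above can be sketched directly; the marginal failure probabilities and the dependence parameter theta below are illustrative, not fitted values from the paper:

```python
import math

def frank_copula(u, v, theta):
    """Frank copula C(u, v; theta); theta = 0 is the independence limit."""
    if theta == 0:
        return u * v
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    den = math.exp(-theta) - 1.0
    return -math.log(1.0 + num / den) / theta

def joint_reliability(F1, F2, theta):
    """P(T1 > t, T2 > t) = 1 - F1 - F2 + C(F1, F2) for the marginal failure
    probabilities F1, F2 of the two performance characteristics at time t."""
    return 1.0 - F1 - F2 + frank_copula(F1, F2, theta)

# with positive dependence (theta > 0) the independence assumption
# understates the joint reliability
print(joint_reliability(0.1, 0.2, 0.0), joint_reliability(0.1, 0.2, 5.0))
```

This is exactly the sense in which ignoring the PC dependence changes the reliability conclusion: the copula term C(F1, F2) replaces the product F1*F2 of the independent case.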


2021 ◽  
Vol 16 (4) ◽  
pp. 846-858
Author(s):  
Matthias Klumpp ◽  
Dominic Loske

Order picking is a crucial but labor- and cost-intensive activity in the retail logistics and e-commerce domain. Comprehensive changes are being implemented in this field due to new technologies like AI and automation. Nevertheless, human workers’ activities will be required for quite some time to come, which makes the evaluation of manual picker-to-part operations necessary. We apply non-parametric Data Envelopment Analysis (DEA) to evaluate the efficiency of n = 23 order pickers processing 6109 batches with 865,410 stock keeping units (SKUs). We use distance per location, picks per location, and volume per SKU as inputs and picks per hour as output. As the convexity axiom of standard DEA models cannot be fully satisfied when using ratio measures with different denominators, we apply the Free Disposal Hull (FDH) approach, which does not assume convexity. Validating the efficiency scores against the company’s own efficiency assessment, operationalized by premium payments, shows a 93% goodness-of-fit for the proposed model. The formulated non-parametric approach and its empirical application are promising ways forward for implementing empirical efficiency measurement of order picking within e-commerce operations.
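A minimal sketch of an input-oriented FDH score: unlike convex DEA, FDH benchmarks each unit only against actually observed units, with no convex combinations. The picker data below are invented, and the exact orientation and ratio handling of the study are not reproduced:

```python
def fdh_input_efficiency(inputs, outputs, k):
    """Input-oriented Free Disposal Hull score for unit k: the smallest uniform
    input contraction theta such that some *observed* unit produces at least
    k's outputs using at most theta times k's inputs."""
    best = 1.0  # unit k itself always qualifies with theta = 1
    for j in range(len(inputs)):
        if all(yj >= yk for yj, yk in zip(outputs[j], outputs[k])):
            theta = max(xj / xk for xj, xk in zip(inputs[j], inputs[k]))
            best = min(best, theta)
    return best

# toy picker data: inputs (distance per location, picks per location),
# output (picks per hour,)
inputs = [(2.0, 2.0), (4.0, 4.0), (3.0, 6.0)]
outputs = [(10.0,), (10.0,), (8.0,)]
scores = [fdh_input_efficiency(inputs, outputs, k) for k in range(len(inputs))]
print(scores)
```

Because only observed units serve as benchmarks, the FDH frontier is a step function over the data, which is what makes it admissible when the convexity axiom fails for ratio measures.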


2021 ◽  
Author(s):  
Simon Schüppler ◽  
Roman Zorn ◽  
Hagen Steger ◽  
Philipp Blum

<p>The measurement of the undisturbed ground temperature (UGT) serves to design low-temperature geothermal systems, in particular borehole heat exchangers (BHEs), and to monitor shallow aquifers. Wireless, miniaturized probes such as the Geosniff (GS) measurement sphere, characterized by an autarkic energy supply and equipped with pressure and temperature sensors, are increasingly being used to measure highly resolved vertical temperature profiles. The probe sinks along the course of the BHE to its bottom, sampling at a selectable measurement frequency, and is usable for initial measurements as well as long-term groundwater monitoring. To ensure quality assurance and further improvement of this emerging technology, an analysis of the measurement errors and uncertainties of wireless temperature measurements (WTMs) is indispensable. We therefore provide an empirical laboratory analysis of the random, systematic, and dynamic measurement errors that make up the measurement uncertainty of WTMs, using the GS as a representative device. We subsequently transfer the analysed uncertainty to measured vertical temperature profiles of the undisturbed ground at a BHE site in Karlsruhe, Germany. The precision and accuracy of 0.011 K and -0.11 K, respectively, ensure high reliability of the GS measurements. The largest measurement uncertainty occurs within the first five meters of descent, resulting from the thermal time constant τ of 4 s. The measured temperature profiles are qualitatively compared with common distributed temperature sensing (DTS) using fiber-optic cables and with point-wise Pt-100 sensors. Wireless probes are also suitable for correcting temperature profiles recorded with fiber optics, which showed systematic errors of up to -0.93 K. Various boundary conditions, such as the inclination of the BHE pipes or changes in the viscosity and density of the BHE fluid, affect the descent rate of the GS by up to 40%.
We additionally provide recommendations for the technical implementation of future measurement probes and contribute to an improved understanding and further development of WTMs.</p>


2021 ◽  
Vol 21 (1) ◽  
pp. 26-35
Author(s):  
Gopal Kumar ◽  
Anshuman Shukla ◽  
Amit Chhoker ◽  
Rohit Kumar Thapa

The purpose of this study was to identify the factors responsible for winning in men’s and women’s beach volleyball championships. Materials and methods. The study sample consisted of 212 men’s and 214 women’s matches from the 2017 and 2019 FIVB Beach Volleyball World Championships, held in Vienna and Hamburg from 28 July to 6 August 2017 and from 28 June to 7 July 2019. The matches were played by 192 teams (men and women combined), comprising 384 players from different nations. The data were analyzed using binary logistic regression (forward LR method) with the result of the game as the dependent variable and the predictor variables as covariates. β, the standard error of β, Wald’s χ2, and odds ratios with 95% confidence intervals were calculated. Model evaluation was conducted using the likelihood ratio test and the Cox & Snell (R2) and Nagelkerke (R2) statistics. Goodness of fit of the models was assessed using the Hosmer & Lemeshow test. Results. The analysis revealed seven factors related to winning in both men’s and women’s competition. In league rounds, six factors in men’s and seven factors in women’s competition were related to winning; in knockout rounds, four factors in men’s and six factors in women’s competition were related to winning. Conclusion. The study shows a significant association between important factors and winning a match in an elite beach volleyball championship. Coaches and players can take note of these factors, bearing in mind that different factors play an important role in men’s and women’s competition during league and knockout rounds.
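A minimal sketch of the binary logistic regression step, fit by plain gradient ascent on an invented single-indicator data set (win = 1, loss = 0); the forward selection over many match indicators used in the study is omitted:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Gradient-ascent fit of binary logistic regression; w[0] is the intercept."""
    w = [0.0] * (len(X[0]) + 1)
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = yi - sigmoid(z)       # gradient of the log-likelihood
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj + lr * g / n for wj, g in zip(w, grad)]
    return w

# toy data: one performance indicator, higher values associated with winning
X = [[0.0], [1.0], [1.0], [2.0], [2.0], [3.0]]
y = [0, 0, 1, 0, 1, 1]
w = fit_logistic(X, y)
odds_ratio = math.exp(w[1])  # multiplicative change in win odds per unit
print(f"intercept {w[0]:.2f}, coefficient {w[1]:.2f}, odds ratio {odds_ratio:.2f}")
```

The reported odds ratio with its confidence interval is exp(β), i.e. the multiplicative change in the odds of winning per unit increase in the indicator, which is how the β coefficients in the study are interpreted.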

