Properties of the maximum-likelihood estimator for independent random variables

2009 ◽  
Vol 388 (17) ◽  
pp. 3399-3412 ◽  
Author(s):  
Yoshihiko Hasegawa ◽  
Masanori Arita
2014 ◽  
Vol 24 (2) ◽  
pp. 283-291 ◽  
Author(s):  
Milan Jovanovic ◽  
Vesna Rajic

In this paper, we estimate the probability P{X < Y} when X and Y are two independent random variables from the gamma and exponential distributions, respectively. We obtain the maximum likelihood estimator and its asymptotic distribution, and we carry out a simulation study.
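For X ~ Gamma(k, θ) and an independent Y ~ Exp(λ), the stress–strength probability has the closed form P{X < Y} = E[e^(−λX)] = (1 + λθ)^(−k), the gamma moment generating function evaluated at −λ. A minimal sketch of the plug-in maximum likelihood estimate (the parameter values and the use of scipy are illustrative assumptions, not taken from the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative truth: X ~ Gamma(shape=k, scale=theta), Y ~ Exp(rate=lam)
k, theta, lam = 2.0, 1.5, 0.8

x = rng.gamma(k, theta, size=5000)
y = rng.exponential(1.0 / lam, size=5000)

# Closed form: P(X < Y) = E[exp(-lam * X)] = (1 + lam*theta)^(-k)
p_true = (1.0 + lam * theta) ** (-k)

# Plug-in MLE: fit each marginal by maximum likelihood, then reuse the formula
k_hat, _, theta_hat = stats.gamma.fit(x, floc=0)   # shape, loc (fixed), scale
lam_hat = 1.0 / np.mean(y)                         # MLE of the exponential rate
p_mle = (1.0 + lam_hat * theta_hat) ** (-k_hat)

# Crude empirical check from paired samples
p_emp = np.mean(x < y)

print(p_true, p_mle, p_emp)
```

With 5000 observations per sample, both the plug-in MLE and the empirical frequency land close to the closed-form value.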


1995 ◽  
Vol 11 (3) ◽  
pp. 437-483 ◽  
Author(s):  
Lung-Fei Lee

In this article, we investigate a bias in an asymptotic expansion of the simulated maximum likelihood estimator introduced by Lerman and Manski (pp. 305–319 in C. Manski and D. McFadden (eds.), Structural Analysis of Discrete Data with Econometric Applications, Cambridge: MIT Press, 1981) for the estimation of discrete choice models. This bias arises from the nonlinearity of the derivatives of the log likelihood function and the statistically independent simulation errors of the choice probabilities across observations. It can be the dominating bias in an asymptotic expansion of the simulated maximum likelihood estimator when the number of simulated random variables per observation does not increase at least as fast as the sample size. The properly normalized simulated maximum likelihood estimator even has an asymptotic bias in its limiting distribution if the number of simulated random variables increases only as fast as the square root of the sample size. A bias adjustment is introduced that can reduce the bias. Monte Carlo experiments demonstrate the usefulness of the bias-adjustment procedure.
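The mechanism described above — a Jensen-type bias from taking the log of a noisy probability simulator — can be reproduced numerically. In this sketch (the choice probability, the clipping guard, and the replication counts are illustrative assumptions), the bias of log p̂ for a frequency simulator based on R draws behaves like −(1 − p)/(2pR), i.e. it is O(1/R):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.3          # an illustrative true choice probability
n_rep = 200_000  # Monte Carlo replications used to measure the bias

# Frequency simulator of p from R draws; by Jensen's inequality
# E[log p_hat] < log p, and a second-order expansion gives
# bias ~ -Var(p_hat) / (2 p^2) = -(1 - p) / (2 p R).
biases = {}
for R in (10, 50, 250):
    p_hat = rng.binomial(R, p, size=n_rep) / R
    p_hat = np.clip(p_hat, 1.0 / (2 * R), None)   # guard against log(0)
    biases[R] = np.mean(np.log(p_hat)) - np.log(p)
    print(R, biases[R], -(1 - p) / (2 * p * R))
```

The measured bias is negative, shrinks as R grows, and for moderate R matches the −(1 − p)/(2pR) approximation closely — the reason the number of simulation draws must grow with the sample size for the bias not to dominate.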


Author(s):  
Hazim Mansour Gorgees ◽  
Bushra Abdualrasool Ali ◽  
Raghad Ibrahim Kathum

In this paper, the maximum likelihood estimator and the Bayes estimator of the reliability function for the negative exponential distribution are derived, and a Monte Carlo simulation technique is then employed to compare the performance of these estimators. The integral mean square error (IMSE) is used as the criterion for this comparison. The simulation results show that the Bayes estimator performs better than the maximum likelihood estimator across the sample sizes considered.
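A minimal Monte Carlo sketch of such a comparison (the conjugate gamma prior, the time grid, and the IMSE discretization are illustrative assumptions, not the paper's setup): for exponential lifetimes with rate λ the reliability is R(t) = exp(−λt); the MLE plugs in λ̂ = 1/x̄, while under a Gamma(a, b) prior and squared error loss the Bayes estimator is the posterior mean E[e^(−λt) | data] = (B/(B + t))^A, with A = a + n and B = b + Σxᵢ:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative setup: lifetimes T_i ~ Exp(rate=lam_true), reliability
# R(t) = exp(-lam_true * t); Gamma(a, b) prior on the rate is conjugate,
# so the posterior is Gamma(a + n, b + sum(t_i)).
lam_true, a, b = 1.0, 2.0, 2.0
t_grid = np.linspace(0.1, 3.0, 30)
r_true = np.exp(-lam_true * t_grid)

def imse_pair(n, n_sim=2000):
    """Integrated MSE of the MLE and the Bayes estimator over t_grid."""
    se_mle = np.zeros_like(t_grid)
    se_bayes = np.zeros_like(t_grid)
    for _ in range(n_sim):
        x = rng.exponential(1.0 / lam_true, size=n)
        r_mle = np.exp(-t_grid / x.mean())           # plug-in MLE of R(t)
        A, B = a + n, b + x.sum()
        r_bayes = (B / (B + t_grid)) ** A            # posterior mean of exp(-lam t)
        se_mle += (r_mle - r_true) ** 2
        se_bayes += (r_bayes - r_true) ** 2
    dt = t_grid[1] - t_grid[0]
    return se_mle.sum() * dt / n_sim, se_bayes.sum() * dt / n_sim

imse_mle, imse_bayes = imse_pair(n=10)
print(imse_mle, imse_bayes)
```

With a small sample and a prior centered on the true rate, the Bayes estimator's shrinkage yields a smaller IMSE than the plug-in MLE, in line with the comparison reported above.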


2021 ◽  
Author(s):  
Jakob Raymaekers ◽  
Peter J. Rousseeuw

Many real data sets contain numerical features (variables) whose distribution is far from normal (Gaussian). Instead, their distribution is often skewed. In order to handle such data it is customary to preprocess the variables to make them more normal. The Box–Cox and Yeo–Johnson transformations are well-known tools for this. However, the standard maximum likelihood estimator of their transformation parameter is highly sensitive to outliers, and will often try to move outliers inward at the expense of the normality of the central part of the data. We propose a modification of these transformations as well as an estimator of the transformation parameter that is robust to outliers, so the transformed data can be approximately normal in the center and a few outliers may deviate from it. It compares favorably to existing techniques in an extensive simulation study and on real data.
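The outlier sensitivity of the classical ML transformation parameter is easy to reproduce with scipy's standard (non-robust) Yeo–Johnson fit; the sample size and outlier placement here are illustrative assumptions, not the authors' setup:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Right-skewed clean data: the ML Yeo-Johnson lambda sits well below 1
clean = rng.lognormal(mean=0.0, sigma=0.6, size=1000)
_, lam_clean = stats.yeojohnson(clean)   # returns (transformed data, ML lambda)

# Same data plus a handful of far-out points: the ML criterion tries to pull
# them inward, dragging the estimated lambda down with them
contaminated = np.concatenate([clean, np.full(10, 200.0)])
_, lam_contam = stats.yeojohnson(contaminated)

print(lam_clean, lam_contam)
```

Ten outliers among a thousand observations are enough to shift the estimated parameter noticeably, distorting the transformation applied to the central bulk of the data — the failure mode the robust estimator proposed above is designed to avoid.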


2013 ◽  
Vol 55 (3) ◽  
pp. 643-652
Author(s):  
Gauss M. Cordeiro ◽  
Denise A. Botter ◽  
Alexsandro B. Cavalcanti ◽  
Lúcia P. Barroso

2020 ◽  
Vol 28 (3) ◽  
pp. 183-196
Author(s):  
Kouacou Tanoh ◽  
Modeste N’zi ◽  
Armel Fabrice Yodé

We are interested in bounds on the large deviations probability and in Berry–Esseen type inequalities for the maximum likelihood estimator and the Bayes estimator of a parameter appearing linearly in the drift of a nonhomogeneous stochastic differential equation driven by fractional Brownian motion.

