A Characterization of the Compound Multiparameter Hermite Gamma Distribution via Gauss’s Principle

2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Werner Hürlimann

We consider the class of those distributions that satisfy Gauss's principle (the maximum likelihood estimator of the mean is the sample mean) and have a parameter orthogonal to the mean. It is shown that this so-called “mean orthogonal class” is closed under convolution. A previous characterization of compound gamma random sums is revisited and clarified. A new characterization of the compound distribution with a multiparameter Hermite count distribution and a gamma severity distribution is obtained.
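Gauss's principle can be checked numerically: for a normal model with known variance, maximizing the likelihood over the mean recovers the sample mean exactly. A minimal sketch (the distribution, seed, and sample size are illustrative choices, not the paper's):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Gauss's principle for N(theta, 1): the MLE of the mean is the sample mean.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.5, scale=1.0, size=500)

def nll(theta):
    # Negative log-likelihood in theta, up to an additive constant
    return 0.5 * np.sum((x - theta) ** 2)

res = minimize_scalar(nll)
print(res.x, x.mean())  # the two agree to numerical precision
```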

2003 ◽  
Vol 2003 (34) ◽  
pp. 2147-2156 ◽  
Author(s):  
Rasul A. Khan

Let X1, X2, …, Xn be a random sample from a normal N(θ, σ²) distribution with an unknown mean θ = 0, ±1, ±2, …. Hammersley (1950) proposed the maximum likelihood estimator (MLE) d = [X̄n], the nearest integer to the sample mean, as an unbiased estimator of θ and extended the Cramér–Rao inequality. The Hammersley lower bound for the variance of any unbiased estimator of θ is significantly improved, and the asymptotic (as n → ∞) limit of the Fraser–Guttman–Bhattacharyya bounds is also determined. A limiting property of a suitable distance is used to give some plausible explanations of why such bounds cannot be attained. An almost uniformly minimum variance unbiased (UMVU)-like property of d is exhibited.
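Hammersley's estimator d = [X̄n] is straightforward to sketch; a hedged example (θ, σ, and the sample size are our own illustrative choices):

```python
import numpy as np

# Hammersley's (1950) estimator for an integer-valued mean theta of
# N(theta, sigma^2): d = [X-bar_n], the nearest integer to the sample mean.
rng = np.random.default_rng(1)
theta, sigma, n = 3, 2.0, 400
x = rng.normal(theta, sigma, size=n)

d = int(np.rint(x.mean()))
print(d)  # recovers theta with high probability at this n and sigma
```

Because θ is restricted to the integers, rounding the sample mean discards estimation error smaller than 1/2, which is why d can behave almost like a UMVU estimator.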


1980 ◽  
Vol 11 (1) ◽  
pp. 35-40 ◽  
Author(s):  
Peter ter Berg

Maximum likelihood estimation for a Poisson or gamma distribution with a loglinear parametrization of the mean proceeds quite similarly in the two cases. The asymptotic variance–covariance matrix of the maximum likelihood estimator is derived, as well as a linear estimator that can serve as a starting value for the nonlinear search procedure.
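For the Poisson case, the scheme described here can be sketched as follows: a linear estimator (ordinary least squares on log(y + 0.5)) supplies a starting value, and Newton–Raphson then maximizes the log-likelihood. The design, coefficients, and starting-value recipe are our illustrative assumptions, not the paper's:

```python
import numpy as np

# Poisson model with loglinear mean mu_i = exp(x_i' beta).
rng = np.random.default_rng(2)
n = 200
X = np.column_stack([np.ones(n), rng.uniform(-1.0, 1.0, n)])
beta_true = np.array([0.5, 1.0])
y = rng.poisson(np.exp(X @ beta_true))

# Linear starting value: OLS on log(y + 0.5)
beta = np.linalg.lstsq(X, np.log(y + 0.5), rcond=None)[0]

# Newton-Raphson on the log-likelihood
for _ in range(25):
    mu = np.exp(X @ beta)
    score = X.T @ (y - mu)
    fisher = X.T @ (X * mu[:, None])  # its inverse is the asymptotic covariance
    beta = beta + np.linalg.solve(fisher, score)

print(beta)  # close to beta_true for moderate n
```

The Fisher information matrix computed in the loop is exactly the quantity whose inverse gives the asymptotic variance–covariance matrix mentioned in the abstract.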


Author(s):  
Hazim Mansour Gorgees ◽  
Bushra Abdualrasool Ali ◽  
Raghad Ibrahim Kathum

In this paper, the maximum likelihood estimator and the Bayes estimator of the reliability function for the negative exponential distribution are derived; a Monte Carlo simulation technique is then employed to compare the performance of these estimators. The integrated mean square error (IMSE) is used as the criterion for this comparison. The simulation results show that the Bayes estimator performs better than the maximum likelihood estimator across the sample sizes considered.
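A comparison of this kind can be sketched with a short Monte Carlo study. The gamma prior, sample size, grid, and replication count below are our own assumptions (the paper's exact setup is not given in the abstract); under a conjugate Gamma(a, b) prior on the rate λ, the posterior mean of the reliability R(t) = e^{−λt} has the closed form used below:

```python
import numpy as np

# Monte Carlo comparison of MLE vs. Bayes estimators of the exponential
# reliability R(t) = exp(-lambda * t), scored by an integrated MSE over a grid.
rng = np.random.default_rng(3)
lam, n, reps = 1.0, 20, 2000
a, b = 2.0, 2.0                      # assumed Gamma(a, b) prior on lambda
t = np.linspace(0.1, 3.0, 30)        # grid over which the MSE is integrated
R_true = np.exp(-lam * t)

mse_mle = np.zeros_like(t)
mse_bayes = np.zeros_like(t)
for _ in range(reps):
    x = rng.exponential(1.0 / lam, size=n)
    S = x.sum()
    R_mle = np.exp(-(n / S) * t)                   # plug-in: lambda_hat = n / S
    R_bayes = ((b + S) / (b + S + t)) ** (a + n)   # posterior mean of e^{-lambda t}
    mse_mle += (R_mle - R_true) ** 2
    mse_bayes += (R_bayes - R_true) ** 2

dt = t[1] - t[0]
imse_mle = (mse_mle / reps).sum() * dt
imse_bayes = (mse_bayes / reps).sum() * dt
print(imse_mle, imse_bayes)
```

With a prior centered at the true rate, the Bayes estimator's shrinkage typically yields the smaller IMSE, in line with the abstract's conclusion.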


2021 ◽  
Author(s):  
Jakob Raymaekers ◽  
Peter J. Rousseeuw

Many real data sets contain numerical features (variables) whose distribution is far from normal (Gaussian). Instead, their distribution is often skewed. In order to handle such data it is customary to preprocess the variables to make them more normal. The Box–Cox and Yeo–Johnson transformations are well-known tools for this. However, the standard maximum likelihood estimator of their transformation parameter is highly sensitive to outliers, and will often try to move outliers inward at the expense of the normality of the central part of the data. We propose a modification of these transformations as well as an estimator of the transformation parameter that is robust to outliers, so the transformed data can be approximately normal in the center and a few outliers may deviate from it. It compares favorably to existing techniques in an extensive simulation study and on real data.
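The classical (non-robust) maximum-likelihood Yeo–Johnson fit that the paper improves upon is available in scikit-learn; a minimal sketch on skewed data (the robust estimator proposed in the paper itself is not shown here, and the lognormal data and seed are our illustrative choices):

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

# Classical ML fit of the Yeo-Johnson transformation parameter on skewed data.
rng = np.random.default_rng(4)
x = rng.lognormal(mean=0.0, sigma=0.7, size=1000).reshape(-1, 1)

pt = PowerTransformer(method="yeo-johnson", standardize=True)
z = pt.fit_transform(x)

print(pt.lambdas_)         # fitted transformation parameter
print(z.mean(), z.std())   # approximately 0 and 1 after standardization
```

Replacing even a few points of `x` with large outliers would pull `pt.lambdas_` away from the value fitted on clean data, which is precisely the sensitivity the paper's robust estimator addresses.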

