An Approximation Method for a Maximum Likelihood Equation System and Application to the Analysis of Accidents Data

2017 ◽  
Vol 07 (01) ◽  
pp. 132-152
Author(s):  
Assi N’Guessan ◽  
Issa Cherif Geraldo ◽  
Bezza Hafidi
Entropy ◽  
2019 ◽  
Vol 21 (6) ◽  
pp. 596
Author(s):  
Antonio Calcagnì ◽  
Livio Finos ◽  
Gianmarco Altoé ◽  
Massimiliano Pastore

In this article, we provide initial findings regarding the problem of solving likelihood equations by means of a maximum entropy (ME) approach. Unlike standard procedures, which require setting the score function of the maximum likelihood problem to zero, we propose an alternative strategy in which the score is instead used as an external informative constraint on the maximization of the concave Shannon entropy function. The problem involves reparameterizing the score parameters as expected values of discrete probability distributions whose probabilities need to be estimated. This leads to a simpler situation in which the parameters are searched over a smaller (hyper-)simplex space. We assessed our proposal by means of empirical case studies and a simulation study, the latter involving the most critical case of logistic regression under data separation. The results suggest that the maximum entropy reformulation of the score problem solves the likelihood equation problem. Similarly, when maximum likelihood estimation is difficult, as in logistic regression under separation, the maximum entropy proposal achieved results (numerically) comparable to those obtained by Firth's bias-corrected approach. Overall, these first findings indicate that a maximum entropy solution can be considered an alternative technique for solving the likelihood equation.
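The idea of replacing "set the score to zero" with "maximize entropy subject to the score as a constraint" can be sketched numerically. The following is a hypothetical illustration (not the authors' implementation), using the simplest possible case: a Bernoulli parameter p with s successes in n trials, whose MLE is s/n. The support points z, the toy data, and the use of scipy's SLSQP solver are all assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch: maximum-entropy reformulation of the Bernoulli
# likelihood equation. Toy data (not from the article); the MLE is s/n.
n, s = 20, 13

# Reparameterize p as the expected value of a discrete distribution q
# over fixed support points z; q lives on the probability simplex.
z = np.linspace(0.05, 0.95, 19)

def neg_entropy(q):
    q = np.clip(q, 1e-12, None)          # guard against log(0)
    return float(np.sum(q * np.log(q)))  # minimizing -H(q) maximizes H(q)

def score(q):
    p = float(q @ z)                     # p = E_q[z]
    return s / p - (n - s) / (1.0 - p)   # Bernoulli score, constrained to 0

cons = ({"type": "eq", "fun": lambda q: q.sum() - 1.0},  # simplex constraint
        {"type": "eq", "fun": score})                    # score constraint
q0 = np.full(z.size, 1.0 / z.size)       # uniform start inside the simplex
res = minimize(neg_entropy, q0, method="SLSQP",
               bounds=[(0.0, 1.0)] * z.size, constraints=cons)
p_hat = float(res.x @ z)                 # recovers the MLE s/n = 0.65
```

Because the Bernoulli score has a unique root, the constrained entropy maximization lands on the same estimate as the standard likelihood equation; the point of the approach is that the search happens over the simplex of probabilities q rather than directly over p.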


1986 ◽  
Vol 2 (1) ◽  
pp. 1-32 ◽  
Author(s):  
T. W. Anderson ◽  
Naoto Kunitomo ◽  
Kimio Morimune

Comparisons of estimators are made on the basis of their mean squared errors and their concentrations of probability computed by means of asymptotic expansions of their distributions when the disturbance variance tends to zero and alternatively when the sample size increases indefinitely. The estimators include k-class estimators (limited information maximum likelihood, two-stage least squares, and ordinary least squares) and linear combinations of them as well as modifications of the limited information maximum likelihood estimator and several Bayes' estimators. Many inequalities between the asymptotic mean squared errors and concentrations of probability are given. Among median-unbiased estimators, the limited information maximum likelihood estimator dominates the median-unbiased fixed k-class estimator.
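The k-class family nests the estimators being compared: with projection residual-maker M_W = I − W(W′W)⁻¹W′, the estimator b(k) = [x′(I − k·M_W)x]⁻¹ x′(I − k·M_W)y gives OLS at k = 0 and 2SLS at k = 1 (LIML corresponds to a data-dependent k). A minimal simulated sketch, with all model values (coefficients, error structure, sample size) invented for illustration:

```python
import numpy as np

# Hypothetical sketch of the k-class estimator with one endogenous
# regressor x, one instrument w, and outcome y. Since (I - k*M_W) is
# symmetric, b(k) = [((I - k*M_W)x)'x]^{-1} ((I - k*M_W)x)'y.
rng = np.random.default_rng(0)
n, beta = 4000, 2.0                          # toy values
w = rng.normal(size=(n, 1))                  # instrument
u = rng.normal(size=n)                       # structural error
x = w[:, 0] + 0.8 * u + rng.normal(scale=0.5, size=n)  # endogenous regressor
y = beta * x + u

def proj(v):                                 # P_W v without forming P_W
    return (w @ np.linalg.solve(w.T @ w, w.T @ v)).ravel()

def k_class(k):
    # (I - k*M_W) x = (1 - k) x + k * P_W x
    xt = (1.0 - k) * x + k * proj(x)
    return (xt @ y) / (xt @ x)

b_ols = k_class(0.0)                         # biased under endogeneity
b_2sls = k_class(1.0)                        # consistent given the instrument
```

Running this makes the comparison in the abstract concrete: OLS is pulled away from the true coefficient by the correlation between x and u, while the k = 1 (2SLS) member of the class stays close to it.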


1972 ◽  
Vol 9 (01) ◽  
pp. 32-42
Author(s):  
John P. Mullooly

In this paper we derive the probability distributions of the number of molecules and of the lifetime of a molecule in a stochastic rth-order system by direct evaluation of probabilities, avoiding the use of differential-difference equations. Maximum likelihood estimation of the rate constant is based on an observation of the level of the system at time t > 0. We find the asymptotic solution of the likelihood equation for a large initial number of molecules. By comparison with the numerical solution of the likelihood equation, the asymptotic estimator is shown to be a satisfactory approximation for second-order reactions which are far from completion.
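The idea of estimating a rate constant from a single observed level at time t is easiest to see in the first-order (r = 1) case, where the likelihood equation has a closed-form root. A hedged sketch with invented toy values (this is not the paper's second-order analysis, for which no closed form exists and the asymptotic solution is needed):

```python
import numpy as np

# Hypothetical first-order (r = 1) illustration: each of n0 molecules
# independently survives to time t with probability exp(-kappa * t), so
# N(t) | n0 ~ Binomial(n0, exp(-kappa * t)), and the likelihood equation
# for kappa has the closed-form solution kappa_hat = -log(n / n0) / t.
rng = np.random.default_rng(42)
n0, kappa, t = 10_000, 0.5, 1.0              # toy values, not from the paper

n_t = rng.binomial(n0, np.exp(-kappa * t))   # one observed level at time t
kappa_hat = -np.log(n_t / n0) / t            # maximum likelihood estimate
```

With a large initial number of molecules the estimate concentrates around the true rate, mirroring the paper's large-n0 asymptotic regime; for second-order kinetics the survival probability is no longer a simple exponential, which is what motivates the asymptotic treatment.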


2008 ◽  
Vol 52 (3) ◽  
pp. 1315-1322 ◽  
Author(s):  
Nicolas Wicker ◽  
Jean Muller ◽  
Ravi Kiran Reddy Kalathur ◽  
Olivier Poch
