Effects of measurement error on catch-effort estimation

1997 ◽  
Vol 54 (4) ◽  
pp. 898-906
Author(s):  
W R Gould ◽  
L A Stefanski ◽  
K H Pollock

We have investigated the consequences of using imprecise catch and effort estimates in closed-population catch-effort analyses using traditional regression techniques and maximum likelihood to estimate the catchability coefficient and population size parameters. Our simulation study involved adding known amounts of measurement error to error-free catch and effort data to determine the effects of using such estimates of catch and effort rather than the true, and in many cases unknown, quantities. Our results suggest that naive estimation using estimates of catch and effort as true values may bias estimates of population size and the catchability coefficient. In most cases, measurement error in catch and effort inflated the parameter estimates, the magnitude of inflation depending on the size of the measurement error variance. Maximum likelihood estimation proved to be the estimation procedure most robust to the errors in measurement, but it still exhibited measurement-error-induced bias requiring correction. A recently developed simulation-extrapolation method of inference (J.R. Cook and L.A. Stefanski. 1994. J. Am. Stat. Assoc. 89: 1314-1328) is suggested as a possible means for making bias adjustments.
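The kind of naive-estimation bias the study documents can be sketched in a few lines. The Leslie-type regression below is illustrative only (deterministic catches, multiplicative lognormal error on catch, invented parameter values); it is not the authors' design, but it shows how error-prone catch propagates into naive estimates of q and N:

```python
import numpy as np

rng = np.random.default_rng(0)

# Leslie-type catch-effort regression: CPUE_t = q*N - q*K_t, where K_t is
# the cumulative catch removed before period t. Catches are deterministic
# here so that the error-free fit is exact; none of this is the authors' design.
q, N, T = 0.05, 1000.0, 20
E = np.full(T, 5.0)                  # constant effort per period
C, K = np.zeros(T), np.zeros(T)
pop = N
for t in range(T):
    K[t] = N - pop                   # cumulative catch before period t
    C[t] = q * E[t] * pop            # catch in period t
    pop -= C[t]

slope, intercept = np.polyfit(K, C / E, 1)
q_hat, N_hat = -slope, intercept / -slope        # exact: 0.05 and 1000

# Re-estimate after contaminating catch with multiplicative lognormal error
C_err = C * np.exp(rng.normal(0.0, 0.2, T))
K_err = np.concatenate(([0.0], np.cumsum(C_err)[:-1]))
slope_e, intercept_e = np.polyfit(K_err, C_err / E, 1)
q_naive, N_naive = -slope_e, intercept_e / -slope_e
print(q_hat, N_hat)
print(q_naive, N_naive)              # drift away from the true values
```

With error-free data the regression recovers q and N exactly; once catch is error-prone, both the slope and intercept, and hence the naive estimates, move away from the truth.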

1999 ◽  
Vol 56 (7) ◽  
pp. 1234-1240
Author(s):  
W R Gould ◽  
L A Stefanski ◽  
K H Pollock

All catch-effort estimation methods implicitly assume catch and effort are known quantities, whereas in many cases they have been estimated and are subject to error. We evaluate the application of a simulation-based estimation procedure for measurement error models (J.R. Cook and L.A. Stefanski. 1994. J. Am. Stat. Assoc. 89: 1314-1328) in catch-effort studies. The technique involves a simulation component and an extrapolation step, hence the name SIMEX estimation. We describe SIMEX estimation in general terms and illustrate its use with applications to real and simulated catch and effort data. In some cases, correcting for measurement error with SIMEX estimation produced population size and catchability coefficient estimates substantially smaller than the naive estimates that ignore measurement error. In a simulation of the procedure, we compared SIMEX estimators with "naive" estimators that ignore measurement errors in catch and effort to determine the ability of SIMEX to produce bias-corrected estimates. The SIMEX estimators were less biased than the naive estimators but in some cases were also more variable. Despite the bias reduction, the SIMEX estimator had a larger mean squared error than the naive estimator for one of two artificial populations studied. However, our results suggest the SIMEX estimator may outperform the naive estimator in terms of bias and precision for larger populations.
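The simulation and extrapolation steps can be sketched for the simplest measurement error problem, a regression slope with additive error of known variance in the predictor. This is a hedged illustration of the general SIMEX recipe, not the catch-effort likelihood used in the paper; all data and parameter values are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simple linear regression with additive error of known variance
# sigma_u^2 in the predictor; w is the observed, error-prone version of x.
n, beta, sigma_u = 500, 2.0, 0.8
x = rng.normal(0.0, 1.0, n)
y = beta * x + rng.normal(0.0, 0.3, n)
w = x + rng.normal(0.0, sigma_u, n)

def naive_slope(w_obs, y_obs):
    return np.polyfit(w_obs, y_obs, 1)[0]

# Simulation step: add extra error with variance lam * sigma_u^2 and
# average the naive estimator over B replicates for each lam.
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
B = 200
means = [np.mean([naive_slope(w + rng.normal(0.0, np.sqrt(lam) * sigma_u, n), y)
                  for _ in range(B)])
         for lam in lambdas]

# Extrapolation step: fit a quadratic in lambda and evaluate at lambda = -1,
# the hypothetical "no measurement error" point.
simex = np.polyval(np.polyfit(lambdas, means, 2), -1.0)
print(round(naive_slope(w, y), 2), round(simex, 2))
```

The naive slope is attenuated toward zero; the quadratic extrapolation to λ = -1 recovers most of the lost magnitude, mirroring the bias reduction (and the extra variability from the simulation step) reported in the abstract.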


Methodology ◽  
2005 ◽  
Vol 1 (2) ◽  
pp. 81-85 ◽  
Author(s):  
Stefan C. Schmukle ◽  
Jochen Hardt

Abstract. Incremental fit indices (IFIs) are regularly used when assessing the fit of structural equation models. IFIs are based on comparing the fit of a target model with that of a null model. For maximum-likelihood estimation, IFIs are usually computed from the χ2 statistics of the maximum-likelihood fitting function (ML-χ2). However, LISREL recently changed the computation of IFIs: since version 8.52, IFIs reported by LISREL are based on the χ2 statistics of the reweighted least squares fitting function (RLS-χ2). Although both functions lead to the same maximum-likelihood parameter estimates, the two χ2 statistics reach different values. Because these differences are especially large for null models, IFIs are particularly affected. Consequently, RLS-χ2-based IFIs combined with conventional cut-off values established for ML-χ2-based IFIs may lead to the erroneous acceptance of ill-fitting models. We demonstrate this point with a confirmatory factor analysis in a sample of 2449 subjects.
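For concreteness, here is how two common IFIs are computed from a target-model and a null-model χ2. The numbers are invented purely to show the mechanism the abstract describes: holding the target model fixed, a larger null-model χ2 pushes the index up.

```python
def cfi(chi2_t, df_t, chi2_0, df_0):
    # Comparative fit index: 1 minus the target model's share of noncentrality
    num = max(chi2_t - df_t, 0.0)
    den = max(chi2_t - df_t, chi2_0 - df_0, 0.0)
    return 1.0 - num / den if den > 0 else 1.0

def tli(chi2_t, df_t, chi2_0, df_0):
    # Tucker-Lewis index, based on chi-square/df ratios
    return ((chi2_0 / df_0) - (chi2_t / df_t)) / ((chi2_0 / df_0) - 1.0)

# Same target model (chi2 = 150, df = 80), two different null-model chi-squares:
print(round(cfi(150, 80, 2000, 105), 3))  # 0.963
print(round(cfi(150, 80, 600, 105), 3))   # 0.859
print(round(tli(150, 80, 2000, 105), 3))  # 0.952
```

The target model is identical in both CFI calls; only the null-model χ2 differs. That is exactly why basing the null model on RLS-χ2 rather than ML-χ2 can move an index across a conventional cut-off such as .95.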


Symmetry ◽  
2020 ◽  
Vol 12 (5) ◽  
pp. 813
Author(s):  
Anita Rahayu ◽  
Purhadi ◽  
Sutikno ◽  
Dedy Dwi Prastyo

The gamma distribution is a general statistical distribution that can be applied in various fields, mainly when the distribution of the data is not symmetrical. When predictor variables affect a positive-valued outcome, gamma regression plays a role. In many cases, the predictor variables affect several responses simultaneously. In this article, we develop a multivariate gamma regression (MGR), a type of non-linear regression with response variables that follow a multivariate gamma (MG) distribution. This work also provides the parameter estimation procedure, test statistics, and hypothesis testing for the significance of the parameters, both partially and simultaneously. The parameter estimators are obtained using maximum likelihood estimation (MLE), optimized by numerical iteration with the Berndt–Hall–Hall–Hausman (BHHH) algorithm. The simultaneous test for the model’s significance is derived using the maximum likelihood ratio test (MLRT), whereas the partial test uses the Wald test. The proposed MGR model is applied to model the three dimensions of the human development index (HDI) with five predictor variables. The unit of observation is the regency/municipality in Java, Indonesia, in 2018. The empirical results show that a model using multiple predictors makes more sense than one employing only a single predictor.
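A minimal sketch of the estimation idea, reduced to a univariate response with a log link, and with a generic derivative-free optimizer standing in for the BHHH iterations; all data and parameter values are illustrative, not the paper's:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

rng = np.random.default_rng(2)

# Simulate a positive response whose mean depends on one predictor
n = 400
X = np.column_stack([np.ones(n), rng.normal(0.0, 1.0, n)])
beta_true, shape_true = np.array([0.5, 0.8]), 3.0
y = rng.gamma(shape_true, np.exp(X @ beta_true) / shape_true)

def negloglik(theta):
    beta, log_a = theta[:-1], theta[-1]
    a = np.exp(log_a)          # shape kept positive via log transform
    mu = np.exp(X @ beta)      # log link keeps the mean positive
    return -gamma.logpdf(y, a, scale=mu / a).sum()

fit = minimize(negloglik, x0=np.zeros(3), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 5000})
beta_hat, shape_hat = fit.x[:2], np.exp(fit.x[2])
print(np.round(beta_hat, 2), round(shape_hat, 2))
```

The maximum likelihood fit recovers the simulated coefficients and shape closely; the multivariate extension in the paper replaces this single-response likelihood with the MG density and optimizes it with BHHH updates.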


2010 ◽  
Vol 26 (6) ◽  
pp. 1846-1854 ◽  
Author(s):  
Mogens Fosgerau ◽  
Søren Feodor Nielsen

In many stated choice experiments researchers observe the random variables Vt, Xt, and Yt = 1{U + δ⊤Xt + εt < Vt}, t ≤ T, where δ is an unknown parameter and U and εt are unobservable random variables. We show that under weak assumptions the distributions of U and εt, as well as the unknown parameter δ, can be consistently estimated using a sieved maximum likelihood estimation procedure.


2018 ◽  
Vol 7 (3) ◽  
pp. 651-659 ◽  
Author(s):  
Florian M. Hollenbach ◽  
Jacob M. Montgomery ◽  
Adriana Crespo-Tenorio

Bivariate probit models are a common choice for scholars wishing to estimate causal effects in instrumental variable models where both the treatment and outcome are binary. However, standard maximum likelihood approaches for estimating bivariate probit models are problematic. Numerical routines in popular software suites frequently generate inaccurate parameter estimates, and even when the estimates are correct, maximum likelihood routines provide no straightforward way to produce estimates of uncertainty for causal quantities of interest. In this note, we show that adopting a Bayesian approach provides more accurate estimates of key parameters and facilitates the direct calculation of causal quantities along with their attendant measures of uncertainty.
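Whether the model is fit by maximum likelihood or sampled in a Bayesian framework, the object being evaluated is the bivariate probit likelihood. A hedged sketch of that log-likelihood follows, in the seemingly-unrelated form without a treatment term; the data and names are illustrative, not the note's application:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(4)

def biprobit_loglik(beta1, beta2, rho, X, y1, y2):
    # Bivariate probit log-likelihood: each observation contributes an
    # orthant probability of a bivariate normal. Sign flips q = 2y - 1
    # convert P(y = 0) terms into the same orthant form.
    m1, m2 = X @ beta1, X @ beta2
    s1, s2 = 2 * y1 - 1, 2 * y2 - 1
    ll = 0.0
    for a, b, q1, q2 in zip(s1 * m1, s2 * m2, s1, s2):
        r = q1 * q2 * rho            # sign-adjusted correlation
        ll += np.log(multivariate_normal.cdf([a, b], cov=[[1, r], [r, 1]]))
    return ll

# With rho = 0 the likelihood factors into two univariate probits,
# which gives a quick correctness check of the bookkeeping.
n = 50
X = np.column_stack([np.ones(n), rng.normal(0.0, 1.0, n)])
b1, b2 = np.array([0.2, 1.0]), np.array([-0.3, 0.5])
y1 = (X @ b1 + rng.normal(0.0, 1.0, n) > 0).astype(int)
y2 = (X @ b2 + rng.normal(0.0, 1.0, n) > 0).astype(int)
joint = biprobit_loglik(b1, b2, 0.0, X, y1, y2)
factored = (np.log(norm.cdf((2 * y1 - 1) * (X @ b1)))
            + np.log(norm.cdf((2 * y2 - 1) * (X @ b2)))).sum()
print(round(joint, 3), round(factored, 3))
```

A Bayesian treatment would place priors on (β1, β2, ρ) and sample from the posterior proportional to this likelihood times the prior, which is what makes uncertainty for derived causal quantities directly available from the draws.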


2019 ◽  
Vol 2019 ◽  
pp. 1-8 ◽  
Author(s):  
Fan Yang ◽  
Hu Ren ◽  
Zhili Hu

Maximum likelihood estimation is a widely used approach to parameter estimation. However, conventional algorithms make the estimation procedure difficult for the three-parameter Weibull distribution. This paper therefore proposes an evolutionary strategy, based on the maximum likelihood method, to search for good solutions. The maximization of the likelihood function is converted into an optimization problem, and an evolutionary algorithm is employed to obtain the optimal parameters. Examples are presented to demonstrate the proposed method. The results show that it is suitable for parameter estimation of the three-parameter Weibull distribution.
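A hedged sketch of the idea, with SciPy's differential evolution standing in for the paper's evolutionary strategy; the simulated data, bounds, and parameter values are illustrative:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(3)

# Simulate from a three-parameter Weibull: shape k, scale lam, location loc
shape, scale, location = 2.0, 5.0, 10.0
x = location + scale * rng.weibull(shape, 500)

def negloglik(theta):
    # Negative log-likelihood of the three-parameter Weibull; infeasible
    # points (x <= loc, or non-positive shape/scale) are rejected with inf.
    k, lam, loc = theta
    z = x - loc
    if k <= 0 or lam <= 0 or np.any(z <= 0):
        return np.inf
    return -(np.log(k / lam) + (k - 1) * np.log(z / lam)
             - (z / lam) ** k).sum()

# Population-based global search over box bounds; the location bound must
# keep x - loc positive for every observation.
bounds = [(0.1, 10.0), (0.1, 20.0), (0.0, x.min() - 1e-6)]
fit = differential_evolution(negloglik, bounds, seed=0, tol=1e-8)
k_hat, lam_hat, loc_hat = fit.x
print(np.round(fit.x, 2))
```

The support constraint x > γ is precisely what makes gradient-based routines fragile for this distribution, and it is handled naturally by a bounded population-based search of this kind.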


1983 ◽  
Vol 40 (12) ◽  
pp. 2153-2169 ◽  
Author(s):  
Jon Schnute

This paper presents a new approach to the use of removal data in estimating the size of a population of fish or other animals. The theory admits a variety of assumptions on how catchability varies among fishings including the assumption of constant catchability, which underlies most previous work. The methods here hinge on maximum likelihood estimation, and they can be used both to decide objectively if the data justify rejecting constant catchability and to determine confidence intervals for the parameters. The work includes a new method of assigning confidence to the population estimate and points out problems with methods currently available in the literature, even in the case of constant catchability. The theory is applied both to data in historical literature and to more recent data from streams in New Brunswick, Canada. These examples demonstrate that the assumption of constant catchability can frequently lead to serious errors in data interpretation. In some cases, the conclusion that the population size is well known may be blatantly false, and reasonable estimates may be impossible without further data.
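The constant-catchability special case that the paper generalizes can be sketched directly: with k removal passes and capture probability p per pass, the catch in each pass is binomial in the animals remaining, and p can be profiled out in closed form for each candidate N. The removal counts below are hypothetical, not the paper's data:

```python
import numpy as np
from scipy.special import gammaln

catches = np.array([65, 43, 34, 18, 12])   # hypothetical removal counts
k, total = len(catches), catches.sum()
prior = np.concatenate(([0], np.cumsum(catches)[:-1]))  # removed before pass i

def profile_nll(N):
    # Negative log-likelihood with p replaced by its closed-form MLE for
    # this N: catch_i ~ Binomial(N - prior_i, p).
    exposed = N - prior                    # animals at risk in each pass
    if np.any(exposed < catches):
        return np.inf
    p = total / exposed.sum()              # p-hat given N
    ll = (gammaln(exposed + 1) - gammaln(catches + 1)
          - gammaln(exposed - catches + 1)
          + catches * np.log(p)
          + (exposed - catches) * np.log1p(-p)).sum()
    return -ll

Ns = np.arange(total, total + 400)         # candidate population sizes
nll = np.array([profile_nll(N) for N in Ns])
N_hat = int(Ns[np.argmin(nll)])
p_hat = total / (k * N_hat - prior.sum())
print(N_hat, round(p_hat, 3))
```

Relaxing the single constant p to pass-specific catchability models, and comparing the resulting likelihoods, is the kind of objective test of constant catchability the paper develops.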


2021 ◽  
Author(s):  
Jan Steinfeld ◽  
Alexander Robitzsch

This article describes conditional maximum likelihood-based item parameter estimation in probabilistic multistage designs. In probabilistic multistage designs, routing into a module is based not solely on a raw score j, a cut score c, and a rule such as j < c or j ≤ c, but on a probability p(j) for each raw score j. It can be shown that the use of a conventional conditional maximum likelihood parameter estimate in multistage designs leads to severely biased item parameter estimates. Zwitser and Maris (2013) showed that with deterministic routing, integrating the design into the item parameter estimation leads to unbiased estimates. This article extends this approach to probabilistic routing and, at the same time, represents a generalization. A simulation study shows that item parameter estimation in probabilistic designs leads to unbiased item parameter estimates.

