A general solution for a class of weakly constrained linear regression problems

Psychometrika, 1991, Vol 56 (4), pp. 601-609
Author(s): Jos M. F. ten Berge

2008, Vol 04 (02), pp. 123-141
Author(s): Areeg Abdalla, James Buckley

We apply our new fuzzy Monte Carlo method to certain fuzzy non-linear regression problems to estimate the best solution. The best solution is a vector of triangular fuzzy numbers, for the fuzzy coefficients in the model, which minimizes an error measure. We use a quasi-random number generator to produce random sequences of these fuzzy vectors which uniformly fill the search space. We consider example problems to show that this Monte Carlo method obtains solutions comparable to those obtained by an evolutionary algorithm.
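The search loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration only: a triangular fuzzy number is represented as a sorted triple (left, peak, right), candidates are drawn with a simple van der Corput/Halton quasi-random generator, and the error measure compares only the peaks of the fuzzy coefficients to crisp data on a simple model for brevity (the paper targets non-linear models and a fuzzy error measure; none of the names or choices below come from the paper).

```python
# Hypothetical sketch of the fuzzy Monte Carlo idea: a triangular fuzzy
# number is a triple (left, peak, right) with left <= peak <= right.
# We search for fuzzy coefficients of y = a*x + b by sampling candidate
# triples quasi-uniformly and keeping the lowest-error vector.

def halton(i, base):
    """i-th element of the van der Corput sequence in the given base
    (a simple quasi-random generator)."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def triangular(lo, hi, u1, u2, u3):
    """Map three uniforms in [0, 1] to a sorted triple in [lo, hi]."""
    return tuple(sorted(lo + u * (hi - lo) for u in (u1, u2, u3)))

# crisp data generated from y = 2*x + 1 (illustrative only)
data = [(x, 2 * x + 1) for x in range(5)]

def error(a_tri, b_tri):
    """One possible error measure: squared distance between the fuzzy
    prediction's peak and the observed crisp y (the paper's measure
    compares fuzzy outputs; this peak-only version is a simplification)."""
    a_peak, b_peak = a_tri[1], b_tri[1]
    return sum((a_peak * x + b_peak - y) ** 2 for x, y in data)

best, best_err = None, float("inf")
for i in range(1, 4001):
    u = [halton(i, b) for b in (2, 3, 5, 7, 11, 13)]  # 6-dim quasi-random point
    a_tri = triangular(0.0, 4.0, *u[:3])
    b_tri = triangular(0.0, 4.0, *u[3:])
    e = error(a_tri, b_tri)
    if e < best_err:
        best, best_err = (a_tri, b_tri), e

print(best, best_err)  # best peaks should lie near a = 2, b = 1
```

Because the quasi-random sequence fills the six-dimensional search space uniformly, the best retained vector improves steadily with the sample count, which is the property the abstract relies on when comparing against an evolutionary algorithm.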


1986, Vol 62 (1), pp. 16-19
Author(s): P. L. Marshall, J. P. Demaerschalk

Simple linear regression is widely used in forestry, but often only a vaguely defined strategy for selecting sampling units is followed. Trial and error methods exist for aiding efficient sample allocation for simple linear regression purposes. These methods are computationally tedious and often impractical without the aid of a computer. This paper briefly describes a computerized iterative search procedure that can provide an efficient design for sample allocation in simple linear regression problems with equal or unequal sampling costs and balanced or unbalanced prediction intervals. Potential savings gained by employing an efficient design over other more easily derived but less efficient designs are illustrated by an example. Key words: Simple linear regression, optimal sampling design, iterative search procedure.
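A greedy variant of such an iterative search can be sketched as follows. This is a hedged illustration, not the paper's procedure: it assumes equal sampling costs and minimizes the average prediction variance 1/n + (x0 − x̄)²/Sxx over a set of prediction points, repeatedly moving one sampling unit between x levels whenever the move reduces the objective. All names and the example numbers are invented for the sketch.

```python
# Hypothetical greedy version of an iterative-search sample allocation:
# distribute a fixed budget of units across candidate x levels so that
# the average prediction variance  1/n + (x0 - xbar)^2 / Sxx  over the
# prediction points is minimized (equal sampling costs assumed).

def avg_pred_variance(levels, counts, pred_points):
    n = sum(counts)
    xbar = sum(x * c for x, c in zip(levels, counts)) / n
    sxx = sum(c * (x - xbar) ** 2 for x, c in zip(levels, counts))
    if sxx == 0:
        return float("inf")  # degenerate design: no slope estimable
    return sum(1 / n + (x0 - xbar) ** 2 / sxx
               for x0 in pred_points) / len(pred_points)

def iterative_allocation(levels, total, pred_points):
    k = len(levels)
    counts = [total // k] * k          # start from a roughly even design
    counts[0] += total - sum(counts)
    improved = True
    while improved:
        improved = False
        best = avg_pred_variance(levels, counts, pred_points)
        move = None
        for i in range(k):             # try every single-unit move i -> j
            if counts[i] == 0:
                continue
            for j in range(k):
                if i == j:
                    continue
                counts[i] -= 1; counts[j] += 1
                v = avg_pred_variance(levels, counts, pred_points)
                counts[i] += 1; counts[j] -= 1
                if v < best:
                    best, move = v, (i, j)
        if move:                       # accept the best improving move
            i, j = move
            counts[i] -= 1; counts[j] += 1
            improved = True
    return counts

levels = [1.0, 2.0, 3.0, 4.0, 5.0]
counts = iterative_allocation(levels, 20, pred_points=[1.0, 3.0, 5.0])
print(counts)
```

Each accepted move strictly lowers the objective, so the search terminates; for this symmetric example the units tend to pile up at the extreme x levels, which maximizes Sxx, illustrating the kind of efficient design the paper's procedure produces.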


2021, Vol 4, pp. 28-37
Author(s): Alexander Nakonechnyi, Grigoriy Kudin, Taras Zinko, Petr Zinko, ...

Parameter estimation in linear regression problems with random matrix coefficients is investigated. Given observations of random linear functions of unknown matrices, with random errors whose correlation matrices are unknown, the problem of guaranteed mean-square estimation of linear functions of those matrices is studied. Upper and lower bounds on the guaranteed mean-square errors of linear estimates are obtained for the case where the sets containing the unknown matrices and the correlation matrices of the observation errors are known, and these bounds are shown to be exact in certain special cases. Under the assumption that these sets are bounded, convex, and closed, sharper two-sided bounds on the guaranteed errors are derived, and conditions are found under which the guaranteed mean-square errors tend to zero as the number of observations grows. Necessary and sufficient conditions for the unbiasedness of linear estimates of linear functions of matrices are given. The notion of a quasi-optimal estimate of a linear function of matrices is introduced, and it is proved that within the class of unbiased estimates a quasi-optimal estimate exists and is unique; conditions under which its guaranteed mean-square error converges to zero are obtained. For linear estimates of unknown matrices, the concept of a quasi-minimax estimate is also introduced, and such estimates are shown to be unbiased. For special sets containing the unknown matrix and the error correlation matrices, quasi-minimax estimates are expressed through the solution of linear operator equations in a finite-dimensional space, and, under certain assumptions, the form of the guaranteed mean-square error of the unknown matrix is found; such errors are bounded by a sum of traces of known matrices.
An example of a minimax unbiased linear estimate is given for a special type of random matrices entering the observation equation.


1992, Vol 17 (1), pp. 51-74
Author(s): Clifford C. Clogg, Eva Petkova, Edward S. Shihadeh

We give a unified treatment of statistical methods for assessing collapsibility in regression problems, including some possible extensions to the class of generalized linear models. Terminology is borrowed from the contingency table area, where various methods for assessing collapsibility have been proposed. Our procedures, however, can be motivated by considering extensions, and alternative derivations, of common procedures for omitted-variable bias in linear regression. Exact tests and interval estimates with optimal properties are available for linear regression with normal errors, and asymptotic procedures follow for models with estimated weights. The methods given here can be used to compare β1 and β2 in the common setting where the response function is first modeled as Xβ1 (reduced model) and then as Xβ2 + Zγ (full model), with Z a vector of covariates omitted from the reduced model. These procedures can be used in experimental settings (X = randomly assigned treatments, Z = covariates) or in nonexperimental settings where two models viewed as alternative behavioral or structural explanations are compared (one model with X only, another with X and Z).
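The reduced-vs-full comparison underlying these procedures is easy to demonstrate numerically. The sketch below is illustrative only (the data and model are invented, not from the paper): it fits y on X alone and on X and Z together, then checks the classical omitted-variable identity β1 − β2 = (X′X)⁻¹X′Z γ̂, which follows directly from the normal equations of the full model.

```python
import numpy as np

# Illustrative check of the omitted-variable identity: fit the reduced
# model y ~ X (giving beta1) and the full model y ~ X + Z (giving beta2
# and gamma), then verify  beta1 - beta2 = (X'X)^{-1} X'Z gamma_hat.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])        # intercept + one regressor
Z = (0.6 * X[:, 1] + rng.normal(size=n)).reshape(-1, 1)      # Z correlated with X
y = X @ np.array([1.0, 2.0]) + 1.5 * Z[:, 0] + rng.normal(size=n)

beta1, *_ = np.linalg.lstsq(X, y, rcond=None)                # reduced model
full, *_ = np.linalg.lstsq(np.hstack([X, Z]), y, rcond=None) # full model
beta2, gamma = full[:2], full[2:]

bias = np.linalg.solve(X.T @ X, X.T @ Z @ gamma)             # (X'X)^{-1} X'Z gamma_hat
print(np.allclose(beta1 - beta2, bias))                      # identity holds (up to floating point)
```

Because the identity is algebraic, it holds for any data set; the statistical content of the paper lies in attaching exact tests and interval estimates to the difference β1 − β2 rather than in the identity itself.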

