Parameter Estimation for the Constrained Gravity Model: A Comparison of Six Methods

1976 ◽  
Vol 8 (6) ◽  
pp. 673-683 ◽  
Author(s):  
F Stetzer

Six methods of parameter estimation for the production-constrained gravity model are compared in the context of interurban consumer travel. Monte Carlo experiments reveal that the nonlinear methods (Batty and Mackie, 1972) are inferior to the linear methods (Nakanishi and Cooper, 1974) when specification error is present, but that the latter rapidly lose their advantage as sample sizes decrease. Bias in the parameter estimates is a more serious source of error than sampling variation.
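The production-constrained model underlying this comparison can be sketched concretely. The Python below is an illustrative assumption, not one of the six methods the paper compares: it builds the constrained flow matrix T[i][j] = O[i]·A[i]·W[j]·exp(−β·c[i][j]) and recovers the deterrence parameter β by bisection on the mean trip cost (all function names are hypothetical).

```python
import math

def gravity_flows(O, W, cost, beta):
    """Production-constrained gravity model: balancing factors A[i]
    guarantee that each origin's outflows sum to O[i]."""
    T = []
    for i, Oi in enumerate(O):
        weights = [W[j] * math.exp(-beta * cost[i][j]) for j in range(len(W))]
        s = sum(weights)
        T.append([Oi * w / s for w in weights])
    return T

def mean_cost(T, cost):
    """Flow-weighted mean trip cost of a predicted flow matrix."""
    num = sum(T[i][j] * cost[i][j] for i in range(len(T)) for j in range(len(T[0])))
    return num / sum(map(sum, T))

def calibrate_beta(O, W, cost, target_cost, lo=1e-6, hi=10.0, iters=60):
    """Bisection on beta: the predicted mean cost falls as beta grows,
    so match it to the observed mean cost."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_cost(gravity_flows(O, W, cost, mid), cost) > target_cost:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With flows generated at a known β, the calibration recovers that β from the mean cost alone, which is the kind of ground truth a Monte Carlo comparison of estimators relies on.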

Author(s):  
Peter Green ◽  
Simon Maskell

In this paper the authors present a method which facilitates computationally efficient parameter estimation of dynamical systems from a continuously growing set of measurement data. It is shown that the proposed method, which utilises Sequential Monte Carlo samplers, is guaranteed to be fully parallelisable (in contrast to Markov chain Monte Carlo methods) and can be applied to a wide variety of scenarios within structural dynamics. Its ability to let the parameter estimates converge as more data are analysed sets it apart from other sequential methods, such as the particle filter.
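The sequential idea can be illustrated with a minimal single-parameter sketch, under stated assumptions: a toy Gaussian measurement model, a reweight/resample step per new datum, and a Gaussian jitter as a crude stand-in for the MCMC rejuvenation move of a full SMC sampler. None of this is the authors' implementation; the per-particle reweighting is the part that parallelises trivially.

```python
import math
import random

random.seed(0)

def normal_logpdf(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def smc_step(particles, logw, y, sigma=1.0, jitter=0.05):
    """Assimilate one new measurement: reweight every particle by its
    likelihood (embarrassingly parallel), resample when the effective
    sample size collapses, then jitter to restore particle diversity."""
    logw = [lw + normal_logpdf(y, th, sigma) for lw, th in zip(logw, particles)]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]
    s = sum(w)
    w = [wi / s for wi in w]
    if 1.0 / sum(wi ** 2 for wi in w) < 0.5 * len(particles):
        particles = random.choices(particles, weights=w, k=len(particles))
        particles = [th + random.gauss(0.0, jitter) for th in particles]
        logw = [0.0] * len(particles)
    else:
        logw = [math.log(max(wi, 1e-300)) for wi in w]
    return particles, logw

# Toy data stream: y_t ~ N(theta_true, 1) with theta_true = 2
particles = [random.gauss(0.0, 5.0) for _ in range(500)]
logw = [0.0] * 500
for _ in range(200):
    particles, logw = smc_step(particles, logw, random.gauss(2.0, 1.0))
m = max(logw)
w = [math.exp(lw - m) for lw in logw]
s = sum(w)
theta_hat = sum(wi / s * th for wi, th in zip(w, particles))
```

As the stream grows, the weighted particle cloud concentrates on the true parameter, which is the convergence-with-more-data property the abstract highlights.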


1983 ◽  
Vol 245 (2) ◽  
pp. R135-R142
Author(s):  
E. M. Haacke ◽  
M. D. Goldman

We present the application of a weighted least-squares technique to extract parameter estimates in linear models when all variables are subject to error and the goal of the investigation is the parameter values themselves. We assume that the relative variances of the variables are known and that the errors in different variables are independent. The method of parameter estimation for linear functional relationships is presented, and we describe its differences from linear regression. We discuss how to obtain confidence intervals for the parameter estimates with an emphasis on computer Monte Carlo simulations. An explicit example related to measurements of lung volume changes is presented. An eigenvalue analysis of the data pertaining to the number of independent variables and a physical interpretation of the data space are also discussed.
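One standard instance of fitting a linear functional relationship when both variables carry error is Deming regression, where the ratio of the two error variances is assumed known. The sketch below is a generic illustration of that errors-in-variables idea, not the paper's exact weighting scheme; with `lam = 1` it reduces to orthogonal regression.

```python
import math

def deming_fit(x, y, lam=1.0):
    """Slope b and intercept a of the linear functional relationship
    y = a + b*x when both x and y are measured with error, with
    lam = var(error in y) / var(error in x) assumed known.
    (Requires a nonzero sample covariance between x and y.)"""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x) / n
    syy = sum((yi - ybar) ** 2 for yi in y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / n
    b = (syy - lam * sxx
         + math.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return ybar - b * xbar, b
```

Unlike ordinary regression of y on x, this estimator treats the two variables symmetrically, which is exactly the distinction from linear regression the abstract draws.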


1990 ◽  
Vol 47 (3) ◽  
pp. 516-519 ◽  
Author(s):  
Carl J. Walters

Stock–recruitment time series often give a distorted picture of average recruitment rates, with high productivities per spawner being overrepresented at low stock sizes. This distortion is exaggerated by autocorrelation among years in environmental effects on productivity. The common procedure of fitting a stock–recruit curve and then analysing residuals from the curve will result in a substantial underestimate of the autocorrelation among environmental effects. Previous studies have recommended using Monte Carlo simulations to estimate the bias in stock–recruit model parameter estimates. These simulations can generally be avoided by using a simple correction equation. However, deviations from the corrected stock–recruit curve will not give better estimates of autocorrelation patterns in environmental effects, and hence will not help to provide better forecasts and stronger tests for factors that may be causing the effects.
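The underestimation effect is easy to reproduce in miniature. The sketch below simulates Ricker stock-recruit dynamics with AR(1) environmental effects, fits the curve by ordinary least squares, and compares the residuals' lag-1 autocorrelation with that of the true effects; all parameter values are arbitrary assumptions, and this demonstrates the bias only, not the paper's correction equation.

```python
import math
import random

random.seed(1)

def lag1_autocorr(x):
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, n))
    den = sum((xi - m) ** 2 for xi in x)
    return num / den

def simulate_and_fit(T=2000, a=1.0, b=0.002, rho=0.7, sd=0.3):
    """Ricker dynamics R = S*exp(a - b*S + v) with AR(1) effects v_t.
    Fit by OLS of log(R/S) on S; spawners feed back on past effects,
    which is what distorts the residual autocorrelation."""
    S, v = 300.0, 0.0
    Ss, ys, vs = [], [], []
    for _ in range(T):
        v = rho * v + math.sqrt(1 - rho ** 2) * random.gauss(0.0, sd)
        R = S * math.exp(a - b * S + v)
        Ss.append(S)
        ys.append(math.log(R / S))
        vs.append(v)
        S = R  # recruits become the next generation's spawners
    mS = sum(Ss) / T
    my = sum(ys) / T
    slope = (sum((s - mS) * (y - my) for s, y in zip(Ss, ys))
             / sum((s - mS) ** 2 for s in Ss))
    resid = [y - (my + slope * (s - mS)) for s, y in zip(Ss, ys)]
    return lag1_autocorr(vs), lag1_autocorr(resid)

true_rho, resid_rho = simulate_and_fit()
```

The residual autocorrelation comes out below the autocorrelation of the simulated environmental effects, illustrating the underestimate the abstract describes.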


Mathematics ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 817
Author(s):  
Fernando López ◽  
Mariano Matilla-García ◽  
Jesús Mur ◽  
Manuel Ruiz Marín

A novel general method for constructing nonparametric hypothesis tests, based on the field of symbolic analysis, is introduced in this paper. Several existing tests based on symbolic entropy that have been used for testing central hypotheses in several branches of science (particularly in economics and statistics) are particular cases of this general approach. This family of symbolic tests uses few assumptions, which increases the general applicability of any symbolic-based test. Additionally, as a theoretical application of this method, we construct and put forward four new statistics to test for the null hypothesis of spatiotemporal independence. There are very few tests in the specialized literature in this regard. The new tests were evaluated by means of several Monte Carlo experiments. The results highlight the outstanding performance of the proposed tests.
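The symbolic-entropy idea can be made concrete with ordinal patterns. The sketch below is a generic toy for serial (not spatiotemporal) independence and is not one of the four proposed statistics: each length-m window is mapped to the permutation that sorts it, and low symbolic entropy relative to shuffled surrogates signals temporal structure.

```python
import math
import random

random.seed(2)

def symbolize(x, m=3):
    """Map each length-m window of the series to its ordinal pattern."""
    return [tuple(sorted(range(m), key=lambda k: x[i + k]))
            for i in range(len(x) - m + 1)]

def symbolic_entropy(x, m=3):
    """Shannon entropy of the empirical symbol distribution."""
    syms = symbolize(x, m)
    n = len(syms)
    counts = {}
    for s in syms:
        counts[s] = counts.get(s, 0) + 1
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def independence_pvalue(x, m=3, reps=200):
    """Permutation test: under serial independence, shuffling the series
    should not systematically raise the symbolic entropy above the
    observed value."""
    h = symbolic_entropy(x, m)
    y = list(x)
    hits = 0
    for _ in range(reps):
        random.shuffle(y)
        if symbolic_entropy(y, m) <= h:
            hits += 1
    return hits / reps
```

A strictly monotone series uses a single symbol (entropy zero) and is rejected outright, while an i.i.d. series spreads mass over all m! patterns and approaches the maximal entropy log(m!).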


2008 ◽  
Vol 10 (2) ◽  
pp. 153-162 ◽  
Author(s):  
B. G. Ruessink

When a numerical model is to be used as a practical tool, its parameters should preferably be stable and consistent, that is, possess a small uncertainty and be time-invariant. Using data and predictions of alongshore mean currents flowing on a beach as a case study, this paper illustrates how parameter stability and consistency can be assessed using Markov chain Monte Carlo. Within a single calibration run, Markov chain Monte Carlo estimates the parameter posterior probability density function, its mode being the best-fit parameter set. Parameter stability is investigated by adding new data stepwise to a calibration run, while consistency is examined by calibrating the model on different datasets of equal length. The results for the present case study indicate that various tidal cycles with strong (say, >0.5 m/s) currents are required to obtain stable parameter estimates, and that the best-fit model parameters and the underlying posterior distribution are strongly time-varying. This inconsistent parameter behavior may reflect unresolved variability of the processes represented by the parameters, or may represent compensatory behavior for temporal violations in specific model assumptions.
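The calibration machinery can be sketched in miniature: a random-walk Metropolis sampler on a toy Gaussian model, with the stability check mimicked by calibrating on a short record and then a longer one and watching the posterior spread shrink. The likelihood and all names here are assumptions for illustration, not the paper's alongshore-current model.

```python
import math
import random

random.seed(3)

def log_post(theta, data, sigma=1.0):
    # Flat prior; toy Gaussian likelihood standing in for the real model
    return -sum((y - theta) ** 2 for y in data) / (2 * sigma ** 2)

def metropolis(data, n_iter=5000, step=0.3, theta0=0.0):
    """Random-walk Metropolis: samples the parameter posterior; the mode
    of the retained draws is the best-fit parameter."""
    theta, lp = theta0, log_post(theta0, data)
    chain = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        lp_prop = log_post(prop, data)
        if math.log(random.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain[n_iter // 2:]  # discard burn-in

def spread(chain):
    """Posterior standard deviation estimated from the chain."""
    m = sum(chain) / len(chain)
    return math.sqrt(sum((c - m) ** 2 for c in chain) / len(chain))

# Stability check in miniature: short record first, then the full record
data = [random.gauss(1.5, 1.0) for _ in range(100)]
sd_small = spread(metropolis(data[:10]))
sd_full = spread(metropolis(data))
```

With ten times the data the posterior spread contracts markedly, which is the sense in which added data stabilise the estimates; time-varying best-fit values across equal-length datasets would signal the inconsistency the abstract reports.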

