Stochastic Models and Fluctuations in Reversal Time of Ambiguous Figures

Perception ◽  
1977 ◽  
Vol 6 (6) ◽  
pp. 645-656 ◽  
Author(s):  
Angelo de Marco ◽  
Piero Penengo ◽  
Aurelia Trabucco ◽  
Antonio Borsellino ◽  
Franco Carlini ◽  
...  

Five probability distributions for the description of temporal fluctuations in the perception of ambiguous figures were fitted to previously obtained experimental results and classified according to their efficiency in describing the data. The gamma, Wiener, and Capocelli-Ricciardi distributions showed the highest efficiency, while the χ² and Taylor-Aldridge distributions showed a very low efficiency. Therefore the underlying process may be described either by a simple Poisson model or by a random-walk model. For the gamma distribution there was a strong correlation between the parameters, while for the Wiener distribution this correlation was lower.
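
The simple Poisson model can be sketched as a k-stage process: a reversal occurs only after k independent exponential stages, so reversal times follow a gamma distribution with shape k. A minimal simulation of this idea (the stage count and rate below are illustrative assumptions, not fitted values from the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simple Poisson model: a reversal fires after k independent exponential
# stages, so reversal times are gamma-distributed with shape k.
k, rate = 4, 2.0                       # illustrative, not fitted values
times = rng.exponential(1.0 / rate, size=(10_000, k)).sum(axis=1)

# Fitting a gamma distribution recovers the stage structure
shape, _, scale = stats.gamma.fit(times, floc=0)
print(shape, scale)                    # shape near 4, scale near 0.5
```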

Author(s):  
Christian Gollier

This chapter aims to provide a unified theoretical foundation for the term structure of discount rates. To do so, it develops a benchmark model based on two assumptions: individual preferences toward risk, and the nature of the uncertainty over economic growth. It was shown previously that constant relative risk aversion, combined with a random walk for the growth of log consumption, yields a flat term structure of efficient discount rates. In this chapter these two assumptions are relaxed using a stochastic dominance approach. Stochastic models of economic growth with mean reversion, Markov switches, and parametric uncertainty all exhibit some form of positive statistical dependence between successive growth rates. Because this dependence magnifies long-term risk, it is the driving force behind the decreasing term structure.
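
The flat-versus-decreasing contrast can be made concrete with the extended Ramsey rule under CRRA utility and Gaussian log-consumption growth; the parameter values below are illustrative, not the chapter's calibration:

```python
import numpy as np

# Extended Ramsey rule: r_t = delta + gamma*mu - 0.5*gamma^2*Var(ln c_t - ln c_0)/t.
# Growth rates follow a stationary AR(1) with autocorrelation rho; rho = 0 is
# the random-walk benchmark, rho > 0 models positively dependent growth.
delta, gamma, mu, sigma = 0.0, 2.0, 0.02, 0.02   # illustrative parameters

def var_sum(t, rho):
    """Variance of the sum of t stationary AR(1) growth rates."""
    s2 = sigma**2 / (1 - rho**2)
    k = np.arange(1, t)
    return s2 * (t + 2 * np.sum((t - k) * rho**k))

def r(t, rho):
    return delta + gamma * mu - 0.5 * gamma**2 * var_sum(t, rho) / t

flat = [r(t, 0.0) for t in (1, 10, 50)]   # random walk: flat term structure
decl = [r(t, 0.7) for t in (1, 10, 50)]   # positive dependence: decreasing
```

With rho = 0 the variance grows linearly and r_t is constant in t; with rho > 0 the variance grows faster than linearly, so r_t declines with maturity.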


1979 ◽  
Vol 81 ◽  
pp. 299-301
Author(s):  
Tsuko Nakamura

The original nearly parabolic orbits of comets are known to evolve toward short-period elliptic orbits as the statistical result of hundreds of encounters with Jupiter. There seem to be two methods of handling this process: exact numerical integration of each orbit (Everhart, 1972), and a random-walk approach using probability distributions of the perturbations produced by single encounters (Lyttleton and Hammersley, 1963; Shteins, 1972). Since both methods need a great number of input parabolic comets to yield only a few tens of short-period ones, the second method may save time compared with the first, which is in turn more accurate. The purpose of this paper is to clarify the characteristics of single-encounter effects, in order to develop the second method more elaborately and extensively.
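
The random-walk picture can be sketched by tracking the reciprocal semimajor axis z = 1/a, perturbed by a random kick at each Jupiter encounter. The Gaussian kick distribution and the thresholds below are illustrative assumptions, not the single-encounter statistics the paper sets out to derive:

```python
import numpy as np

rng = np.random.default_rng(1)

# Track z = 1/a (AU^-1) for near-parabolic comets (z close to 0). Each
# encounter adds a random kick; z < 0 means ejection on a hyperbolic orbit,
# while z above z_short means capture into a short-period orbit.
n_comets, n_enc, sigma_kick = 10_000, 500, 1e-3
z0, z_short = 1e-4, 1 / 17.0

kicks = rng.normal(0.0, sigma_kick, size=(n_comets, n_enc))
z = z0 + np.cumsum(kicks, axis=1)

# First encounter index at which each boundary is crossed (n_enc if never)
t_eject = np.where((z < 0).any(axis=1), (z < 0).argmax(axis=1), n_enc)
t_capt = np.where((z > z_short).any(axis=1), (z > z_short).argmax(axis=1), n_enc)

frac_captured = np.mean((t_capt < t_eject) & (t_capt < n_enc))
frac_ejected = np.mean(t_eject < t_capt)
print(frac_captured, frac_ejected)   # very few captures: many inputs needed
```

The tiny captured fraction illustrates why both methods need so many input parabolic comets per short-period outcome.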


2018 ◽  
Vol 23 ◽  
pp. 00001
Author(s):  
Katarzyna Baran-Gurgul

Based on 30-year sequences of 24-hour flows at 69 water gauging stations in the Upper Vistula catchment, it was determined that the probability distributions of low-flow duration and of its maximum annual deficit can be described by the gamma distribution, with parameters estimated by three methods: MOM (method of moments), LMOM (method of L-moments), and MLE (maximum likelihood estimation). The stationarity of the time series was tested with the Mann-Kendall correlation using the Hamed and Rao variance correction. Low flows were defined by the SPA method, with the threshold flow Q70%. Goodness of fit was tested with the Anderson-Darling test. This test allowed the gamma distribution to be accepted in all analysed cases, regardless of the parameter-estimation method, since the p-values were greater than 5% (over 18% for Tmax and 7.5% for Vmax). The highest p-values for individual water gauging stations, as well as the highest 90% Tmax and Vmax quantiles, were obtained when LMOM was used to estimate the gamma distribution parameters. The highest 90% Tmax and Vmax quantiles were observed in the uppermost part of the studied area.
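
Two of the estimators can be contrasted on a synthetic gamma sample; the data below are simulated rather than the Vistula records, and a Kolmogorov-Smirnov test stands in for the Anderson-Darling test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic low-flow deficit sample (illustrative, not the Vistula data)
x = rng.gamma(shape=2.5, scale=40.0, size=3000)

# MOM: match sample moments (mean = k*theta, variance = k*theta^2)
theta_mom = x.var(ddof=1) / x.mean()
k_mom = x.mean() / theta_mom

# MLE: scipy's fit with location fixed at 0 (two-parameter gamma)
k_mle, _, theta_mle = stats.gamma.fit(x, floc=0)

# Goodness of fit (KS test as a stand-in for Anderson-Darling)
pv = stats.kstest(x, stats.gamma(k_mle, scale=theta_mle).cdf).pvalue
print(k_mom, k_mle, pv)
```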


2014 ◽  
Vol 28 (2) ◽  
pp. 183-201 ◽  
Author(s):  
Percy H. Brill

We introduce a level-crossing analysis of the finite time-t probability distributions of the excess life, age, total life, and related quantities of renewal processes. The technique embeds the renewal process as one cycle of a regenerative process with a barrier at level t, whose limiting probability density function leads directly to the time-t quantities. The new method connects the analysis of renewal processes with the analysis of a large class of stochastic models of Operations Research. Examples are given.
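
The time-t quantities can be checked by direct simulation, a Monte Carlo stand-in for the level-crossing derivation; here with gamma(2, 1) interarrival times (an arbitrary choice) and barrier t = 10:

```python
import numpy as np

rng = np.random.default_rng(3)

# Renewal process with gamma(2, 1) interarrival times (mean 2); record the
# age and excess life at the barrier t on each sample path.
t, n_paths = 10.0, 20_000
excess = np.empty(n_paths)
age = np.empty(n_paths)
for i in range(n_paths):
    s = 0.0
    while True:
        x = rng.gamma(2.0, 1.0)
        if s + x > t:                 # first renewal epoch beyond t
            excess[i] = s + x - t     # excess life (forward recurrence time)
            age[i] = t - s            # age (backward recurrence time)
            break
        s += x

# Inspection paradox: mean total life (age + excess) at t exceeds the mean
# interarrival time 2; for large t it tends to E[X^2]/E[X] = 3.
print(excess.mean(), (age + excess).mean())
```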


2010 ◽  
Vol 49 (7) ◽  
pp. 1443-1453 ◽  
Author(s):  
Allan J. Clarke ◽  
Stephen Van Gorder ◽  
Yvette Everingham

The authors develop a method for the long-lead forecasting of El Niño–influenced rainfall probability and illustrate it using the economically important prediction, from the beginning of the year, of September–November (SON) rainfall in the coastal sugarcane-producing region of Australia’s northeastern coast. The method is based on two probability distributions. One is the Gaussian error distribution of the long-lead prediction of the El Niño index Niño-3.4 by the Clarke and Van Gorder forecast method. The other is the relationship of the rainfall distribution to the Niño-3.4 index. The rainfall distribution can be approximated by a gamma distribution whose two parameters depend on Niño-3.4. To predict the rainfall at, say, the Tully Sugar, Ltd., mill on the north Queensland coast in SON 2009, the June–August (JJA) value of Niño-3.4 is predicted, and 1000 possible “observed” JJA Niño-3.4 values are then calculated from the error distribution. Each one of these observed Niño-3.4 values is then used, with the Niño-3.4-dependent gamma distribution for that location, to calculate 1000 possible SON rainfall totals. The result is one million possible SON rainfalls. A histogram of these rainfalls is the required probability distribution for the rainfall at that location predicted from the beginning of the year. Cross-validated predictions suggest that the method is successful.
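
The two-stage Monte Carlo can be sketched as follows; all numerical values, including the dependence of the gamma parameters on Niño-3.4, are hypothetical placeholders rather than the fitted Tully values:

```python
import numpy as np

rng = np.random.default_rng(4)

# Stage 1 inputs: a long-lead JJA Nino-3.4 forecast and its Gaussian error
nino_forecast, forecast_sd = 0.8, 0.5            # hypothetical values

def rain_params(nino):
    """Hypothetical Nino-3.4-dependent gamma rainfall parameters."""
    shape = max(0.5, 2.0 - 0.5 * nino)           # drier, more skewed El Nino
    scale = max(10.0, 150.0 - 40.0 * nino)
    return shape, scale

totals = []
for nino in rng.normal(nino_forecast, forecast_sd, size=1000):
    k, th = rain_params(nino)                    # stage 2: rainfall given index
    totals.append(rng.gamma(k, th, size=1000))   # 1000 rainfalls per index
totals = np.concatenate(totals)                  # one million possible totals

# A histogram of `totals` is the predicted SON rainfall distribution
print(totals.size, round(float(np.median(totals)), 1))
```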


2007 ◽  
Vol 19 (10) ◽  
pp. 2780-2796 ◽  
Author(s):  
Shun-ichi Amari

When there are a number of stochastic models in the form of probability distributions, one needs to integrate them. Mixtures of distributions are frequently used, but exponential mixtures also provide a good means of integration. This letter proposes a one-parameter family of integration, called α-integration, which includes all of these well-known integrations. These are generalizations of various averages of numbers such as arithmetic, geometric, and harmonic averages. There are psychophysical experiments that suggest that α-integrations are used in the brain. The α-divergence between two distributions is defined, which is a natural generalization of Kullback-Leibler divergence and Hellinger distance, and it is proved that α-integration is optimal in the sense of minimizing α-divergence. The theory is applied to generalize the mixture of experts and the product of experts to the α-mixture of experts. The α-predictive distribution is also stated in the Bayesian framework.
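
Under one common convention for the α-representation, f_α(x) = x^((1−α)/2) for α ≠ 1 and f_1(x) = log x, the α-mean of positive numbers is f_α⁻¹ of the arithmetic average of the f_α(x_i); the familiar averages appear as special cases:

```python
import numpy as np

def alpha_mean(x, alpha):
    """Alpha-mean under f(x) = x**((1 - alpha)/2), with f = log at alpha = 1."""
    x = np.asarray(x, dtype=float)
    if alpha == 1:
        return float(np.exp(np.mean(np.log(x))))    # geometric mean
    p = (1 - alpha) / 2
    return float(np.mean(x**p) ** (1 / p))

x = [1.0, 4.0]
arithmetic = alpha_mean(x, -1)   # 2.5
geometric = alpha_mean(x, 1)     # 2.0
harmonic = alpha_mean(x, 3)      # 1.6
```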


2002 ◽  
Vol 18 (2) ◽  
pp. 278-296 ◽  
Author(s):  
Katsuto Tanaka

The measurement error problem that we consider in this paper is concerned with the situation where time series data of various kinds—short memory, long memory, and random walk processes—are contaminated by white noise. We suggest a unified approach to testing for the existence of such noise. It is found that the power of our test crucially depends on the underlying process.
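
The mechanism behind such a test can be illustrated for the random-walk case: if y_t = x_t + u_t with x_t a random walk and u_t white noise, the first differences of y_t form an MA(1) process with lag-1 autocovariance −var(u). The moment check below only illustrates this structure; it is not Tanaka's test itself:

```python
import numpy as np

rng = np.random.default_rng(5)

# Random walk x_t contaminated by white noise u_t
n, sigma_u = 20_000, 1.0
x = np.cumsum(rng.normal(size=n))
y = x + rng.normal(0.0, sigma_u, size=n)

# First differences: dy_t = eps_t + u_t - u_{t-1}, an MA(1) whose lag-1
# autocovariance is -sigma_u**2; a nonzero estimate signals added noise.
d = np.diff(y)
gamma1 = np.mean((d[1:] - d.mean()) * (d[:-1] - d.mean()))
print(gamma1)                      # close to -1.0
```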

