Motion versus Position in the Perception of Head-Centred Movement

Perception ◽  
10.1068/p3256 ◽  
2002 ◽  
Vol 31 (5) ◽  
pp. 603-615 ◽  
Author(s):  
Tom C A Freeman ◽  
Jane H Sumnall

Observers can recover motion with respect to the head during an eye movement by comparing signals encoding retinal motion and the velocity of pursuit. Evidently there is a mismatch between these signals because perceived head-centred motion is not always veridical. One example is the Filehne illusion, in which a stationary object appears to move in the opposite direction to pursuit. Like the motion aftereffect, the phenomenal experience of the Filehne illusion is one in which the stimulus moves but does not seem to go anywhere. This raises problems when measuring the illusion by motion nulling because the more traditional technique confounds perceived motion with changes in perceived position. We devised a new nulling technique using global-motion stimuli that degraded familiar position cues but preserved cues to motion. Stimuli consisted of random-dot patterns comprising signal and noise dots that moved at the same retinal ‘base’ speed. Noise moved in random directions. In an eye-stationary speed-matching experiment we found noise slowed perceived retinal speed as ‘coherence strength’ (ie percentage of signal) was reduced. The effect occurred over the two-octave range of base speeds studied and well above direction threshold. When the same stimuli were combined with pursuit, observers were able to null the Filehne illusion by adjusting coherence. A power law relating coherence to retinal base speed fit the data well with a negative exponent. Eye-movement recordings showed that pursuit was quite accurate. We then tested the hypothesis that the stimuli found at the null-points appeared to move at the same retinal speed. Two observers supported the hypothesis, a third partially, and a fourth showed a small linear trend. In addition, the retinal speed found by the traditional Filehne technique was similar to the matches obtained with the global-motion stimuli. The results provide support for the idea that speed is the critical cue in head-centred motion perception.
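The abstract reports that a power law with a negative exponent related null-point coherence to retinal base speed. As a minimal sketch of how such a fit can be done (not the authors' analysis code; the data values below are synthetic stand-ins), a power law c = k·v^b is linear in log-log coordinates and can be fitted by ordinary least squares:

```python
import numpy as np

# Illustrative sketch only: fitting a power law c = k * v**b relating
# null-point coherence c (% signal) to retinal base speed v.
# The numbers below are synthetic, not data from the study.
base_speed = np.array([1.0, 2.0, 4.0])          # deg/s, a two-octave range
null_coherence = np.array([60.0, 42.0, 30.0])   # % signal at the null point

# A power law is linear in log-log coordinates: log c = log k + b * log v.
b, log_k = np.polyfit(np.log(base_speed), np.log(null_coherence), 1)
k = np.exp(log_k)
print(f"exponent b = {b:.2f}")  # negative: less coherence needed at higher speeds
```

A negative exponent here means that slower retinal base speeds required higher coherence to null the illusion, consistent with noise reducing perceived retinal speed.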

2019 ◽  
Vol 121 (5) ◽  
pp. 1787-1797
Author(s):  
David Souto ◽  
Jayesha Chudasama ◽  
Dirk Kerzel ◽  
Alan Johnston

Smooth pursuit eye movements (pursuit) are used to minimize the retinal motion of moving objects. During pursuit, the pattern of motion on the retina carries not only information about the object movement but also reafferent information about the eye movement itself. The latter arises from the retinal flow of the stationary world in the direction opposite to the eye movement. To extract the global direction of motion of the tracked object and stationary world, the visual system needs to integrate ambiguous local motion measurements (i.e., the aperture problem). Unlike the tracked object, the stationary world’s global motion is entirely determined by the eye movement and thus can be approximately derived from motor commands sent to the eye (i.e., from an efference copy). Because retinal motion opposite to the eye movement is dominant during pursuit, different motion integration mechanisms might be used for retinal motion in the same direction and opposite to pursuit. To investigate motion integration during pursuit, we tested direction discrimination of a brief change in global object motion. The global motion stimulus was a circular array of small static apertures within which one-dimensional gratings moved. We found increased coherence thresholds and a qualitatively different reflexive ocular tracking for global motion opposite to pursuit. Both effects suggest reduced sampling of motion opposite to pursuit, which results in an impaired ability to extract coherence in motion signals in the reafferent direction. We suggest that anisotropic motion integration is an adaptation to asymmetric retinal motion patterns experienced during pursuit eye movements. NEW & NOTEWORTHY This study provides a new understanding of how the visual system achieves coherent perception of an object’s motion while the eyes themselves are moving. The visual system integrates local motion measurements to create a coherent percept of object motion. An analysis of perceptual judgments and reflexive eye movements to a brief change in an object’s global motion confirms that the visual and oculomotor systems pick fewer samples to extract global motion opposite to the eye movement.
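The aperture problem mentioned above can be made concrete with a small numeric sketch (a toy illustration, not the study's stimulus or analysis code). Each 1D grating seen through an aperture constrains only the velocity component along its normal, n_i · v = s_i; combining constraints from differently oriented gratings recovers the global 2D velocity:

```python
import numpy as np

# Toy illustration of solving the aperture problem by intersecting
# constraints from several 1D gratings. The global velocity is assumed.
true_v = np.array([3.0, 1.0])                     # hypothetical global motion (deg/s)
normals = np.array([[1.0, 0.0],                   # unit normals of three gratings
                    [0.0, 1.0],
                    [np.sqrt(0.5), np.sqrt(0.5)]])
normal_speeds = normals @ true_v                  # each aperture measures n_i . v only

# Least-squares intersection of the constraints n_i . v = s_i.
recovered_v, *_ = np.linalg.lstsq(normals, normal_speeds, rcond=None)
print(recovered_v)
```

With two or more distinct orientations the system is fully determined, which is why integrating across apertures disambiguates the local measurements.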


Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 65-65 ◽  
Author(s):  
A Grunewald ◽  
M J M Lankheet

A recent model of motion perception suggests that the motion aftereffect (MAE) is due to an interaction across all directions, rather than just opposite directions (Grunewald, 1995 Perception 24 Supplement, 111). According to the model, the MAE is caused by the interaction of broadly tuned inhibition and narrowly tuned excitation, both in direction space. The model correctly suggests that, after adaptation to opposite directions of motion, no MAE results. Unlike other accounts of the MAE, this model predicts that, after adaptation to opposite but broadly defined directions of motion, an MAE orthogonal to the inducing motions is observed. We tested this counter-intuitive prediction by adapting subjects to two populations of dots, whose average motion vectors were opposite, but which contained motion vectors deviating slightly (up to 30°) from the average direction. During the subsequent test phase, randomly moving dots were displayed. Subjects were asked to indicate whether they perceived any global motion during this phase, and if so, they were asked to indicate the perceived motion axis by aligning a line. Subjects were tested on four pairs of directions: vertical, horizontal, and the two diagonals. In all four conditions subjects reported seeing an MAE, and the axis that they indicated was always orthogonal to the inducing motions (ANOVA: p<0.001, accounted for 95% of variance). This experiment confirms the predictions made by the model, thus further supporting the interaction across all directions of narrowly tuned excitation and broadly tuned inhibition.
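The orthogonal-MAE prediction can be illustrated with a toy simulation in direction space (this is a qualitative sketch, not Grunewald's published model; the kernel widths are assumptions). If adaptation spreads via broadly tuned inhibition, then after exposure to two opposite but broad (±30°) direction bands, the least-adapted units lie on the orthogonal axis:

```python
import numpy as np

dirs = np.arange(360)  # preferred directions of motion units (deg)

def circ_gauss(theta, mu, sigma):
    # Gaussian on the circle via wrapped angular difference.
    d = np.rad2deg(np.angle(np.exp(1j * np.deg2rad(theta - mu))))
    return np.exp(-d ** 2 / (2 * sigma ** 2))

# Exposure: two opposite, broadly defined direction bands (about +/-30 deg).
exposure = circ_gauss(dirs, 0, 30) + circ_gauss(dirs, 180, 30)

# Broadly tuned inhibition (assumed 60 deg width) spreads the adaptation.
broad = circ_gauss(dirs[:, None], dirs[None, :], 60)
adaptation = broad @ exposure

# Remaining responsiveness is largest orthogonal to the adapted axis.
responsiveness = adaptation.max() - adaptation
peak = int(dirs[np.argmax(responsiveness)])
print(peak)
```

Under these assumed parameters the peak of residual responsiveness falls at 90° or 270°, i.e., on the axis orthogonal to the inducing motions, matching the reported percept.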


2010 ◽  
Vol 10 (7) ◽  
pp. 543-543
Author(s):  
T. C. Freeman ◽  
R. A. Champion ◽  
P. A. Warren

i-Perception ◽  
2020 ◽  
Vol 11 (5) ◽  
pp. 204166952096110
Author(s):  
Chien-Chung Chen ◽  
Hiroshi Ashida ◽  
Xirui Yang ◽  
Pei-Yin Chen

In a stimulus with multiple moving elements, an observer may perceive that the whole stimulus moves in unison if (a) one can associate an element in one frame with one in the next (correspondence) and (b) a sufficient proportion of correspondences signal a similar motion direction (coherence). We tested the necessity of these two conditions by asking the participants to rate the perceived intensity of linear, concentric, and radial motions for three types of stimuli: (a) random walk motion, in which the direction of each dot was randomly determined for each frame, (b) random image sequence, which was a set of uncorrelated random dot images presented in sequence, and (c) global motion, in which 35% of dots moved coherently. The participants perceived global motion not only in the global motion conditions but also in the random image sequences, though not in random walk motion. The type of perceived motion in the random image sequences depends on the spatial context of the stimuli. Thus, although there is neither a fixed correspondence across different frames nor a coherent motion direction, observers can still perceive global motion in the random image sequence. This result cannot be explained by motion energy or local aperture border effects.
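The three stimulus types described above can be sketched in a few lines (an illustrative reconstruction, not the authors' stimulus code; dot count, field size, and speed are assumed parameters — only the 35% coherence figure comes from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
n_dots, n_frames, speed = 100, 20, 2.0   # assumed parameters (speed in px/frame)

def next_frame(xy, kind, coherence=0.35, direction=0.0):
    """Advance dot positions one frame under the three schemes in the text."""
    if kind == "random_image":           # uncorrelated images: no correspondence
        return rng.uniform(0, 256, xy.shape)
    angles = rng.uniform(0, 2 * np.pi, len(xy))   # random walk: fresh directions
    if kind == "global":                 # a fixed proportion shares one direction
        signal = rng.random(len(xy)) < coherence
        angles[signal] = direction
    step = speed * np.column_stack([np.cos(angles), np.sin(angles)])
    return (xy + step) % 256             # wrap within a 256-px field

xy = rng.uniform(0, 256, (n_dots, 2))
for _ in range(n_frames):
    xy = next_frame(xy, "global")        # 35% of dots move coherently rightward
```

The key contrast in the study falls out of this construction: random walk preserves correspondence but has no coherent direction, the random image sequence has neither, and global motion has both.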


2003 ◽  
Vol 3 (11) ◽  
pp. 11 ◽  
Author(s):  
Tom C. A. Freeman ◽  
Jane H. Sumnall ◽  
Robert J. Snowden

Perception ◽  
1993 ◽  
Vol 22 (11) ◽  
pp. 1365-1380 ◽  
Author(s):  
Nicholas J Wade ◽  
Michael T Swanston ◽  
Charles M M de Weert

A brief history of quantitative assessments of interocular transfer (IOT) of the motion aftereffect (MAE) is presented. Recent research indicates that the MAE occurs as a consequence of adapting detectors for relative rather than retinal motion. When gratings above and below a stationary, fixated grating are moved in an otherwise dark field the central, retinally stationary grating appears to move in the opposite direction; when tested with stationary gratings an MAE is almost entirely confined to the central grating. The IOT of such an MAE was measured in experiment 1: the display was presented to one eye with a black field in the other. The IOT was about 30% of the monocular MAE. Similar values were found in experiment 2, in which the contralateral eye received an equivalent central stationary grating during adaptation and test. The dichoptic interaction of the processes involved in the MAE was examined by presenting the central gratings to both eyes and a single flanking grating above in one eye and below in the other (experiment 3). The MAE was tested with either the same or the contralateral pairing. Oppositely directed MAEs were found for the central and flanking gratings, but they were confined mainly to the conditions in which the configurations presented during adaptation were present in the same eyes during test. In experiment 4, the surround MAEs were compared after adaptation with two moving gratings in one eye or with a similar dichoptic configuration, and they were of similar duration. In a final experiment the MAE was tested either monocularly or binocularly after alternating adaptation of the left and right eyes and was found to be of the same duration. It is concluded that the MAE is a consequence of adapting relational-motion detectors, which are either monocular or of the binocular OR class.


Perception ◽  
1994 ◽  
Vol 23 (10) ◽  
pp. 1257-1264 ◽  
Author(s):  
Michael T Swanston

Evidence concerning the origin of the motion aftereffect (MAE) is assessed in terms of a model of levels of representation in visual motion perception proposed by Wade and Swanston. Very few experiments have been designed so as to permit unambiguous conclusions to be drawn. The requirements for such experiments are identified. Whereas retinocentric motion could in principle give rise to the MAE, data are not available which would enable a conclusion to be drawn. There is good evidence for a patterncentric origin, indicating that the MAE is primarily the result of adaptation in the systems responsible for detecting relative visual motion. There is evidence for a further contribution from the process that compensates retinocentric motion for eye movements, in the form of nonveridical information for eye movements. There may also be an effect at the level at which perceived distance and self-movement information are combined with egocentric motion to give a geocentric representation which provides the basis for reports of phenomenal experience. It is concluded that the MAE can be caused by changes in activity at more than one level of representation, and cannot be ascribed to a single underlying process.

