The Orthogonal Motion Aftereffect

Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 65-65 ◽  
Author(s):  
A Grunewald ◽  
M J M Lankheet

A recent model of motion perception suggests that the motion aftereffect (MAE) is due to an interaction across all directions, rather than just opposite directions (Grunewald, 1995 Perception 24 Supplement, 111). According to the model, the MAE is caused by the interaction of broadly tuned inhibition and narrowly tuned excitation, both in direction space. The model correctly predicts that, after adaptation to opposite directions of motion, no MAE results. Unlike other accounts of the MAE, this model predicts that, after adaptation to opposite but broadly defined directions of motion, an MAE orthogonal to the inducing motions is observed. We tested this counter-intuitive prediction by adapting subjects to two populations of dots whose average motion vectors were opposite, but which contained motion vectors deviating slightly (up to 30°) from the average direction. During the subsequent test phase, randomly moving dots were displayed. Subjects were asked to indicate whether they perceived any global motion during this phase, and if so, to indicate the perceived motion axis by aligning a line. Subjects were tested on four pairs of directions: vertical, horizontal, and the two diagonals. In all four conditions subjects reported seeing an MAE, and the axis that they indicated was always orthogonal to the inducing motions (ANOVA: p < 0.001, accounting for 95% of the variance). This experiment confirms the predictions made by the model, thus lending further support to an interaction, across all directions, of narrowly tuned excitation and broadly tuned inhibition.

2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Minsun Park ◽  
Randolph Blake ◽  
Yeseul Kim ◽  
Chai-Youn Kim

Abstract. Sensory information registered in one modality can influence perception associated with sensory information registered in another modality. The current work focuses on one particularly salient form of such multisensory interaction: audio-visual motion perception. Previous studies have shown that watching visual motion and listening to auditory motion influence each other, but results from those studies are mixed with regard to the nature of the interactions promoting that influence and where within the sequence of information processing those interactions transpire. To address these issues, we investigated (i) whether concurrent audio-visual motion stimulation during an adaptation phase impacts the strength of the visual motion aftereffect (MAE) during a subsequent test phase, and (ii) whether the magnitude of that impact depends on the congruence between auditory and visual motion experienced during adaptation. Results show that a congruent direction of audio-visual motion during adaptation induced a stronger initial impression and a slower decay of the MAE than did an incongruent direction, an effect not attributable to differential patterns of eye movements during adaptation. The audio-visual congruency effects measured here imply that visual motion perception emerges from integration of audio-visual motion information at a sensory neural stage of processing.


Perception ◽  
10.1068/p3256 ◽  
2002 ◽  
Vol 31 (5) ◽  
pp. 603-615 ◽  
Author(s):  
Tom C A Freeman ◽  
Jane H Sumnall

Observers can recover motion with respect to the head during an eye movement by comparing signals encoding retinal motion and the velocity of pursuit. Evidently there is a mismatch between these signals, because perceived head-centred motion is not always veridical. One example is the Filehne illusion, in which a stationary object appears to move in the opposite direction to pursuit. Like the motion aftereffect, the phenomenal experience of the Filehne illusion is one in which the stimulus moves but does not seem to go anywhere. This raises problems when measuring the illusion by motion nulling, because the more traditional technique confounds perceived motion with changes in perceived position. We devised a new nulling technique using global-motion stimuli that degraded familiar position cues but preserved cues to motion. Stimuli consisted of random-dot patterns comprising signal and noise dots that moved at the same retinal ‘base’ speed. Noise moved in random directions. In an eye-stationary speed-matching experiment we found that noise slowed perceived retinal speed as ‘coherence strength’ (ie percentage of signal) was reduced. The effect occurred over the two-octave range of base speeds studied, and well above direction threshold. When the same stimuli were combined with pursuit, observers were able to null the Filehne illusion by adjusting coherence. A power law relating coherence to retinal base speed fitted the data well with a negative exponent. Eye-movement recordings showed that pursuit was quite accurate. We then tested the hypothesis that the stimuli found at the null points appeared to move at the same retinal speed. Two observers supported the hypothesis, a third partially, and a fourth showed a small linear trend. In addition, the retinal speed found by the traditional Filehne technique was similar to the matches obtained with the global-motion stimuli. The results support the idea that speed is the critical cue in head-centred motion perception.


Author(s):  
Burkhard Müller ◽  
Jürgen Gehrke

Abstract. Planning interactions with the physical world requires knowledge about operations; in short, mental operators. Abstractness of content and directionality of access are two important properties that characterize the representational units of this kind of knowledge. Combining these properties allows four classes of knowledge units to be distinguished in the literature: (a) rules, (b) mental models or schemata, (c) instances, and (d) episodes or chunks. The influence of practicing alphabet-arithmetic operators in a prognostic, diagnostic, or retrognostic way (A + 2 = ?, A ? = C, or ? + 2 = C, respectively) on the use of that knowledge in a subsequent test was used to assess the importance of these dimensions. At the beginning, the retrognostic use of knowledge was worse than the prognostic use, although identical operations were involved (A + 2 = ? vs. ? - 2 = A). This disadvantage was reduced with increased practice. Test performance was best if the task and the letter pairs were the same as in the acquisition phase. Overall, the findings support theories proposing multiple representational units of mental operators. The disadvantage for the retrognostic task re-emerged in the test phase, which may be evidence for the importance of the order of events independent of the order of experience.


2007 ◽  
Vol 47 (7) ◽  
pp. 887-898 ◽  
Author(s):  
Deborah Giaschi ◽  
Amy Zwicker ◽  
Simon Au Young ◽  
Bruce Bjornson

Perception ◽  
1997 ◽  
Vol 26 (3) ◽  
pp. 269-275 ◽  
Author(s):  
Timothy J Andrews ◽  
Allison N McCoy

When rotating stripes or other periodic stimuli cross the retina at a critical rate, a reversal in the direction of motion of the stimuli is often seen. This illusion of motion perception was used to explore the roles of retinal and perceived motion in the generation of optokinetic nystagmus. Here we show that optokinetic nystagmus is disrupted during the perception of this illusion. Thus, when perceived and actual motion are in conflict, subjects fail to track the veridical movement. This observation suggests that the perception of motion can directly influence optokinetic nystagmus, even in the presence of a moving retinal image. A conflict in the neural representation of motion in different brain areas may explain these findings.


2002 ◽  
Vol 13 (1) ◽  
pp. 75-84 ◽  
Author(s):  
Yuji Kobayashi ◽  
Aihide Yoshino ◽  
Tsuneyuki Ogasawara ◽  
Soichiro Nomura

2012 ◽  
Vol 107 (12) ◽  
pp. 3493-3508 ◽  
Author(s):  
Kaoru Amano ◽  
Tsunehiro Takeda ◽  
Tomoki Haji ◽  
Masahiko Terao ◽  
Kazushi Maruya ◽  
...  

Early visual motion signals are local and one-dimensional (1-D). For specification of global two-dimensional (2-D) motion vectors, the visual system should appropriately integrate these signals across orientation and space. Previous neurophysiological studies have suggested that this integration process consists of two computational steps (estimation of local 2-D motion vectors, followed by their spatial pooling), both being identified in the area MT. Psychophysical findings, however, suggest that under certain stimulus conditions, the human visual system can also compute mathematically correct global motion vectors from direct pooling of spatially distributed 1-D motion signals. To study the neural mechanisms responsible for this novel 1-D motion pooling, we conducted human magnetoencephalography (MEG) and functional MRI experiments using a global motion stimulus comprising multiple moving Gabors (global-Gabor motion). In the first experiment, we measured MEG and blood oxygen level-dependent responses while changing motion coherence of global-Gabor motion. In the second experiment, we investigated cortical responses correlated with direction-selective adaptation to the global 2-D motion, not to local 1-D motions. We found that human MT complex (hMT+) responses show both coherence dependency and direction selectivity to global motion based on 1-D pooling. The results provide the first evidence that hMT+ is the locus of 1-D motion pooling, as well as that of conventional 2-D motion pooling.

