Special edition. Visual perception and picture quality. 2. Visual motion perception and eye movement. 2-1. A two-stage model for visual motion perception including applicability to binocular depth perception.

1991 ◽  
Vol 45 (3) ◽  
pp. 299-305
Author(s):  
Masami Ogata ◽  
Takao Sato


2013 ◽  
Vol 26 (4) ◽  
pp. 317-332 ◽  
Author(s):  
Yasuhiro Takeshima ◽  
Jiro Gyoba

Several studies have examined the effects of auditory stimuli on visual perception. In studies of cross-modal correspondences, auditory pitch has been shown to modulate visual motion perception. In particular, low-reliability visual motion stimuli tend to be affected by metaphorically or physically congruent or incongruent sounds. In the present study, we examined the modulatory effects of auditory pitch on visual perception of motion trajectory for visual inputs of varying reliability. Our results indicated that an auditory pitch implying illusory motion toward the outside of the visual field modulated the perceived motion trajectory. In contrast, an auditory pitch implying illusory motion toward the central visual field did not affect the perception of motion trajectory. This asymmetrical effect of auditory stimuli depended on the reliability of the visual input. Moreover, sounds that corresponded in terms of their pitch-elevation mapping altered the perceived trajectory of visual motion when apparent motion could be perceived smoothly. Therefore, the present results demonstrate that auditory stimuli modulate visual motion perception, especially when smooth motion is perceived in the peripheral visual field.
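The reliability dependence described in this abstract is often formalized as maximum-likelihood cue combination, in which each modality is weighted by its inverse variance. The following is an illustrative sketch of that standard scheme with hypothetical numbers; it is not the authors' model.

```python
import numpy as np

def combine_cues(visual_est, visual_var, auditory_est, auditory_var):
    """Reliability-weighted (maximum-likelihood) cue combination.

    Each cue is weighted by its reliability (1/variance), so a
    low-reliability visual signal is pulled more strongly toward
    the auditory estimate."""
    w_v = (1 / visual_var) / (1 / visual_var + 1 / auditory_var)
    w_a = 1 - w_v
    combined = w_v * visual_est + w_a * auditory_est
    combined_var = 1 / (1 / visual_var + 1 / auditory_var)
    return combined, combined_var

# High-reliability vision: the visual trajectory estimate dominates.
est_hi, _ = combine_cues(visual_est=0.0, visual_var=0.5,
                         auditory_est=10.0, auditory_var=4.0)
# Low-reliability vision: the pitch-implied trajectory pulls the percept.
est_lo, _ = combine_cues(visual_est=0.0, visual_var=8.0,
                         auditory_est=10.0, auditory_var=4.0)
```

With the same auditory cue, the combined estimate shifts much further toward the sound-implied trajectory when the visual input is unreliable, mirroring the asymmetric modulation reported above.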


2020 ◽  
Author(s):  
Xiuyun Wu ◽  
Austin C. Rothwell ◽  
Miriam Spering ◽  
Anna Montagnini

Smooth pursuit eye movements and visual motion perception rely on the integration of current sensory signals with past experience. Experience shapes our expectation of current visual events and can drive eye movement responses made in anticipation of a target, such as anticipatory pursuit. Previous research revealed consistent effects of expectation on anticipatory pursuit (eye movements follow the expected target direction or speed) and contrasting effects on motion perception, but most studies considered either eye movement or perceptual responses. The current study directly compared effects of direction expectation on perception and anticipatory pursuit within the same direction discrimination task to investigate whether both types of responses are affected similarly or differently. Observers (n = 10) viewed high-coherence random-dot kinematograms (RDKs) moving rightward or leftward with a probability of 50, 70, or 90% in a given block of trials to build up an expectation of motion direction. They were asked to judge the motion direction of interleaved low-coherence RDKs (0-15%). Perceptual judgments were compared to changes in anticipatory pursuit eye movements as a function of probability. Results show that anticipatory pursuit velocity scaled with probability and followed direction expectation (attraction bias), whereas perceptual judgments were biased opposite to direction expectation (repulsion bias). Control experiments suggest that the repulsion bias in perception was not caused by retinal slip induced by anticipatory pursuit, or by motion adaptation. We conclude that direction expectation can be processed differently for perception and anticipatory pursuit.
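The dissociation reported here (attraction in pursuit, repulsion in perception) can be caricatured in a toy simulation: anticipatory pursuit velocity scales with the signed direction expectation, while the perceptual decision criterion is assumed to shift toward the expected direction, which biases low-coherence judgments away from it. All parameters below are hypothetical; this is a sketch, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_block(p_right, n_trials=1000, coherence=0.05, noise=0.3):
    """Toy block of a direction-discrimination task.

    Returns the (deterministic) anticipatory pursuit velocity and the
    proportion of rightward perceptual judgments for low-coherence
    trials."""
    expectation = 2 * p_right - 1             # signed expectation, -1..+1
    pursuit_velocity = 3.0 * expectation      # deg/s; attraction bias
    # Stimulus directions drawn according to the block probability.
    direction = rng.choice([-1, 1], n_trials, p=[1 - p_right, p_right])
    evidence = coherence * direction + noise * rng.standard_normal(n_trials)
    # Criterion shifted toward the expected direction -> fewer judgments
    # in that direction than the evidence alone warrants (repulsion).
    criterion = 0.1 * expectation
    prop_right = float(np.mean(evidence > criterion))
    return pursuit_velocity, prop_right
```

Running blocks at p_right = 0.5 and 0.9 shows pursuit velocity growing with probability while the criterion shift works against the expected direction in perception, in line with the attraction/repulsion pattern described above.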


2015 ◽  
Vol 25 (61) ◽  
pp. 251-259 ◽  
Author(s):  
Francisco Carlos Nather ◽  
Vinicius Anelli ◽  
Guilherme Ennes ◽  
José Lino Oliveira Bueno

Visual perception is adapted toward a better understanding of our own movements than those of non-conspecifics. The present study determined whether time perception is affected by pictures of different species by considering the evolutionary scale. Static ("S") and implied movement ("M") images of a dog, cheetah, chimpanzee, and man were presented to undergraduate students. S and M images of the same species were presented in random order or one after the other (S-M or M-S) for two groups of participants. Movement, Velocity, and Arousal semantic scales were used to characterize some properties of the images. Implied movement affected time perception: the durations of M images were overestimated. The results are discussed in terms of visual motion perception related to biological timing processing that could be established early in terms of the adaptation of humankind to the environment.


2019 ◽  
Vol 5 (1) ◽  
pp. 247-268 ◽  
Author(s):  
Peter Thier ◽  
Akshay Markanday

The cerebellar cortex is a crystal-like structure consisting of an almost endless repetition of a canonical microcircuit that applies the same computational principle to different inputs. The output of this transformation is broadcast to extracerebellar structures by way of the deep cerebellar nuclei. Visually guided eye movements are accommodated by different parts of the cerebellum. This review primarily discusses the role of the oculomotor part of the vermal cerebellum [the oculomotor vermis (OMV)] in the control of visually guided saccades and smooth-pursuit eye movements. Both types of eye movements require the mapping of retinal information onto motor vectors, a transformation that is optimized by the OMV, taking into account information on past performance. Unlike the role of the OMV in the guidance of eye movements, the contribution of the adjoining vermal cortex to visual motion perception is nonmotor and involves a cerebellar influence on information processing in the cerebral cortex.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Sichao Yang ◽  
Johannes Bill ◽  
Jan Drugowitsch ◽  
Samuel J. Gershman

Motion relations in visual scenes carry an abundance of behaviorally relevant information, but little is known about how humans identify the structure underlying a scene's motion in the first place. We studied the computations governing human motion structure identification in two psychophysics experiments and found that perception of motion relations showed hallmarks of Bayesian structural inference. At the heart of our research lies a tractable task design that enabled us to reveal the signatures of probabilistic reasoning about latent structure. We found that a choice model based on the task's Bayesian ideal observer accurately matched many facets of human structural inference, including task performance, perceptual error patterns, single-trial responses, participant-specific differences, and subjective decision confidence, especially when motion scenes were ambiguous and when object motion was hierarchically nested within other moving reference frames. Our work can guide future neuroscience experiments to reveal the neural mechanisms underlying higher-level visual motion perception.
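Bayesian structural inference of the kind described here can be illustrated with a minimal ideal observer that compares two candidate motion structures for two observed object velocities: an "independent" structure and a "global" structure in which both objects share a common velocity. The generative parameters below are hypothetical and this two-structure sketch is far simpler than the paper's full model.

```python
import numpy as np
from scipy.stats import multivariate_normal

def structure_posterior(v1, v2, sigma_ind=2.0, sigma_shared=2.0,
                        sigma_res=0.3):
    """Posterior probability of the 'global' motion structure given two
    observed object velocities, under a flat prior over structures.

    'independent': v1 and v2 are drawn independently.
    'global': both share a common velocity plus small residual noise,
    which induces a strong covariance between the two observations."""
    x = np.array([v1, v2])
    # Independent structure: diagonal covariance.
    lik_ind = multivariate_normal.pdf(
        x, mean=[0, 0], cov=np.diag([sigma_ind**2] * 2))
    # Global structure: shared component makes the velocities covary.
    cov_glob = sigma_shared**2 * np.ones((2, 2)) + sigma_res**2 * np.eye(2)
    lik_glob = multivariate_normal.pdf(x, mean=[0, 0], cov=cov_glob)
    return lik_glob / (lik_glob + lik_ind)
```

Nearly matched velocities favor the global structure, while opposed velocities favor independence; near the boundary the posterior hovers around 0.5, which is where the abstract reports the ideal observer best captures human confidence and errors.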

