Perceptual saccadic suppression starts in the retina

2019
Author(s):
Saad Idrees
Matthias P. Baumann
Felix Franke
Thomas A. Münch
Ziad M. Hafed

Abstract: Visual sensitivity, probed through perceptual detectability of very brief visual stimuli, is strongly impaired around the time of rapid eye movements. This robust perceptual phenomenon, called saccadic suppression, is frequently attributed to active suppressive signals that are directly derived from eye movement commands. Here we show instead that visual-only mechanisms, activated by saccade-induced image shifts, can account for all perceptual properties of saccadic suppression that we have investigated. Such mechanisms start at, but are not necessarily exclusive to, the very first stage of visual processing in the brain, the retina. Critically, neural suppression originating in the retina outlasts perceptual suppression around the time of saccades, suggesting that extra-retinal movement-related signals, rather than causing suppression, may instead act to shorten it. Our results demonstrate a far-reaching contribution of visual processing mechanisms to perceptual saccadic suppression, starting in the retina, without the need to invoke explicit motor-based suppression commands.

2018
Vol 119 (6)
pp. 2059-2067
Author(s):
Chris Scholes
Paul V. McGraw
Neil W. Roach

During periods of steady fixation, we make small-amplitude ocular movements, termed microsaccades, at a rate of 1–2 every second. Early studies provided evidence that visual sensitivity is reduced during microsaccades—akin to the well-established suppression associated with larger saccades. However, the results of more recent work suggest that microsaccades may alter retinal input in a manner that enhances visual sensitivity to some stimuli. Here we parametrically varied the spatial frequency of a stimulus during a detection task and tracked contrast sensitivity as a function of time relative to microsaccades. Our data reveal two distinct modulations of sensitivity: suppression during the eye movement itself and facilitation after the eye has stopped moving. The magnitude of suppression and facilitation of visual sensitivity is related to the spatial content of the stimulus: suppression is greatest for low spatial frequencies, while sensitivity is enhanced most for stimuli of 1–2 cycles/°, spatial frequencies at which we are already most sensitive in the absence of eye movements. We present a model in which the tuning of suppression and facilitation is explained by delayed lateral inhibition between spatial frequency channels. Our data show that eye movements actively modulate visual sensitivity even during fixation: the detectability of images at different spatial scales can be increased or decreased depending on when the image occurs relative to a microsaccade. NEW & NOTEWORTHY Given the frequency with which we make microsaccades during periods of fixation, it is vital that we understand how they affect visual processing. We demonstrate two selective modulations of contrast sensitivity that are time-locked to the occurrence of a microsaccade: suppression of low spatial frequencies during each eye movement and enhancement of higher spatial frequencies after the eye has stopped moving. 
These complementary changes may arise naturally because of sluggish gain control between spatial channels.
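The model described above, in which suppression and facilitation arise from delayed lateral inhibition between spatial-frequency channels, can be sketched as a toy discrete-time simulation. This is an illustrative reconstruction, not the authors' implementation: the inhibition weight, delay, and neighbour pooling scheme are all hypothetical parameters chosen only to show the qualitative mechanism (inhibition arrives late, so early responses are unsuppressed and later ones are pushed down).

```python
import numpy as np

def channel_responses(drive, w_inhib=0.6, delay=3):
    """Toy delayed lateral inhibition between spatial-frequency channels.

    drive: array (time, channels) of feedforward input.
    Each channel is suppressed by the pooled activity of its two
    neighbouring channels, arriving `delay` time steps later.
    All parameters are illustrative, not fits from the paper.
    """
    n_t, n_ch = drive.shape
    r = np.zeros_like(drive, dtype=float)
    for t in range(n_t):
        inhib = np.zeros(n_ch)
        if t >= delay:
            prev = r[t - delay]
            inhib[1:] += prev[:-1]   # from the lower-frequency neighbour
            inhib[:-1] += prev[1:]   # from the higher-frequency neighbour
        # half-wave rectified response after subtracting delayed inhibition
        r[t] = np.maximum(drive[t] - w_inhib * 0.5 * inhib, 0.0)
    return r

# A sustained stimulus driving all channels equally: because inhibition
# arrives late, the first few time steps are unsuppressed.
drive = np.ones((10, 5))
r = channel_responses(drive)
```

With the delay set to zero the sketch reduces to instantaneous lateral inhibition; the delayed form is what produces the transient, time-locked modulation the abstract describes.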


2021
Author(s):
Ifedayo-Emmanuel Adeyefa-Olasupo

Despite the incessant retinal disruptions that necessarily accompany eye movements, our percept of the visual world remains continuous and stable—a phenomenon referred to as spatial constancy. How the visual system achieves spatial constancy remains unclear despite almost four centuries' worth of experimentation. Here I measured visual sensitivity at geometrically symmetric locations, observing transient sensitivity differences between them where none should be observed if cells that support spatial constancy indeed faithfully translate or converge. These differences, recapitulated by a novel neurobiological mechanical model, reflect an overriding influence of putative visually transient error signals that curve visual space. Intermediate eccentric locations likely to contain retinal disruptions are uniquely affected by curved visual space, suggesting that visual processing at these locations is transiently turned off before an eye movement and, with the gating off of these error signals, turned back on after the eye movement—a possible mechanism underlying spatial constancy.


Sensors
2021
Vol 21 (15)
pp. 5178
Author(s):
Sangbong Yoo
Seongmin Jeong
Seokyeon Kim
Yun Jang

Gaze movement and visual stimuli have been utilized to analyze human visual attention intuitively. Gaze behavior studies mainly show statistical analyses of eye movements and human visual attention. During these analyses, eye movement data and the saliency map are presented to the analysts as separate views or merged views. However, the analysts become frustrated when they need to memorize all of the separate views or when the eye movements obscure the saliency map in the merged views. Therefore, it is not easy to analyze how visual stimuli affect gaze movements, since existing techniques focus excessively on the eye movement data. In this paper, we propose a novel visualization technique for analyzing gaze behavior using saliency features as visual clues that express the visual attention of an observer. The visual clues are analyzed to reveal which saliency features are prominent for the visual stimulus analysis. We visualize the gaze data together with the saliency features to interpret visual attention, and we analyze gaze behavior with the proposed visualization to demonstrate that embedding saliency features within the visualization helps analysts understand the visual attention of an observer.
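One common way to combine the two data sources the abstract discusses is to turn raw gaze samples into a smoothed fixation-density map and weight it by the saliency map, yielding a single attention map instead of separate or occluding views. The sketch below is a minimal illustration of that general idea, not the authors' technique; the grid size, Gaussian width, and weighting scheme are assumptions.

```python
import numpy as np

def gaze_density(gaze_xy, shape, sigma=8.0, radius=24):
    """Accumulate gaze samples into a grid and blur with a separable Gaussian."""
    h, w = shape
    grid = np.zeros(shape)
    for x, y in gaze_xy:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < h and 0 <= xi < w:
            grid[yi, xi] += 1.0
    # normalized 1-D Gaussian kernel, applied along rows then columns
    k = np.exp(-0.5 * (np.arange(-radius, radius + 1) / sigma) ** 2)
    k /= k.sum()
    grid = np.apply_along_axis(np.convolve, 1, grid, k, mode="same")
    grid = np.apply_along_axis(np.convolve, 0, grid, k, mode="same")
    return grid

def saliency_weighted_attention(gaze_xy, saliency):
    """Combine gaze density with a saliency map into one normalized attention map."""
    density = gaze_density(gaze_xy, saliency.shape)
    attention = density * saliency
    total = attention.sum()
    return attention / total if total > 0 else attention
```

Rendering the resulting map as a semi-transparent overlay keeps the saliency structure visible where gaze data would otherwise obscure it.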


1999
Vol 82 (5)
pp. 2612-2632
Author(s):
Pierre A. Sylvestre
Kathleen E. Cullen

The mechanics of the eyeball and its surrounding tissues, which together form the oculomotor plant, have been shown to be the same for smooth pursuit and saccadic eye movements. Hence it was postulated that similar signals would be carried by motoneurons during slow and rapid eye movements. In the present study, we directly addressed this proposal by determining which eye movement–based models best describe the discharge dynamics of primate abducens neurons during a variety of eye movement behaviors. We first characterized abducens neuron spike trains, as has been classically done, during fixation and sinusoidal smooth pursuit. We then systematically analyzed the discharge dynamics of abducens neurons during and following saccades, during step-ramp pursuit and during high velocity slow-phase vestibular nystagmus. We found that the commonly utilized first-order description of abducens neuron firing rates (FR = b + kE + rE˙, where FR is firing rate, E and E˙ are eye position and velocity, respectively, and b, k, and r are constants) provided an adequate model of neuronal activity during saccades, smooth pursuit, and slow phase vestibular nystagmus. However, the use of a second-order model, which included an exponentially decaying term or “slide” (FR = b + kE + rE˙ + uË − c(dFR/dt)), notably improved our ability to describe neuronal activity when the eye was moving and also enabled us to model abducens neuron discharges during the postsaccadic interval. We also found that, for a given model, a single set of parameters could not be used to describe neuronal firing rates during both slow and rapid eye movements. Specifically, the eye velocity and position coefficients (r and k in the above models, respectively) consistently decreased as a function of the mean (and peak) eye velocity that was generated. In contrast, the bias (b, firing rate when looking straight ahead) invariably increased with eye velocity.
Although these trends are likely to reflect, in part, nonlinearities that are intrinsic to the extraocular muscles, we propose that these results can also be explained by considering the time-varying resistance to movement that is generated by the antagonist muscle. We conclude that to create realistic and meaningful models of the neural control of horizontal eye movements, it is essential to consider the activation of the antagonist, as well as agonist motoneuron pools.
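The first-order model from the abstract, FR = b + kE + rE˙, can be sketched numerically: generate a saccade-like position trace, produce firing rates from assumed coefficients, and recover those coefficients by least squares, which is how such models are typically fit to recorded spike trains. The trajectory shape and the coefficient values below are purely illustrative, not the paper's fits.

```python
import numpy as np

# Simulated saccade-like eye-position trajectory (hypothetical 10-deg
# movement over ~100 ms, modeled as a sigmoid)
t = np.linspace(0.0, 0.1, 1001)                   # time, s
E = 10.0 / (1.0 + np.exp(-(t - 0.05) / 0.005))    # eye position, deg
dE = np.gradient(E, t)                            # eye velocity, deg/s

# First-order model FR = b + k*E + r*dE with illustrative coefficients
b, k, r = 100.0, 4.0, 0.5
FR = b + k * E + r * dE                           # firing rate, spikes/s

# Recover the coefficients from (FR, E, dE) by linear least squares
X = np.column_stack([np.ones_like(E), E, dE])
coef, *_ = np.linalg.lstsq(X, FR, rcond=None)
# coef approximates [b, k, r]
```

The second-order "slide" variant adds acceleration and firing-rate-derivative terms, turning the fit into a differential-equation model rather than a simple regression, which is consistent with the abstract's finding that it captures postsaccadic discharge more accurately.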


2021
pp. 2150048
Author(s):
Hamidreza Namazi
Avinash Menon
Ondrej Krejcar

Our eyes constantly explore the surrounding environment. The brain controls the eyes' activities through the nervous system. Hence, analyzing the correlation between the activities of the eyes and the brain is an important area of research in vision science. This paper evaluates the coupling between the reactions of the eyes and the brain in response to different moving visual stimuli. Since both eye movements and EEG signals (as an indicator of brain activity) carry information, we employed Shannon entropy to decode the coupling between them. Ten subjects looked at four moving objects (dynamic visual stimuli) with different information contents while we recorded their EEG signals and eye movements. The results demonstrated that the changes in the information contents of eye movements and EEG signals are strongly correlated, which indicates a strong correlation between brain and eye activities. This analysis could be extended to evaluate the correlation between the activities of other organs and the brain.
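The entropy-based analysis described above can be sketched as follows: estimate Shannon entropy in sliding windows of each signal, then correlate the two entropy time courses. This is a minimal illustration with synthetic stand-in signals; the bin count, histogram range, window length, and the shared amplitude modulation are all assumptions, not the paper's recording or analysis parameters.

```python
import numpy as np

def shannon_entropy(x, bins=16, val_range=(-6.0, 6.0)):
    """Shannon entropy (bits) of a 1-D signal, estimated by histogram binning."""
    counts, _ = np.histogram(x, bins=bins, range=val_range)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def windowed_entropy(x, win=250):
    """Entropy in consecutive non-overlapping windows."""
    n = len(x) // win
    return np.array([shannon_entropy(x[i * win:(i + 1) * win]) for i in range(n)])

# Synthetic stand-ins for an EEG channel and an eye-movement trace that
# share a common, slowly varying information content (purely illustrative)
rng = np.random.default_rng(0)
n, win = 5000, 250
modulation = np.repeat(np.linspace(0.2, 2.0, n // win), win)
eeg = modulation * rng.standard_normal(n)
eye = modulation * rng.standard_normal(n)

h_eeg = windowed_entropy(eeg, win)
h_eye = windowed_entropy(eye, win)
corr = np.corrcoef(h_eeg, h_eye)[0, 1]   # correlation of entropy time courses
```

Because both synthetic signals inherit the same amplitude modulation, their windowed entropies co-vary and the correlation comes out strongly positive, mirroring the kind of coupling the abstract reports.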


2019
Vol 116 (6)
pp. 2027-2032
Author(s):
Jasper H. Fabius
Alessio Fracasso
Tanja C. W. Nijboer
Stefan Van der Stigchel

Humans move their eyes several times per second, yet we perceive the outside world as continuous despite the sudden disruptions created by each eye movement. To date, the mechanism that the brain employs to achieve visual continuity across eye movements remains unclear. While it has been proposed that the oculomotor system quickly updates and informs the visual system about the upcoming eye movement, behavioral studies investigating the time course of this updating suggest the involvement of a slow mechanism, estimated to take more than 500 ms to operate effectively. This is a surprisingly slow estimate, because both the visual system and the oculomotor system process information faster. If spatiotopic updating is indeed this slow, it cannot contribute to perceptual continuity, because it is outside the temporal regime of typical oculomotor behavior. Here, we argue that the behavioral paradigms that have been used previously are suboptimal to measure the speed of spatiotopic updating. In this study, we used a fast gaze-contingent paradigm, using high phi as a continuous stimulus across eye movements. We observed fast spatiotopic updating within 150 ms after stimulus onset. The results suggest the involvement of a fast updating mechanism that predictively influences visual perception after an eye movement. The temporal characteristics of this mechanism are compatible with the rate at which saccadic eye movements are typically observed in natural viewing.


1998
Vol 9 (5)
pp. 379-385
Author(s):
Jan Theeuwes
Arthur F. Kramer
Sowon Hahn
David E. Irwin

Observers make rapid eye movements to examine the world around them. Before an eye movement is made, attention is covertly shifted to the location of the object of interest. The eyes typically will land at the position at which attention is directed. Here we report that a goal-directed eye movement toward a uniquely colored object is disrupted by the appearance of a new but task-irrelevant object, unless subjects have a sufficient amount of time to focus their attention on the location of the target prior to the appearance of the new object. In many instances, the eyes started moving toward the new object before gaze started to shift to the color-singleton target. The eyes often landed for a very short period of time (25–150 ms) near the new object. The results suggest parallel programming of two saccades: one voluntary, goal-directed eye movement toward the color-singleton target and one stimulus-driven eye movement reflexively elicited by the appearance of the new object. Neuroanatomical structures responsible for parallel programming of saccades are discussed.


Perception
10.1068/p3470
2003
Vol 32 (7)
pp. 793-804
Author(s):
Nicholas J Wade
Benjamin W Tatler
Dieter Heller

Dodge, in 1916, suggested that the French term ‘saccade’ should be used for describing the rapid movements of the eyes that occur while reading. Previously he had referred to these as type I movements. Javal had used the term ‘saccade’ in 1879, when describing experiments conducted in his laboratory by Lamare. Accordingly, Javal has been rightly credited with assigning the term to rapid eye movements. In English these rapid rotations had been called jerks, and they had been observed and measured before Lamare's studies of reading. Rapid sweeps of the eyes occur as one phase of nystagmus; they were observed by Wells in 1792 who used an afterimage technique, and they were illustrated by Crum Brown in 1878. Afterimages were used in nineteenth-century research on eye movements and eye position; they were also employed by Hering in 1879, to ascertain how the eyes moved during reading. In the previous year, Javal had employed afterimages in his investigations of reading, but this was to demonstrate that the eyes moved horizontally rather than vertically. Hering's and Lamare's auditory method established the discontinuous nature of eye movements during reading, and the photographic methods introduced by Dodge and others in the early twentieth century enabled their characteristics to be determined with greater accuracy.


2015
Vol 112 (40)
pp. E5523-E5532
Author(s):
Peter T. Weir
Michael H. Dickinson

Although anatomy is often the first step in assigning functions to neural structures, it is not always clear whether architecturally distinct regions of the brain correspond to operational units. Whereas neuroarchitecture remains relatively static, functional connectivity may change almost instantaneously according to behavioral context. We imaged panneuronal responses to visual stimuli in a highly conserved central brain region in the fruit fly, Drosophila, during flight. In one substructure, the fan-shaped body, automated analysis revealed three layers that were unresponsive in quiescent flies but became responsive to visual stimuli when the animal was flying. The responses of these regions to a broad suite of visual stimuli suggest that they are involved in the regulation of flight heading. To identify the cell types that underlie these responses, we imaged activity in sets of genetically defined neurons with arborizations in the targeted layers. The responses of this collection during flight also segregated into three sets, confirming the existence of three layers, and they collectively accounted for the panneuronal activity. Our results provide an atlas of flight-gated visual responses in a central brain circuit.


Perception
1989
Vol 18 (2)
pp. 257-264
Author(s):
Catherine Neary
Arnold J Wilkins

When a rapid eye movement (saccade) is made across material displayed on cathode ray tube monitors with short-persistence phosphors, various perceptual phenomena occur. The phenomena do not occur when the monitor has a long-persistence phosphor. These phenomena were observed for certain spatial arrays, their possible physiological basis noted, and their effect on the control of eye movements examined. When the display consisted simply of two dots, and a saccade was made from one to the other, a transient ghost image was seen just beyond the destination target. When the display consisted of vertical lines, tilting and displacement of the lines occurred. The phenomena were more intrusive for the latter display and there was a significant increase in the number of corrective saccades. These results are interpreted in terms of the effects of fluctuating illumination (and hence phosphor persistence) on saccadic suppression.

