MST Neurons Code for Visual Motion in Space Independent of Pursuit Eye Movements

2007 ◽  
Vol 97 (5) ◽  
pp. 3473-3483 ◽  
Author(s):  
Naoko Inaba ◽  
Shigeru Shinomoto ◽  
Shigeru Yamane ◽  
Aya Takemura ◽  
Kenji Kawano

When a person tracks a small moving object, the visual images in the background of the visual scene move across the retina. It is nevertheless possible to estimate the actual motion of those images despite the eye-movement-induced motion. To understand the neural mechanism that reconstructs a stable visual world independent of eye movements, we explored areas MT (middle temporal) and MST (medial superior temporal) in the monkey cortex, both of which are known to be essential for visual motion analysis. We recorded the responses of neurons to a moving textured image that appeared briefly on the screen while the monkeys were performing smooth pursuit or stationary fixation tasks. Although neurons in both areas exhibited significant responses to the motion of the textured image with directional selectivity, the responses of MST neurons were mostly correlated with the motion of the image on the screen independent of pursuit eye movement, whereas the responses of MT neurons were mostly correlated with the motion of the image on the retina. Thus these MST neurons were more likely than MT neurons to distinguish between external and self-induced motion. The results are consistent with the idea that MST neurons code for visual motion in the external world while compensating for the counter-rotation of retinal images due to pursuit eye movements.
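The screen-vs-retina distinction at the core of this result is a one-line bookkeeping of velocities: pursuit subtracts eye velocity from screen velocity to give retinal velocity, so a signal that adds an eye-velocity (efference-copy) term back recovers motion on the screen. A minimal Python sketch; the additive model and all numbers are illustrative assumptions, not the recorded neurons' computation:

```python
# Hypothetical velocities in deg/s (rightward positive); values are illustrative.
screen_velocity = 5.0   # textured background image moving on the screen
eye_velocity = 10.0     # smooth pursuit of a separate small target

# Pursuit sweeps the scene across the retina in the opposite direction:
retinal_velocity = screen_velocity - eye_velocity   # -5.0 deg/s

# An MT-like signal follows retinal motion; an MST-like signal that adds an
# eye-velocity (efference-copy) term recovers motion in the external world:
mt_like = retinal_velocity                 # tracks the retina: -5.0
mst_like = retinal_velocity + eye_velocity  # tracks the screen: 5.0

print(mt_like, mst_like)  # -5.0 5.0
```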

1988 ◽  
Vol 60 (3) ◽  
pp. 940-965 ◽  
Author(s):  
M. R. Dursteler ◽  
R. H. Wurtz

1. Previous experiments have shown that punctate chemical lesions within the middle temporal area (MT) of the superior temporal sulcus (STS) produce deficits in the initiation and maintenance of pursuit eye movements (10, 34). The present experiments were designed to test the effect of such chemical lesions in an area within the STS to which MT projects, the medial superior temporal area (MST). 2. We injected ibotenic acid into localized regions of MST, and we observed two deficits in pursuit eye movements, a retinotopic deficit and a directional deficit. 3. The retinotopic deficit in pursuit initiation was characterized by the monkey's inability to match eye speed to target speed or to adjust the amplitude of the saccade made to acquire the target to compensate for target motion. This deficit was related to the initiation of pursuit to targets moving in any direction in the visual field contralateral to the side of the brain with the lesion. This deficit was similar to the deficit we found following damage to extrafoveal MT except that the affected area of the visual field frequently extended throughout the entire contralateral visual field tested. 4. The directional deficit in pursuit maintenance was characterized by a failure to match eye speed to target speed once the fovea had been brought near the moving target. This deficit occurred only when the target was moving toward the side of the lesion, regardless of whether the target began to move in the ipsilateral or contralateral visual field. There was no deficit in the amplitude of saccades made to acquire the target, or in the amplitude of the catch-up saccades made to compensate for the slowed pursuit. The directional deficit is similar to the one we described previously following chemical lesions of the foveal representation in the STS. 5. Retinotopic deficits resulted from any of our injections in MST. 
Directional deficits resulted from lesions limited to subregions within MST, particularly lesions that invaded the floor of the STS and the posterior bank of the STS just lateral to MT. Extensive damage to the densely myelinated area of the anterior bank or to the posterior parietal area on the dorsal lip of the anterior bank produced minimal directional deficits. 6. We conclude that damage to visual motion processing in MST underlies the retinotopic pursuit deficit just as it does in MT. MST appears to be a sequential step in visual motion processing that occurs before all of the visual motion information is transmitted to the brainstem areas related to pursuit. (ABSTRACT TRUNCATED AT 400 WORDS)


1997 ◽  
Vol 14 (2) ◽  
pp. 323-338 ◽  
Author(s):  
Vincent P. Ferrera ◽  
Stephen G. Lisberger

As a step toward understanding the mechanism by which targets are selected for smooth-pursuit eye movements, we examined the behavior of the pursuit system when monkeys were presented with two discrete moving visual targets. Two rhesus monkeys were trained to select a small moving target identified by its color in the presence of a moving distractor of another color. Smooth-pursuit eye movements were quantified in terms of the latency of the eye movement and the initial eye acceleration profile. We have previously shown that the latency of smooth pursuit, which is normally around 100 ms, can be extended to 150 ms or shortened to 85 ms depending on whether there is a distractor moving in the opposite or same direction, respectively, relative to the direction of the target. We have now measured this effect for a 360 deg range of distractor directions, and distractor speeds of 5–45 deg/s. We have also examined the effect of varying the spatial separation and temporal asynchrony between target and distractor. The results indicate that the effect of the distractor on the latency of pursuit depends on its direction of motion, and its spatial and temporal proximity to the target, but depends very little on the speed of the distractor. Furthermore, under the conditions of these experiments, the direction of the eye movement that is emitted in response to two competing moving stimuli is not a vectorial combination of the stimulus motions, but is solely determined by the direction of the target. The results are consistent with a competitive model for smooth-pursuit target selection and suggest that the competition takes place at a stage of the pursuit pathway that is between visual-motion processing and motor-response preparation.


2020 ◽  
Author(s):  
Xiuyun Wu ◽  
Austin C. Rothwell ◽  
Miriam Spering ◽  
Anna Montagnini

Smooth pursuit eye movements and visual motion perception rely on the integration of current sensory signals with past experience. Experience shapes our expectation of current visual events and can drive eye movement responses made in anticipation of a target, such as anticipatory pursuit. Previous research revealed consistent effects of expectation on anticipatory pursuit—eye movements follow the expected target direction or speed—and contrasting effects on motion perception, but most studies considered either eye movement or perceptual responses. The current study directly compared effects of direction expectation on perception and anticipatory pursuit within the same direction discrimination task to investigate whether both types of responses are affected similarly or differently. Observers (n = 10) viewed high-coherence random-dot kinematograms (RDKs) moving rightward and leftward with a probability of 50, 70, or 90% in a given block of trials to build up an expectation of motion direction. They were asked to judge the motion direction of interleaved low-coherence RDKs (0–15%). Perceptual judgments were compared to changes in anticipatory pursuit eye movements as a function of probability. Results show that anticipatory pursuit velocity scaled with probability and followed direction expectation (attraction bias), whereas perceptual judgments were biased opposite to direction expectation (repulsion bias). Control experiments suggest that the repulsion bias in perception was not caused by retinal slip induced by anticipatory pursuit, or by motion adaptation. We conclude that direction expectation can be processed differently for perception and anticipatory pursuit.


2002 ◽  
Vol 88 (5) ◽  
pp. 2630-2647 ◽  
Author(s):  
Krishna V. Shenoy ◽  
James A. Crowell ◽  
Richard A. Andersen

When we move forward the visual images on our retinas expand. Humans rely on the focus, or center, of this expansion to estimate their direction of self-motion or heading and, as long as the eyes are still, the retinal focus corresponds to the heading. However, smooth pursuit eye movements add visual motion to the expanding retinal image and displace the focus of expansion. In spite of this, humans accurately judge their heading during pursuit eye movements even though the retinal focus no longer corresponds to the heading. Recent studies in macaque suggest that correction for pursuit may occur in the dorsal aspect of the medial superior temporal area (MSTd); neurons in this area are tuned to the retinal position of the focus and they modify their tuning to partially compensate for the focus shift caused by pursuit. However, the question remains whether these neurons shift focus tuning more at faster pursuit speeds, to compensate for the larger focus shifts created by faster pursuit. To investigate this question, we recorded from 40 MSTd neurons while monkeys made pursuit eye movements at a range of speeds across simulated self- or object motion displays. We found that most MSTd neurons modify their focus tuning more at faster pursuit speeds, consistent with the idea that they encode heading and other motion parameters regardless of pursuit speed. Across the population, the median rate of compensation increase with pursuit speed was 51% as great as required for perfect compensation. We recorded from the same neurons in a simulated pursuit condition, in which gaze was fixed but the entire display counter-rotated to produce the same retinal image as during real pursuit. This condition yielded the result that retinal cues contribute to pursuit compensation; the rate of compensation increase was 30% of that required for accurate encoding of heading. 
The difference between these two conditions was significant (P < 0.05), indicating that extraretinal cues also contribute significantly. We found a systematic antialignment between preferred pursuit and preferred visual motion directions. Neurons may use this antialignment to combine retinal and extraretinal compensatory cues. These results indicate that many MSTd neurons compensate for pursuit velocity, that is, for pursuit direction (as previously reported) and for pursuit speed, and further implicate MSTd as a critical stage in the computation of egomotion.


1994 ◽  
Vol 72 (1) ◽  
pp. 150-162 ◽  
Author(s):  
R. J. Krauzlis ◽  
S. G. Lisberger

1. Our goal was to assess whether visual motion signals related to changes in image velocity contribute to pursuit eye movements. We recorded the smooth eye movements evoked by ramp target motion at constant speed. In two different kinds of stimuli, the onset of target motion provided either an abrupt, step change in target velocity or a smooth target acceleration that lasted 125 ms followed by prolonged target motion at constant velocity. We measured the eye acceleration in the first 100 ms of pursuit. Because of the 100-ms latency from the onset of visual stimuli to the onset of smooth eye movement, the eye acceleration in this 100-ms interval provides an estimate of the open-loop response of the visuomotor pathways that drive pursuit. 2. For steps of target velocity, eye acceleration in the first 100 ms of pursuit depended on the “motion onset delay,” defined as the interval between the appearance of the target and the onset of motion. If the motion onset delay was > 100 ms, then the initial eye movement consisted of separable early and late phases of eye acceleration. The early phase dominated eye acceleration in the interval from 0 to 40 ms after pursuit onset and was relatively insensitive to image speed. The late phase dominated eye acceleration in the interval 40–100 ms after the onset of pursuit and had an amplitude that was proportional to image speed. If there was no delay between the appearance of the target and the onset of its motion, then the early component was not seen, and eye acceleration was related to target speed throughout the first 100 ms of pursuit. 3. For step changes of target velocity, the relationship between eye acceleration in the first 40 ms of pursuit and target velocity saturated at target speeds > 10 deg/s. In contrast, the relationship was nearly linear when eye acceleration was measured in the interval 40–100 ms after the onset of pursuit.
We suggest that the first 40 ms of pursuit are driven by a transient visual motion input that is related to the onset of target motion (motion onset transient component) and that the next 60 ms are driven by a sustained visual motion input (image velocity component). 4. When the target accelerated smoothly for 125 ms before moving at constant speed, the initiation of pursuit resembled that evoked by steps of target velocity. However, the latency of pursuit was consistently longer for smooth target accelerations than for steps of target velocity. (ABSTRACT TRUNCATED AT 400 WORDS)


2009 ◽  
Vol 101 (6) ◽  
pp. 3012-3030 ◽  
Author(s):  
Xin Huang ◽  
Stephen G. Lisberger

Smooth-pursuit eye movements are variable, even when the same tracking target motion is repeated many times. We asked whether variation in pursuit could arise from noise in the response of visual motion neurons in the middle temporal visual area (MT). In physiological experiments, we evaluated the mean, variance, and trial-by-trial correlation in the spike counts of pairs of simultaneously recorded MT neurons. The correlations between responses of pairs of MT neurons are highly significant and are stronger when the two neurons in a pair have similar preferred speeds, directions, or receptive field locations. Spike count correlation persists when the same exact stimulus form is repeatedly presented. Spike count correlations increase as the analysis window increases because of correlations in the responses of individual neurons across time. Spike count correlations are highest at speeds below the preferred speeds of the neuron pair and increase as the contrast of a square-wave grating is decreased. In computational analyses, we evaluated whether the correlations and variation across the population response in MT could drive the observed behavioral variation in pursuit direction and speed. We created model population responses that mimicked the mean and variance of MT neural responses as well as the observed structure and amplitude of noise correlations between pairs of neurons. A vector-averaging decoding computation revealed that the observed variation in pursuit could arise from the MT population response, without postulating other sources of motor variation.
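The decoding computation named in this abstract, vector averaging, can be made concrete with a toy population. A minimal Python sketch assuming a hypothetical bank of Gaussian-tuned units with independent Poisson noise; the paper's model populations additionally matched the measured noise correlations, which this sketch omits, so it only illustrates how spiking noise alone propagates into decoded direction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of 100 direction-tuned "MT" units; preferred
# directions tile 0-360 deg. All parameter values are illustrative.
n_units = 100
preferred = np.linspace(0.0, 360.0, n_units, endpoint=False)  # deg
sigma = 40.0   # tuning width, deg
peak = 30.0    # peak spike count per trial

def ang_diff(a, b):
    """Smallest signed angular difference a - b, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def population_response(direction):
    """Gaussian tuning plus independent Poisson spiking noise."""
    rates = peak * np.exp(-0.5 * (ang_diff(direction, preferred) / sigma) ** 2)
    return rng.poisson(rates)

def vector_average(counts):
    """Decode direction as the spike-count-weighted circular mean of the
    units' preferred directions (the vector-averaging computation)."""
    theta = np.deg2rad(preferred)
    x = np.sum(counts * np.cos(theta))
    y = np.sum(counts * np.sin(theta))
    return np.rad2deg(np.arctan2(y, x)) % 360.0

# Repeated presentations of the same 90-deg motion: trial-to-trial spiking
# noise alone yields variable decoded directions, analogous to behavioral
# variation in pursuit direction across identical trials.
estimates = np.array(
    [vector_average(population_response(90.0)) for _ in range(500)]
)
errors = ang_diff(estimates, 90.0)
print(round(float(np.mean(errors)), 2), round(float(np.std(errors)), 2))
```

The decoded direction is unbiased on average, but its trial-to-trial scatter comes entirely from the noisy population response, which is the logic of the paper's computational analysis.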


2001 ◽  
Vol 18 (3) ◽  
pp. 365-376 ◽  
Author(s):  
RICHARD J. KRAUZLIS ◽  
SCOTT A. ADLER

Expectations about future motions can influence both perceptual judgements and pursuit eye movements. However, it is not known whether these two effects are due to shared processing, or to separate mechanisms with similar properties. We have addressed this question by providing subjects with prior information about the likely direction of motion in an upcoming random-dot motion display and measuring both the perceptual judgements and pursuit eye movements elicited by the stimulus. We quantified the subjects' responses by computing oculometric curves from their pursuit eye movements and psychometric curves from their perceptual decisions. Our results show that directional cues caused similar shifts in both the oculometric and psychometric curves toward the expected motion direction, with little change in the shapes of the curves. Prior information therefore biased the outcome of both eye movement and perceptual decisions without systematically changing their thresholds. We also found that eye movement and perceptual decisions tended to be the same on a trial-by-trial basis, at a higher frequency than would be expected by chance. Furthermore, the effects of prior information were evident during pursuit initiation, as well as during pursuit maintenance, indicating that prior information likely influenced the early processing of visual motion. We conclude that, in our experiments, expectations caused similar effects on both pursuit and perception by altering the activity of visual motion detectors that are read out by both the oculomotor and perceptual systems. Applying cognitive factors such as expectations at relatively early stages of visual processing could act to coordinate the metrics of eye movements with perceptual judgements.
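Oculometric and psychometric curves of the kind compared in this abstract are sigmoid fits to binary decisions, summarized by a horizontal position (bias) and a slope (threshold); a cue that shifts the curve without changing its shape biases decisions without changing sensitivity. A hedged sketch with simulated choices, using a logistic curve and a maximum-likelihood grid search; all parameter values are illustrative, not the subjects' data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Signed motion strengths (negative = leftward), 100 trials per level.
coherences = np.repeat(
    np.array([-16, -8, -4, -2, 0, 2, 4, 8, 16], dtype=float), 100
)

def simulate(bias, slope):
    """Bernoulli 'rightward' choices from a logistic psychometric curve;
    `bias` is the curve's horizontal position, `slope` its spread."""
    p = 1.0 / (1.0 + np.exp(-(coherences - bias) / slope))
    return (rng.random(coherences.size) < p).astype(float)

def fit(choices):
    """Maximum-likelihood grid search for (bias, slope) of the logistic."""
    best, best_ll = (0.0, 1.0), -np.inf
    for mu in np.linspace(-8.0, 8.0, 161):
        for s in np.linspace(1.0, 12.0, 45):
            p = np.clip(
                1.0 / (1.0 + np.exp(-(coherences - mu) / s)), 1e-9, 1 - 1e-9
            )
            ll = np.sum(choices * np.log(p) + (1.0 - choices) * np.log(1.0 - p))
            if ll > best_ll:
                best, best_ll = (mu, s), ll
    return best

# Neutral block vs. a block where rightward motion is cued: the cue is
# modeled purely as a leftward shift of the curve (more rightward choices
# at every coherence), with the slope unchanged.
mu_neutral, s_neutral = fit(simulate(bias=0.0, slope=4.0))
mu_cued, s_cued = fit(simulate(bias=-3.0, slope=4.0))
print(mu_neutral, mu_cued)
```

The recovered curves differ in bias but not appreciably in slope, the signature the abstract reports for both the oculometric and psychometric data.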


2009 ◽  
Vol 101 (2) ◽  
pp. 934-947 ◽  
Author(s):  
Masafumi Ohki ◽  
Hiromasa Kitazawa ◽  
Takahito Hiramatsu ◽  
Kimitake Kaga ◽  
Taiko Kitamura ◽  
...  

The anatomical connection between the frontal eye field and the cerebellar hemispheric lobule VII (H-VII) suggests a potential role of the hemisphere in voluntary eye movement control. To reveal the involvement of the hemisphere in smooth pursuit and saccade control, we made a unilateral lesion around H-VII and examined its effects in three Macaca fuscata that were trained to visually pursue a small target. In response to step (3°)–ramp (5–20°/s) target motion, the monkeys usually showed an initial pursuit eye movement at a latency of 80–140 ms and a small catch-up saccade at 140–220 ms that was followed by a postsaccadic pursuit eye movement that roughly matched the ramp target velocity. After unilateral cerebellar hemispheric lesioning, the initial pursuit eye movements were impaired, and the velocities of the postsaccadic pursuit eye movements decreased. The onsets of 5° visually guided saccades to the stationary target were delayed, and their amplitudes tended to show increased trial-to-trial variability but never became hypo- or hypermetric. Similar tendencies were observed in the onsets and amplitudes of catch-up saccades. The adaptation of open-loop smooth pursuit velocity, tested by a step increase in target velocity for a brief period, was impaired. These lesion effects were observed in all directions, most markedly in the ipsiversive direction. A recovery was observed at 4 wk postlesion for some of these lesion effects. These results suggest that the cerebellar hemispheric region around lobule VII is involved in the control of smooth pursuit and saccadic eye movements.


2009 ◽  
Vol 102 (4) ◽  
pp. 2013-2025 ◽  
Author(s):  
Leslie C. Osborne ◽  
Stephen G. Lisberger

To probe how the brain integrates visual motion signals to guide behavior, we analyzed the smooth pursuit eye movements evoked by target motion with a stochastic component. When each dot of a texture executed an independent random walk such that speed or direction varied across the spatial extent of the target, pursuit variance increased as a function of the variance of visual pattern motion. Noise in either target direction or speed increased the variance of both eye speed and direction, implying a common neural noise source for estimating target speed and direction. Spatial averaging was inefficient for targets with >20 dots. Together these data suggest that pursuit performance is limited by the properties of spatial averaging across a noisy population of sensory neurons rather than across the physical stimulus. When targets executed a spatially uniform random walk in time around a central direction of motion, an optimized linear filter that describes the transformation of target motion into eye motion accounted for ∼50% of the variance in pursuit. Filters had widths of ∼25 ms, much longer than the impulse response of the eye, and filter shape depended on both the range and correlation time of motion signals, suggesting that filters were products of sensory processing. By quantifying the effects of different levels of stimulus noise on pursuit, we have provided rigorous constraints for understanding sensory population decoding. We have shown how temporal and spatial integration of sensory signals converts noisy population responses into precise motor responses.
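The optimized linear filter described in this abstract amounts to a least-squares regression of eye motion on lagged copies of target motion. A minimal sketch with synthetic data, assuming white-noise target perturbations and an arbitrary noise level (the experiments used random walks with controlled correlation times, and fit real eye traces rather than simulated ones):

```python
import numpy as np

rng = np.random.default_rng(1)

T = 4000   # samples of simulated tracking (illustrative)
L = 25     # filter length in samples, on the order of the ~25 ms width

# Target motion perturbation: white noise here, purely for simplicity.
target = rng.normal(0.0, 1.0, T)

# "Ground-truth" filter used to synthesize eye motion; in the analysis the
# filter is estimated from data, here we only check that least squares
# recovers a known one.
true_filter = np.hanning(L)
true_filter /= true_filter.sum()

clean_eye = np.convolve(target, true_filter, mode="full")[:T]
eye = clean_eye + rng.normal(0.0, np.std(clean_eye), T)  # added motor noise

# Design matrix of lagged target values; solving the least-squares problem
# yields the optimized linear filter from target motion to eye motion.
X = np.column_stack(
    [np.concatenate([np.zeros(k), target[: T - k]]) for k in range(L)]
)
fitted, *_ = np.linalg.lstsq(X, eye, rcond=None)

pred = X @ fitted
r2 = 1.0 - np.var(eye - pred) / np.var(eye)  # fraction of variance explained
print(round(float(r2), 2))
```

With the noise standard deviation set equal to that of the filtered signal, the fitted filter accounts for roughly half the variance, echoing the ~50% figure in the abstract, while still recovering the underlying filter shape.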


Author(s):  
Tianyi Yan ◽  
Jinglong Wu

In humans, functional imaging studies have found a homolog of the macaque motion complex, MT+, which is suggested to contain both the middle temporal (MT) and medial superior temporal (MST) areas in the ascending limb of the inferior temporal sulcus. In the macaque, the motion-sensitive MT and MST areas are adjacent in the superior temporal sulcus. Electrophysiology has identified several motion-selective regions in the superior temporal sulcus (STS) of the macaque. Two of the best-studied areas are MT and MST. The MT area has strong projections to the adjacent MST area, which is typically subdivided into the dorsal (MSTd) and lateral (MSTl) subregions. While MT encodes the basic elements of motion, MST has higher-order motion-processing abilities and has been implicated in the perception of both object motion and self-motion. The macaque MST area has been shown to have considerably larger receptive fields than the MT area. The receptive fields of MT cells typically extend only a few degrees into the ipsilateral visual field, while MST neurons have receptive fields that extend well into the ipsilateral visual field. This study tentatively identifies these subregions as the human homologs of the macaque MT and MST areas, respectively (Fig. 1). Putative human MT and MST areas were typically located on the posterior/ventral and anterior/dorsal banks of a dorsal/posterior limb of the inferior temporal sulcus. These locations are similar to their relative positions in the macaque superior temporal sulcus.

