Object Motion Computation for the Initiation of Smooth Pursuit Eye Movements in Humans

2005 ◽ Vol 93 (4) ◽ pp. 2279-2293
Author(s): Julian M. Wallace, Leland S. Stone, Guillaume S. Masson

Pursuing an object with smooth eye movements requires an accurate estimate of its two-dimensional (2D) trajectory. This 2D motion computation requires that different local motion measurements be extracted and combined to recover the global object-motion direction and speed. Several combination rules have been proposed, such as vector averaging (VA), intersection of constraints (IOC), or 2D feature tracking (2DFT). To examine this computation, we investigated the time course of smooth pursuit eye movements driven by simple objects of different shapes. For a type II diamond (where the direction of true object motion differs dramatically from the vector average of the one-dimensional (1D) edge motions, i.e., VA ≠ IOC = 2DFT), ocular tracking is initiated in the vector-average direction. Over a period of less than 300 ms, the eye-tracking direction converges on the true object motion. The reduction of the tracking error starts before the closing of the oculomotor loop. For type I diamonds (where the direction of true object motion is identical to the vector-average direction, i.e., VA = IOC = 2DFT), there is no such bias. We quantified this effect by calculating the direction error between responses to types I and II and measuring its maximum value and time constant. At low contrast and high speeds, the initial bias in tracking direction is larger and takes longer to converge onto the actual object-motion direction. This effect is attenuated by the introduction of more 2D information, to the extent that it is totally obliterated with a texture-filled type II diamond. These results suggest a flexible 2D computation for motion integration, which combines all available 1D (edge) and 2D (feature) motion information to refine the estimate of object-motion direction over time.
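The VA and IOC combination rules contrasted in this abstract can be sketched numerically. Below is a minimal illustration, not code from the paper: the edge-normal orientations and the rightward velocity are invented to mimic a type II configuration, in which both edge normals fall on the same side of the true motion direction, so the vector average is biased while the intersection of constraints recovers the true velocity.

```python
import numpy as np

def edge_measurements(v, normals):
    """1D (edge) motion measurements: component of true velocity along each edge normal."""
    return normals @ v

def vector_average(normals, c):
    """VA: mean of the normal-velocity vectors implied by each edge."""
    return np.mean(c[:, None] * normals, axis=0)

def intersection_of_constraints(normals, c):
    """IOC: the unique 2D velocity satisfying every constraint line u . n_i = c_i."""
    return np.linalg.solve(normals, c)  # two independent edge orientations assumed

v_true = np.array([1.0, 0.0])            # rightward object motion (0 deg)
theta = np.deg2rad([20.0, 70.0])         # edge-normal directions (illustrative values)
normals = np.stack([np.cos(theta), np.sin(theta)], axis=1)

c = edge_measurements(v_true, normals)
va = vector_average(normals, c)
ioc = intersection_of_constraints(normals, c)

angle = lambda u: np.degrees(np.arctan2(u[1], u[0]))
# IOC recovers the true 0-deg direction; VA is pulled roughly 30 deg toward the normals,
# mirroring the initial pursuit bias the abstract reports for type II diamonds.
```

For a type I configuration (edge normals symmetric about the motion direction), the same `vector_average` call would return the true direction, which is why no initial bias is seen there.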

2006 ◽ Vol 96 (6) ◽ pp. 3545-3550
Author(s): Anna Montagnini, Miriam Spering, Guillaume S. Masson

Smooth pursuit eye movements reflect the temporal dynamics of bidimensional (2D) visual motion integration. When tracking a single, tilted line, initial pursuit direction is biased toward unidimensional (1D) edge motion signals, which are orthogonal to the line orientation. Over 200 ms, tracking direction is slowly corrected to finally match the 2D object motion during steady-state pursuit. We now show that repetition of line orientation and/or motion direction neither eliminates the transient tracking-direction error nor changes the time course of pursuit correction. Nonetheless, multiple successive presentations of a single orientation/direction condition elicit robust anticipatory pursuit eye movements that always go in the 2D object-motion direction, not the 1D edge-motion direction. These results demonstrate that predictive signals about target motion cannot be used for an efficient integration of ambiguous velocity signals at pursuit initiation.


2007 ◽ Vol 97 (2) ◽ pp. 1353-1367
Author(s): Miriam Spering, Karl R. Gegenfurtner

Segregating a moving object from its visual context is particularly relevant for the control of smooth-pursuit eye movements. We examined the interaction between a moving object and a stationary or moving visual context to determine the role of the context motion signal in driving pursuit. Eye movements were recorded from human observers tracking a medium-contrast Gaussian dot that moved horizontally at constant velocity. A peripheral context consisted of two vertically oriented sinusoidal gratings, one above and one below the stimulus trajectory, that were either stationary or drifted in the same or the opposite direction as the target at different velocities. We found that a stationary context impaired pursuit acceleration and velocity and prolonged pursuit latency. A drifting context enhanced pursuit performance, irrespective of its motion direction. This effect was modulated by context contrast and orientation. When the context was briefly perturbed to move faster or slower, eye velocity changed accordingly, but only when the context was drifting along with the target. Perturbing the context in the direction orthogonal to target motion evoked a deviation of the eye opposite to the perturbation direction. We therefore provide evidence for the use of absolute and relative motion cues, or motion assimilation and motion contrast, in the control of smooth-pursuit eye movements.


2005 ◽ Vol 93 (6) ◽ pp. 3418-3433
Author(s): Hui Meng, Andrea M. Green, J. David Dickman, Dora E. Angelaki

Under natural conditions, the vestibular and pursuit systems work synergistically to stabilize the visual scene during movement. How translational vestibular signals [translational vestibuloocular reflex (TVOR)] are processed in the premotor pathways for slow eye movements remains a challenging question. To further our understanding of how premotor neurons contribute to this processing, we recorded neural activity from the prepositus and rostral medial vestibular nuclei in macaque monkeys. Vestibular neurons were tested during 0.5-Hz rotation and lateral translation (both with gaze stable and during VOR cancellation tasks), as well as during smooth pursuit eye movements. Data were collected at two different viewing distances, 80 and 20 cm. Based on their responses to rotation and pursuit, eye-movement-sensitive neurons were classified into position-vestibular-pause (PVP) neurons, eye-head (EH) neurons, and burst-tonic (BT) cells. We found that approximately half of the type II PVP and EH neurons with ipsilateral eye movement preference were modulated during TVOR cancellation. In contrast, few of the EH cells and none of the type I PVP cells with contralateral eye movement preference were modulated during translation in the absence of eye movements; nor did any of the BT neurons change their firing rates during TVOR cancellation. Of the type II PVP and EH neurons that were modulated during TVOR cancellation, firing rates increased for either ipsilateral or contralateral displacement, a property that could not be predicted from their rotational or pursuit responses. In contrast, under stable gaze conditions, all neuron types, including EH cells, were modulated during translation according to their ipsilateral/contralateral preference for pursuit eye movements. Differences in translational response sensitivities for far versus near targets were seen only in type II PVP and EH cells. There was no effect of viewing distance on response phase for any cell type. When expressed relative to motor output, neural sensitivities during translation (although not during rotation) and pursuit were equivalent, particularly at the 20-cm viewing distance. These results suggest that neural activities during the TVOR were more motorlike than cell responses during the rotational vestibuloocular reflex (RVOR). We also found that neural responses under stable gaze conditions could not always be predicted by a linear vectorial addition of the cell activities during pursuit and VOR cancellation. The departure from linearity was more pronounced for the TVOR under near-viewing conditions. These results extend previous observations on the neural processing of otolith signals within the premotor circuitry that generates the RVOR and smooth pursuit eye movements.


2011 ◽ Vol 105 (4) ◽ pp. 1756-1767
Author(s): Miriam Spering, Alexander C. Schütz, Doris I. Braun, Karl R. Gegenfurtner

Success of motor behavior often depends on the ability to predict the path of moving objects. Here we asked whether tracking a visual object with smooth pursuit eye movements helps to predict its motion direction. We developed a paradigm, “eye soccer,” in which observers had to either track or fixate a visual target (ball) and judge whether it would have hit or missed a stationary vertical line segment (goal). Ball and goal were presented briefly for 100–500 ms and disappeared from the screen together before the perceptual judgment was prompted. In pursuit conditions, the ball moved towards the goal; in fixation conditions, the goal moved towards the stationary ball, resulting in similar retinal stimulation during pursuit and fixation. We also tested the condition in which the goal was fixated and the ball moved. Motion direction prediction was significantly better in pursuit than in fixation trials, regardless of whether ball or goal served as fixation target. In both fixation and pursuit trials, prediction performance was better when eye movements were accurate. Performance also increased with shorter ball-goal distance and longer presentation duration. A longer trajectory did not affect performance. During pursuit, an efference copy signal might provide additional motion information, leading to the advantage in motion prediction.


2007 ◽ Vol 98 (3) ◽ pp. 1355-1363
Author(s): Miriam Spering, Karl R. Gegenfurtner

The analysis of visual motion serves many different functions, ranging from object motion perception to the control of self-motion. The perception of visual motion and the oculomotor tracking of a moving object are known to be closely related and are assumed to be controlled by shared brain areas. We compared perceived velocity and the velocity of smooth pursuit eye movements in human observers in a paradigm that required the segmentation of target object motion from context motion. In each trial, a pursuit target and a visual context were simultaneously but independently perturbed to briefly increase or decrease in speed. Observers had to accurately track the target and estimate target speed during the perturbation interval. Here we show that the same motion signals are processed in fundamentally different ways for perception and steady-state smooth pursuit eye movements. For the computation of perceived velocity, motion of the context was subtracted from target motion (motion contrast), whereas pursuit velocity was determined by the motion average (motion assimilation). We conclude that the human motion system uses these computations to optimally accomplish different functions: image segmentation for object motion perception and velocity estimation for the control of smooth pursuit eye movements.
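The two computations this abstract distinguishes can be caricatured in a few lines. This is a hypothetical sketch, not the authors' model: the 50/50 averaging weight and the speed values are illustrative assumptions.

```python
def perceived_velocity(target, context):
    # Motion contrast: context motion is subtracted from target motion,
    # so perception reflects relative motion (image segmentation).
    return target - context

def pursuit_velocity(target, context, w=0.5):
    # Motion assimilation: pursuit follows an average of target and context
    # motion; the weight w is an illustrative assumption, not a fitted value.
    return (1 - w) * target + w * context

# Hypothetical perturbation: target at 12 deg/s, context drifting at 8 deg/s.
target, context = 12.0, 8.0
p_seen = perceived_velocity(target, context)   # contrast: relative speed
p_eye = pursuit_velocity(target, context)      # assimilation: pulled toward context
```

With these numbers, the contrast computation yields a relative speed of 4 deg/s while the assimilation computation yields 10 deg/s, capturing how the same context perturbation can push perception and pursuit in opposite directions.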


2005 ◽ Vol 164 (3) ◽ pp. 376-386
Author(s): Jan L. Souman, Ignace Th. C. Hooge, Alexander H. Wertheim
