Context effects on smooth pursuit and manual interception of a disappearing target

2017 · Vol 118 (1) · pp. 404-415 · Author(s): Philipp Kreyenmeier, Jolande Fooken, Miriam Spering

In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object ("ball") with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated "hit zone." In two experiments (n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points. NEW & NOTEWORTHY In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared. Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points.

2007 · Vol 97 (1) · pp. 761-771 · Author(s): Uwe J. Ilg, Stefan Schumann

The contributions of the medial superior temporal area (MST) in the posterior parietal cortex of rhesus monkeys to the generation of smooth-pursuit eye movements, as well as to motion perception, are well established. Here, we present the first experimental evidence that this area also contributes to the generation of goal-directed hand movements toward a moving target. This evidence is based on the outcome of intracortical microstimulation experiments and transient lesions induced by small injections of muscimol at identified sites within the lateral part of area MST (MST-l). When microstimulation was applied during the execution of smooth-pursuit eye movements, postsaccadic eye velocity increased significantly in the preferred direction of the stimulated site (in 93 of 136 sites tested). When microstimulation was applied during a hand movement trial, the hand movement was displaced significantly in the same direction (in 28 of 39 sites tested). When we transiently lesioned area MST-l by injections of muscimol, steady-state eye velocity was reduced exclusively for ipsiversive smooth-pursuit eye movements. In contrast, hand movements were displaced toward the contralateral side, irrespective of the direction of the moving target. Our results provide evidence that area MST-l is involved in the processing of moving targets and plays a role in the execution of smooth-pursuit eye movements as well as visually guided hand movements.


2020 · Vol 123 (4) · pp. 1439-1447 · Author(s): Jolande Fooken, Miriam Spering

Real-world tasks, such as avoiding obstacles, require a sequence of interdependent choices to reach accurate motor actions. Yet, most studies on primate decision making involve simple one-step choices. Here we analyze motor actions to investigate how sensorimotor decisions develop over time. In a go/no-go interception task, human observers (n = 42) judged whether a briefly presented moving target would pass (interceptive hand movement required) or miss (no hand movement required) a strike box while their eye and hand movements were recorded. Go/no-go decision formation had to occur within the first few hundred milliseconds to allow time-critical interception. We found that the earliest time point at which eye movements started to differentiate actions (go versus no-go) preceded hand movement onset. Moreover, eye movements were related to different stages of decision making. Whereas higher eye velocity during smooth pursuit initiation was related to more accurate interception decisions (whether or not to act), faster pursuit maintenance was associated with more accurate timing decisions (when to act). These results indicate that pursuit initiation and maintenance are continuously linked to ongoing sensorimotor decision formation. NEW & NOTEWORTHY Here we show that eye movements are a continuous indicator of decision processes underlying go/no-go actions. We link different stages of decision formation to distinct oculomotor events during open- and closed-loop smooth pursuit. Critically, the earliest time point at which eye movements differentiate actions preceded hand movement onset, suggesting shared sensorimotor processing for eye and hand movements. These results emphasize the potential of studying eye movements as a readout of cognitive processes.


2019 · Vol 121 (5) · pp. 1967-1976 · Author(s): Niels Gouirand, James Mathew, Eli Brenner, Frederic R. Danion

Adapting hand movements to changes in our body or the environment is essential for skilled motor behavior. Although eye movements are known to assist hand movement control, how they might contribute to the adaptation of hand movements remains largely unexplored. To determine to what extent eye movements contribute to visuomotor adaptation of hand tracking, participants used a joystick to move a cursor that tracked a visual target following an unpredictable trajectory. During blocks of trials, participants were either allowed to look wherever they liked or required to fixate a cross at the center of the screen. Eye movements were tracked to ensure gaze fixation as well as to examine free gaze behavior. The cursor initially responded normally to the joystick, but after several trials the direction in which it responded was rotated by 90°. Although fixating the eyes had a detrimental influence on hand tracking performance, participants exhibited a rather similar time course of adaptation to rotated visual feedback in the gaze-fixed and gaze-free conditions. More importantly, there was extensive transfer of adaptation between the gaze-fixed and gaze-free conditions. We conclude that although eye movements are relevant for the online control of hand tracking, they do not play an important role in the visuomotor adaptation of such tracking. These results suggest that participants do not adapt by changing the mapping between eye and hand movements, but rather by changing the mapping between hand movements and the cursor's motion, independently of eye movements. NEW & NOTEWORTHY Eye movements assist hand movements in everyday activities, but their contribution to visuomotor adaptation remains largely unknown. We compared adaptation of hand tracking under free gaze and fixed gaze. Although our results confirm that following the target with the eyes increases the accuracy of hand movements, they unexpectedly demonstrate that gaze fixation does not hinder adaptation. These results suggest that eye movements make distinct contributions to online control and to visuomotor adaptation of hand movements.
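The 90° visuomotor rotation described above has a simple mathematical core: cursor motion is the joystick input passed through a rotation matrix. The sketch below illustrates that mapping; it is not the study's actual implementation, and the function name and interface are invented for the example.

```python
import math

def rotate_feedback(vx, vy, angle_deg=90.0):
    """Rotate a joystick velocity vector (vx, vy) by angle_deg before
    applying it to the cursor -- the kind of visuomotor rotation used in
    such adaptation experiments. Illustrative sketch only."""
    a = math.radians(angle_deg)
    return (vx * math.cos(a) - vy * math.sin(a),
            vx * math.sin(a) + vy * math.cos(a))

# Under a 90 degree rotation, a rightward joystick deflection moves the
# cursor upward: (1, 0) maps to approximately (0, 1), so participants
# must relearn the hand-to-cursor mapping to track the target.
cx, cy = rotate_feedback(1.0, 0.0)
print(round(cx, 6), round(cy, 6))
```

Adapting to such a perturbation means learning to produce the inverse rotation with the hand, which is exactly the hand-to-cursor remapping the authors argue is learned independently of eye movements.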


2020 · Vol 7 (1) · Author(s): John-Ross Rizzo, Mahya Beheshti, Tahereh Naeimi, Farnia Feiz, Girish Fatterpekar, ...

Background: Eye–hand coordination (EHC) is a sophisticated act that requires interconnected processes governing synchronization of ocular and manual motor systems. Precise, timely and skillful movements such as reaching for and grasping small objects depend on the acquisition of high-quality visual information about the environment and simultaneous eye and hand control. Multiple areas in the brainstem and cerebellum, as well as some frontal and parietal structures, have critical roles in the control of eye movements and their coordination with the head. Although both cortex and cerebellum contribute critical elements to normal eye–hand function, differences in these contributions suggest that there may be separable deficits following injury. Method: As a preliminary assessment of this perspective, we compared eye and hand movement control in a patient with cortical stroke relative to a patient with cerebellar stroke. Result: We found the onset of eye and hand movements to be temporally decoupled, with significant decoupling variance in the patient with cerebellar stroke. In contrast, the patient with cortical stroke displayed increased hand spatial errors and less significant temporal decoupling variance. Increased decoupling variance in the patient with cerebellar stroke was primarily due to unstable timing of rapid eye movements, saccades. Conclusion: These findings highlight a perspective in which facets of eye–hand dyscoordination are dependent on lesion location and may or may not cooperate to varying degrees. Broadly speaking, the results corroborate the general notion that the cerebellum is instrumental to the process of temporal prediction for eye and hand movements, while the cortex is instrumental to the process of spatial prediction, both of which are critical aspects of functional movement control.


2017 · Author(s): Sumitash Jana, Aditya Murthy

Eye and hand movements are often made in isolation, but for reaching movements they are usually coupled together. While previous studies have demonstrated aspects of both kinematic and spatial coupling between eye and hand, few have investigated how saccades shape more complex, curved hand movement trajectories. Here, using a novel obstacle avoidance task in which the obstacle appeared on only a small proportion of trials, we sought to establish the link between saccades and hand trajectories. In the first part of the paper, we show that the direction of the hand trajectory is influenced by the end location of the saccade, despite little temporal coupling between the two effectors. The x-position of the saccade end point was related to whether the hand followed a straight or a curved path, while the y-position of the saccade end point was related to whether the hand passed over or below the obstacle. In the second part of the paper, we establish the link between saccade locations and hand sub-movements, observing that the number and timing of saccades were related to the number of hand velocity peaks. Taken together, these results indicate that saccades can influence complex hand movement trajectories.


2007 · Vol 97 (2) · pp. 1353-1367 · Author(s): Miriam Spering, Karl R. Gegenfurtner

Segregating a moving object from its visual context is particularly relevant for the control of smooth-pursuit eye movements. We examined the interaction between a moving object and a stationary or moving visual context to determine the role of the context motion signal in driving pursuit. Eye movements were recorded from human observers to a medium-contrast Gaussian dot that moved horizontally at constant velocity. A peripheral context consisted of two vertically oriented sinusoidal gratings, one above and one below the stimulus trajectory, that were either stationary or drifted in the same or opposite direction as the target at different velocities. We found that a stationary context impaired pursuit acceleration and velocity and prolonged pursuit latency. A drifting context enhanced pursuit performance, irrespective of its motion direction. This effect was modulated by context contrast and orientation. When a context was briefly perturbed to move faster or slower, eye velocity changed accordingly, but only when the context was drifting along with the target. Perturbing a context in the direction orthogonal to target motion evoked a deviation of the eye opposite to the perturbation direction. We therefore provide evidence for the use of absolute and relative motion cues, or motion assimilation and motion contrast, for the control of smooth-pursuit eye movements.


2020 · Author(s): Friedemann Bunjes, Peter Thier

Although animal research and some rare human case reports suggest that lesions of the dorsal pons yield saccadic and smooth pursuit eye movement deficits, little is known about the functional topology of the human pontine nuclei (PN) and whether limb movements are affected in the same way as eye movements. Saccadic and smooth pursuit eye movements, as well as pointing movements, were measured in six patients with lesions in the PN region. Five of the six patients exhibited dysmetric saccades, while smooth pursuit gain was reduced in four. Pontine lesions also altered the relationship between amplitude, velocity, and velocity skewness of saccadic eye movements. Limb movement trajectories were more curved in four patients. The results suggest that the lesions impair a general calibration mechanism that uses the parallel fiber–Purkinje cell synapse in the cerebellar cortex to adjust the timing of muscle innervation in visually guided oculomotor as well as limb movement tasks.


Leonardo · 2001 · Vol 34 (1) · pp. 35-40 · Author(s): R.C. Miall, John Tchalenko

The mental processes that allow an artist to transform visual images (e.g., those of his model) into a picture on the canvas are not easily studied. The authors report work measuring the eye and hand movements of a single artist, chosen for his detailed and realistic portraits produced from life. His eye fixations when painting or drawing were of twice the duration of those when he was not painting, and also quite different from those of novice artists. His eye-hand coordination pattern also showed differences from that of novices, being more temporally consistent. This preliminary work suggests that detailed and quantitative analysis of a working artist is feasible and will illuminate the process of artistic creation.


2008 · Vol 100 (3) · pp. 1533-1543 · Author(s): J. Randall Flanagan, Yasuo Terao, Roland S. Johansson

People naturally direct their gaze to visible hand movement goals. Doing so improves reach accuracy through use of signals related to gaze position and visual feedback of the hand. Here, we studied where people naturally look when acting on remembered target locations. Four targets were presented on a screen, in peripheral vision, while participants fixated a central cross (encoding phase). Four seconds later, participants used a pen to mark the remembered locations while free to look wherever they wished (recall phase). Visual references, including the screen and the cross, were present throughout. During recall, participants neither looked at the marked locations nor prevented eye movements. Instead, gaze behavior was erratic and was comprised of gaze shifts loosely coupled in time and space with hand movements. To examine whether eye and hand movements during encoding affected gaze behavior during recall, in additional encoding conditions, participants marked the visible targets with either free gaze or with central cross fixation or just looked at the targets. All encoding conditions yielded similar erratic gaze behavior during recall. Furthermore, encoding mode did not influence recall performance, suggesting that participants, during recall, did not exploit sensorimotor memories related to hand and gaze movements during encoding. Finally, we recorded a similarly loose coupling between hand and eye movements during an object manipulation task performed in darkness after participants had viewed the task environment. We conclude that acting on remembered versus visible targets can engage fundamentally different control strategies, with gaze largely decoupled from movement goals during memory-guided actions.


2003 · Vol 9 (1) · pp. 44-54 · Author(s): P Feys, W F Helsen, A Lavrysen, B Nuttin, P Ketelaer

Accurate goal-directed movements toward a visual target require precise coordination of both the oculomotor and limb motor systems. Intention tremor and eye movement deficits are frequently observed in multiple sclerosis (MS). The goal of this study was to examine the characteristics of intention tremor and simultaneously produced eye movements during rapid goal-directed movements. Eye and hand movements were measured synchronously in 16 MS patients with intention tremor and 16 control subjects. Manual performance in the patient group was characterized by delayed onset, slower execution, and aiming inaccuracies. In line with the clinically defined picture of intention tremor, differences between patients and control subjects were most pronounced toward the end of the movement. Dependent variables were markedly greater in MS patients than in control subjects and correlated well with clinical outcome measures. Applying an inertial load to the limb had no effect on intention tremor. In addition to impaired limb coordination, evidence was found that eye movements, too, were abnormal in patients compared with control subjects. Moreover, eye and hand movement deficits seemed to be closely related, suggesting a common underlying command structure. Inaccurate eye movements were likely to hamper accurate motor performance of the hand.

