A common control signal and a ballistic stage can explain the control of coordinated eye-hand movements

2016, Vol 115 (5), pp. 2470-2484
Author(s): Atul Gopal, Aditya Murthy

Voluntary control has been extensively studied in the context of eye and hand movements made in isolation, yet little is known about the nature of control during eye-hand coordination. We probed this with a redirect task in which subjects made reaching/pointing movements accompanied by coordinated eye movements but had to change their plans when, on occasional trials, the target changed its position. Using a race model framework, we found that separate effector-specific mechanisms may be recruited to control eye and hand movements executed in isolation, whereas a unitary mechanism is employed when the same effectors are coordinated. Specifically, performance curves were distinct for the eye and hand when these movements were executed in isolation but were comparable when they were executed together. Second, the time to switch motor plans, called the target step reaction time, differed between the eye-alone and hand-alone conditions but was similar in the coordinated condition under the assumption of a ballistic stage of ∼40 ms, on average. Interestingly, the existence of this ballistic stage could predict the extent of eye-hand dissociations seen in individual subjects. Finally, when subjects were explicitly instructed to control a single effector (eye or hand), redirecting one effector had a strong effect on the performance of the other. Taken together, these results suggest that a common control signal and a ballistic stage are recruited when coordinated eye-hand movement plans require alteration.
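The race-model logic with a ballistic stage can be illustrated with a minimal Monte Carlo sketch: a redirect process races the original movement plan, and the movement can no longer be altered once it is within the ballistic window (~40 ms) of onset. All distribution parameters below are illustrative assumptions, not fitted values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_redirect_failure(tsd, go_mean, go_sd, redirect_mean, redirect_sd,
                       ballistic=40.0, n=10000):
    """Monte Carlo race between the original plan and a redirect process.
    The original movement starts at a random reaction time `go`; the
    redirect finishes at `tsd + redirect`. Once the movement is within
    `ballistic` ms of onset it can no longer be changed."""
    go = rng.normal(go_mean, go_sd, n)                 # planned onset (ms)
    redirect = tsd + rng.normal(redirect_mean, redirect_sd, n)
    point_of_no_return = go - ballistic                # ballistic stage begins
    return float(np.mean(redirect > point_of_no_return))  # failure probability

# failure rate (the performance curve) should rise with target step delay
p_early = p_redirect_failure(50, go_mean=250, go_sd=40,
                             redirect_mean=120, redirect_sd=30)
p_late = p_redirect_failure(150, go_mean=250, go_sd=40,
                            redirect_mean=120, redirect_sd=30)
```

Sweeping `tsd` in such a simulation traces out the performance curve that the paper compares across effectors.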

2018, Vol 120 (3), pp. 1293-1306
Author(s): Prasanna Venkhatesh Venkataratamani, Aditya Murthy

Previous studies have investigated the computational architecture underlying the voluntary control of reach movements that demand a change in the position or direction of movement planning. Here we used a novel task in which subjects had to either increase or decrease their movement speed according to a change in target color that occurred randomly during a trial. The applicability of different race models to such a speed redirect task was assessed. We found that the predictions of an independent race model that instantiated an abort-and-replan strategy were consistent with all aspects of performance in the fast-to-slow speed condition. The results from modeling indicated a peculiar asymmetry: although the fast-to-slow speed change required inhibition, none of the standard race models was able to explain how movements changed from slow to fast speeds. Interestingly, a weighted averaging model that simulated the gradual merging of two kinematic plans explained behavior in the slow-to-fast speed task. In summary, our work shows how a race model framework can provide an understanding of how the brain controls different aspects of reach movement planning and can help distinguish between an abort-and-replan strategy and a merging of plans. NEW & NOTEWORTHY For the first time, a race model framework was used to understand how reach speeds are modified. We provide evidence that a fast-to-slow speed change required aborting the current plan and complete respecification of a new plan, whereas none of the race models was able to explain an instructed increase in hand movement speed, which was instead accomplished by merging a new kinematic plan with the existing one.
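The weighted-averaging account can be sketched as a toy simulation: two bell-shaped speed profiles (the old slow plan and the new fast plan) are blended by a weight that shifts sigmoidally after the color-change cue. The profile shapes, amplitudes, and time constants below are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

def bell_speed(t, amp, duration):
    """Bell-shaped (minimum-jerk-like) speed profile, zero outside [0, duration]."""
    s = np.clip(np.asarray(t, float) / duration, 0.0, 1.0)
    return amp * 30.0 * s**2 * (1.0 - s)**2

def merged_speed(t, t_cue, tau, amp_slow, amp_fast, duration):
    """Weighted average of the old (slow) and new (fast) kinematic plans;
    the weight shifts sigmoidally from the slow to the fast plan after the cue."""
    w = 1.0 / (1.0 + np.exp(-(np.asarray(t, float) - t_cue) / tau))
    return (1.0 - w) * bell_speed(t, amp_slow, duration) \
         + w * bell_speed(t, amp_fast, duration)

t = np.linspace(0.0, 600.0, 601)   # time in ms
v = merged_speed(t, t_cue=200.0, tau=40.0,
                 amp_slow=0.3, amp_fast=0.9, duration=600.0)
```

Before the cue the trajectory follows the slow plan almost exactly; afterward it converges smoothly onto the fast plan rather than aborting and restarting.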


2020, Vol 7 (1)
Author(s): John-Ross Rizzo, Mahya Beheshti, Tahereh Naeimi, Farnia Feiz, Girish Fatterpekar, ...

Abstract Background Eye–hand coordination (EHC) is a sophisticated act that requires interconnected processes governing synchronization of ocular and manual motor systems. Precise, timely and skillful movements such as reaching for and grasping small objects depend on the acquisition of high-quality visual information about the environment and simultaneous eye and hand control. Multiple areas in the brainstem and cerebellum, as well as some frontal and parietal structures, have critical roles in the control of eye movements and their coordination with the head. Although both cortex and cerebellum contribute critical elements to normal eye-hand function, differences in these contributions suggest that there may be separable deficits following injury. Method As a preliminary assessment for this perspective, we compared eye and hand-movement control in a patient with cortical stroke relative to a patient with cerebellar stroke. Result We found the onset of eye and hand movements to be temporally decoupled, with significant decoupling variance in the patient with cerebellar stroke. In contrast, the patient with cortical stroke displayed increased hand spatial errors and less significant temporal decoupling variance. Increased decoupling variance in the patient with cerebellar stroke was primarily due to unstable timing of rapid eye movements, saccades. Conclusion These findings highlight a perspective in which facets of eye-hand dyscoordination are dependent on lesion location and may or may not cooperate to varying degrees. Broadly speaking, the results corroborate the general notion that the cerebellum is instrumental to the process of temporal prediction for eye and hand movements, while the cortex is instrumental to the process of spatial prediction, both of which are critical aspects of functional movement control.


Motor Control, 2016, Vol 20 (3), pp. 316-336
Author(s): Uta Sailer, Florian Güldenpfennig, Thomas Eggert

This study investigated the effect of hand movements on behavioral and electrophysiological parameters of saccade preparation. Event-related potentials were recorded in 17 subjects while they performed saccades to a visual target either together with a hand movement in the same direction, a hand movement in the opposite direction, a hand movement in a third, independent direction, or without any accompanying hand movement. Saccade latencies increased with any kind of accompanying hand movement. Both saccade and manual latencies were largest when the two movements aimed in opposite directions. In contrast, saccade-related potentials indicating preparatory activity were mainly affected by hand movements in the same direction. The data suggest that concomitant hand movements interfere with saccade preparation, particularly when the two movements involve motor preparations that access the same visual stimulus. This indicates that saccade preparation is continually informed about hand movement preparation.


2018, Vol 11 (6)
Author(s): Damla Topalli, Nergiz Ercil Cagiltay

Endoscopic surgery procedures require specific skills, such as eye-hand coordination, to be developed. Current education programs face difficulties in providing appropriate skill-improvement and assessment methods in this field. This study aims to propose objective metrics for hand-movement skills and to assess eye-hand coordination. An experimental study was conducted with 15 surgical residents to test the newly proposed measures. Two computer-based, two-handed endoscopic surgery practice scenarios were developed in a simulation environment to gather the participants’ eye-gaze data with the help of an eye tracker, as well as the related hand-movement data through haptic interfaces. Additionally, participants’ eye-hand coordination skills were analyzed. The results indicate higher correlations between the intermediates’ eye and hand movements compared with the novices’. An increase in the intermediates’ visual concentration led to smoother hand movements, whereas the novices’ hands tended to remain at a standstill. After the first round of practice, all participants’ eye-hand coordination skills improved on the specific task targeted in this study. These results suggest that the proposed metrics can provide additional insight into trainees’ eye-hand coordination skills and help instructional system designers better address training requirements.
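One simple way to quantify the eye-hand correlations reported above is the Pearson correlation between gaze- and hand-position traces; the sketch below is a hypothetical stand-in, as the study's actual metrics may differ.

```python
import numpy as np

def eye_hand_correlation(gaze, hand):
    """Pearson correlation between gaze- and hand-position traces:
    higher values indicate the hand tracking where the eyes look."""
    return float(np.corrcoef(np.asarray(gaze, float),
                             np.asarray(hand, float))[0, 1])

t = np.linspace(0.0, 1.0, 200)
gaze = np.sin(2.0 * np.pi * t)                       # smooth gaze trace
smooth_hand = np.sin(2.0 * np.pi * (t - 0.05))       # hand lags gaze slightly
rng = np.random.default_rng(3)
jerky_hand = smooth_hand + 0.5 * rng.standard_normal(t.size)  # unsteady hand

c_smooth = eye_hand_correlation(gaze, smooth_hand)   # intermediate-like
c_jerky = eye_hand_correlation(gaze, jerky_hand)     # novice-like
```

On this toy data the smoother hand yields the higher coordination score, mirroring the intermediate-versus-novice contrast.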


2017
Author(s): Sumitash Jana, Aditya Murthy

Eye and hand movements are often made in isolation, but for reaching movements they are usually coupled together. While previous studies have demonstrated aspects of both kinematic and spatial coupling between eye and hand, few have investigated the influence of saccades in shaping a more complex, curved hand movement trajectory. Here, using a novel obstacle avoidance task in which the obstacle appeared on an infrequent subset of trials, we try to establish the link between the saccade and the hand trajectory. In the first part of the paper, we illustrate that the hand trajectory direction is influenced by the end location of the saccade, despite little temporal coupling between the two effectors. The x-position of the saccade end-point was related to whether the hand followed a straight or a curved path, while the y-position was related to whether the hand passed over or under the obstacle. In the second part of the paper, we establish the link between saccade locations and hand sub-movements, observing that the number and timing of saccades were related to the number of hand velocity peaks. Taken together, these results indicate that saccades can influence complex hand movement trajectories.


2015, Vol 114 (3), pp. 1438-1454
Author(s): Atul Gopal, Aditya Murthy

Many studies of reaching and pointing have shown significant spatial and temporal correlations between eye and hand movements. Nevertheless, it remains unclear whether these correlations are incidental, arising from common inputs (independent model); whether they represent an interaction between otherwise independent eye and hand systems (interactive model); or whether they arise from a single dedicated eye-hand system (common command model). Subjects were instructed to redirect gaze and pointing movements in a double-step task in an attempt to decouple eye-hand movements and causally distinguish between the three architectures. We used a drift-diffusion framework in the context of a race model, which has previously been used to explain redirect behavior for eye and hand movements separately, to predict the pattern of eye-hand decoupling. We found that the common command architecture best explained the observed frequency of different eye and hand response patterns to the target step. A common stochastic accumulator for eye-hand coordination also predicts comparable variances despite significant differences in the means of the eye and hand reaction time (RT) distributions, a prediction we tested. Consistent with this prediction, the variances of the eye and hand RTs were similar despite much larger hand RTs (∼90 ms). Moreover, changes in mean eye RT, which also increased eye RT variance, produced a similar increase in the mean and variance of the associated hand RT. Taken together, these data suggest that a dedicated circuit underlies coordinated eye-hand planning.
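The common command model's signature prediction, equal eye and hand RT variance despite a large mean offset, follows from a single accumulator triggering both effectors, with the hand lagging by a roughly constant efferent delay. A minimal sketch, with illustrative drift, noise, threshold, and delay values rather than the study's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def common_command_rts(n=2000, drift=1.0, noise=1.0, threshold=60.0,
                       hand_delay=90.0, dt=1.0):
    """A single noisy accumulator drives both effectors: the eye moves
    when the accumulator crosses threshold, the hand after an additional
    fixed efferent delay (the ~90 ms mean offset reported above).
    Because the noise is shared, eye and hand RT variances are equal."""
    eye_rt = np.empty(n)
    for i in range(n):
        x, t = 0.0, 0.0
        while x < threshold:          # accumulate noisy evidence to bound
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        eye_rt[i] = t
    hand_rt = eye_rt + hand_delay     # common command + constant delay
    return eye_rt, hand_rt

eye_rt, hand_rt = common_command_rts()
```

An interactive or independent architecture, by contrast, would draw separate noise for each effector and generically predict unequal variances.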


2020, Vol 132 (5), pp. 1358-1366
Author(s): Chao-Hung Kuo, Timothy M. Blakely, Jeremiah D. Wander, Devapratim Sarma, Jing Wu, ...

OBJECTIVE The activation of the sensorimotor cortex as measured by electrocorticographic (ECoG) signals has been correlated with contralateral hand movements in humans, as precisely as the level of individual digits. However, the relationship between individual and multiple synergistic finger movements and the neural signal as detected by ECoG has not been fully explored. The authors used intraoperative high-resolution micro-ECoG (µECoG) on the sensorimotor cortex to link neural signals to finger movements across several context-specific motor tasks. METHODS Three neurosurgical patients with cortical lesions over eloquent regions participated. During awake craniotomy, a sensorimotor cortex area of hand movement was localized by high-frequency responses measured by an 8 × 8 µECoG grid with 3-mm interelectrode spacing. Patients performed a flexion movement of the thumb or index finger, or a pinch movement of both, based on a visual cue. High-gamma (HG; 70–230 Hz) filtered µECoG was used to identify dominant electrodes associated with thumb and index movement. Hand movements were recorded by a dataglove simultaneously with µECoG recording. RESULTS In all 3 patients, the electrodes controlling thumb and index finger movements were identifiable approximately 3–6 mm apart by the HG-filtered µECoG signal. For HG power of cortical activation measured with µECoG, the thumb and index signals in the pinch movement were similar to those observed during thumb-only and index-only movement, respectively (all p > 0.05). Index finger movements, measured by the dataglove joint angles, were similar in both the index-only and pinch movements (p > 0.05). However, despite similar activation across the conditions, markedly decreased thumb movement was observed in pinch relative to independent thumb-only movement (all p < 0.05). CONCLUSIONS HG-filtered µECoG signals effectively identify dominant regions associated with thumb and index finger movement.
For pinch, the µECoG signal comprises a combination of the signals from individual thumb and index movements. However, while the relationship between the index finger joint angle and HG-filtered signal remains consistent between conditions, there is not a fixed relationship for thumb movement. Although the HG-filtered µECoG signal is similar in both thumb-only and pinch conditions, the actual thumb movement is markedly smaller in the pinch condition than in the thumb-only condition. This implies a nonlinear relationship between the cortical signal and the motor output for some, but importantly not all, movement types. This analysis provides insight into the tuning of the motor cortex toward specific types of motor behaviors.
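The HG-filtering step can be sketched with standard tools: band-pass the channel into the 70–230 Hz range reported above, then take the squared analytic-signal envelope as instantaneous HG power. The sampling rate and filter order are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def high_gamma_power(sig, fs, band=(70.0, 230.0), order=4):
    """Zero-phase band-pass into the high-gamma range, then squared
    analytic-signal envelope as an instantaneous power estimate."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, sig)           # zero-phase: no latency shift
    return np.abs(hilbert(filtered)) ** 2    # HG power envelope

fs = 1000.0                                  # assumed sampling rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(2)
# toy channel: a 100 Hz burst in the second half on top of weak noise
sig = np.sin(2 * np.pi * 100.0 * t) * (t > 1.0) + 0.1 * rng.standard_normal(t.size)
power = high_gamma_power(sig, fs)
```

In this toy example the HG power envelope rises sharply during the burst, which is how dominant electrodes would stand out during a cued movement.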


1979, Vol 48 (1), pp. 207-214
Author(s): Luis R. Marcos

16 subordinate bilingual subjects produced 5-min. monologues in their nondominant languages, i.e., English or Spanish. Hand-movement activity manifested during the videotaped monologues was scored and related to measures of fluency in the nondominant language. The hand-movement behavior categorized as Groping Movement was significantly related to all of the nondominant-language fluency measures. These correlations support the assumption that Groping Movement may have a function in the process of verbal encoding. The results are discussed in terms of the possibility of monitoring central cognitive processes through the study of “visible” motor behavior.


2018, Vol 119 (1), pp. 221-234
Author(s): Yuhui Li, Yong Wang, He Cui

As a vital skill in an evolving world, interception of moving objects relies on accurate prediction of target motion. In natural circumstances, active gaze shifts often accompany hand movements when exploring targets of interest, but how eye and hand movements are coordinated during manual interception and their dependence on visual prediction remain unclear. Here, we trained gaze-unrestrained monkeys to manually intercept targets appearing at random locations and circularly moving with random speeds. We found that well-trained animals were able to intercept the targets with adequate compensation for both sensory transmission and motor delays. Before interception, the animals' gaze followed the targets with adequate compensation for the sensory delay, but not for extra target displacement during the eye movements. Both hand and eye movements were modulated by target kinematics, and their reaction times were correlated. Moreover, retinal errors and reaching errors were correlated across different stages of reach execution. Our results reveal eye-hand coordination during manual interception, yet the eye and hand movements may show different levels of prediction based on the task context. NEW & NOTEWORTHY Here we studied the eye-hand coordination of monkeys during flexible manual interception of a moving target. Eye movements were untrained and not explicitly associated with reward. We found that the initial saccades toward the moving target adequately compensated for sensory transmission delays, but not for extra target displacement, whereas the reaching arm movements fully compensated for sensorimotor delays, suggesting that the mode of eye-hand coordination strongly depends on behavioral context.
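Delay compensation during interception of a circularly moving target can be sketched as extrapolating the target's current angle by its angular speed times the combined sensorimotor delay; a toy model of full compensation, not the authors' analysis:

```python
import numpy as np

def aim_point(theta, omega, radius, delay):
    """Predicted interception point for a target on a circular path:
    extrapolate the current angle by angular speed times the combined
    sensorimotor delay, giving full delay compensation."""
    theta_pred = theta + omega * delay
    return radius * np.cos(theta_pred), radius * np.sin(theta_pred)

# target at angle 0, moving counterclockwise at pi rad/s, 150 ms total delay
x, y = aim_point(theta=0.0, omega=np.pi, radius=1.0, delay=0.15)
```

In the study's terms, the reaching arm behaved as if it used the full extrapolation above, whereas gaze compensated only for the sensory portion of the delay.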

