Visuomotor Transformation
Recently Published Documents

Total documents: 52 (five years: 1)
H-index: 19 (five years: 0)

IBRO Reports (2019), Vol 6, pp. S194-S195. Author(s): Yusuke Yamamoto, Reona Yamaguchi, Tomohiko Takei, Chao Zenas, Tadashi Isa

Neuron (2018), Vol 100 (6), pp. 1429-1445.e4. Author(s): Thomas O. Helmbrecht, Marco dal Maschio, Joseph C. Donovan, Styliani Koutsouli, Herwig Baier

2017, Vol 7 (1). Author(s): Yusuke Fujiwara, Jongho Lee, Takahiro Ishikawa, Shinji Kakei, Jun Izawa

Neuron (2017), Vol 96 (4), pp. 910-924.e5. Author(s): Juan Pérez-Fernández, Andreas A. Kardamakis, Daichi G. Suzuki, Brita Robertson, Sten Grillner

2017, Vol 114 (24), pp. 6370-6375. Author(s): Naveen Sendhilnathan, Debaleena Basu, Aditya Murthy

The frontal eye field (FEF) is a key brain region for studying visuomotor transformations because its primary input is visual in nature, whereas its output reflects the planning of behaviorally relevant saccadic eye movements. In this study, we used a memory-guided saccade task to temporally dissociate the visual epoch from the saccadic epoch through an intervening delay epoch, and used the local field potential (LFP) along with simultaneously recorded spike data to study the visuomotor transformation process. We showed that the visual response latency of the LFP preceded that of spiking activity in the visual epoch, whereas spiking activity preceded LFP activity in the saccade epoch. We also found a spatially tuned elevation in gamma band activity (30–70 Hz), but not in the corresponding spiking activity, only during the delay epoch; this gamma activity predicted saccade reaction times and the cells' saccade tuning. In contrast, beta band activity (13–30 Hz) showed a nonspatially selective suppression during the saccade epoch. Taken together, these results suggest that motor plans leading to saccades may be generated internally within the FEF from local activity represented by gamma activity.
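The band-limited analysis described above (gamma, 30–70 Hz; beta, 13–30 Hz) can be sketched with a standard band-pass-plus-envelope pipeline. This is an illustrative reconstruction on synthetic data, not the authors' analysis code; the function name and the synthetic LFP are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_power(lfp, fs, low, high, order=4):
    """Band-pass filter an LFP trace and return its instantaneous power envelope."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    filtered = filtfilt(b, a, lfp)          # zero-phase filtering
    return np.abs(hilbert(filtered)) ** 2   # squared Hilbert envelope = power

# Synthetic 1 s "LFP" sampled at 1 kHz: a 40 Hz (gamma-range) component
# plus a weaker 20 Hz (beta-range) component.
fs = 1000
t = np.arange(0, 1, 1 / fs)
lfp = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

gamma = band_power(lfp, fs, 30, 70)  # 30-70 Hz, as in the study
beta = band_power(lfp, fs, 13, 30)   # 13-30 Hz
```

On this synthetic trace the gamma envelope dominates because the 40 Hz component is stronger; in the study, the analogous delay-epoch gamma elevation was compared across spatial conditions and against spiking activity.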


Neuron (2016), Vol 89 (3), pp. 598-612. Author(s): Yuanyuan Yao, Xiaoquan Li, Baibing Zhang, Chen Yin, Yafeng Liu, ...

2015, Vol 113 (5), pp. 1377-1399. Author(s): T. Scott Murdison, Guillaume Leclercq, Philippe Lefèvre, Gunnar Blohm

Smooth pursuit eye movements are driven by retinal motion and enable us to view moving targets with high acuity. Complicating the generation of these movements is the fact that different eye and head rotations can produce different retinal stimuli that nonetheless give rise to identical smooth pursuit trajectories. However, because our eyes accurately pursue targets regardless of eye and head orientation (Blohm G, Lefèvre P. J Neurophysiol 104: 2103–2115, 2010), the brain must somehow take these signals into account. To learn about the neural mechanisms potentially underlying this visual-to-motor transformation, we trained a physiologically inspired neural network model to combine two-dimensional (2D) retinal motion signals with three-dimensional (3D) eye and head orientation and velocity signals to generate a spatially correct 3D pursuit command. We then simulated conditions of 1) head roll-induced ocular counterroll, 2) oblique gaze-induced retinal rotations, 3) eccentric gaze (invoking the half-angle rule), and 4) optokinetic nystagmus to investigate how units in the intermediate layers of the network accounted for different 3D constraints. Simultaneously, we simulated electrophysiological recordings (visual and motor tunings) and microstimulation experiments to quantify the reference frames of signals at each processing stage. We found a gradual retinal-to-intermediate-to-spatial feedforward transformation through the hidden layers. Our model is the first to describe the general 3D transformation for smooth pursuit mediated by eye- and head-dependent gain modulation. Based on several testable experimental predictions, our model provides a mechanism by which the brain could perform the 3D visuomotor transformation for smooth pursuit.
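The core mechanism the abstract names, eye- and head-dependent gain modulation of a feedforward network, can be sketched as follows. This is a minimal illustrative skeleton, not the authors' trained model: the layer sizes, random weights, and function name are all assumptions, and a real model would learn its weights from the pursuit task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes; the published model's architecture is not reproduced here.
n_retinal, n_extraretinal, n_hidden, n_out = 2, 6, 32, 3

W_ret = rng.normal(scale=0.1, size=(n_hidden, n_retinal))       # retinal input weights
W_gain = rng.normal(scale=0.1, size=(n_hidden, n_extraretinal)) # eye/head gain weights
W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))           # output weights

def pursuit_command(retinal_motion_2d, eye_head_state):
    """Map 2D retinal motion to a 3D pursuit command.

    Each hidden unit's response to retinal input is multiplicatively
    scaled by 3D eye/head orientation and velocity signals -- the
    gain-modulation mechanism proposed for the reference-frame transformation.
    """
    drive = np.tanh(W_ret @ retinal_motion_2d)        # retinal drive
    gain = 1.0 + np.tanh(W_gain @ eye_head_state)     # extraretinal gain field
    return W_out @ (drive * gain)                     # 3D pursuit command

retinal = np.array([5.0, -2.0])              # retinal target velocity (deg/s)
eye_head = rng.normal(size=n_extraretinal)   # 3D eye + head orientation/velocity signals
cmd = pursuit_command(retinal, eye_head)     # cmd.shape == (3,)
```

The multiplicative `drive * gain` term is what lets identical retinal inputs produce different motor outputs depending on eye and head posture, the disambiguation problem the abstract describes.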

