Compensating for a shifting world: evolving reference frames of visual and auditory signals across three multimodal brain areas

2019 ◽  
Author(s):  
Valeria C. Caruso ◽  
Daniel S. Pages ◽  
Marc A. Sommer ◽  
Jennifer M. Groh

ABSTRACT
Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected brain areas involved in generating saccades, namely the frontal eye fields (FEF), lateral and medial parietal cortex (M/LIP), and superior colliculus (SC). We recorded from single neurons in head-restrained monkeys performing auditory- and visually guided saccades from variable initial fixation locations, and evaluated whether their receptive fields were better described as eye-centered, head-centered, or hybrid (i.e., not anchored uniquely to head or eye orientation). We found a progression of reference frames across areas and across time, with considerable hybridness and persistent differences between modalities during most epochs and brain regions. For both modalities, the SC was more eye-centered than the FEF, which in turn was more eye-centered than the predominantly hybrid M/LIP. In all three areas and in temporal epochs from stimulus onset to movement, visual signals were more eye-centered than auditory signals. In the SC and FEF, auditory signals became more eye-centered at the time of the saccade than they were initially after stimulus onset, but only in the SC did auditory signals become predominantly eye-centered at the time of the saccade. The results indicate that visual and auditory signals both undergo transformations, ultimately reaching the same final reference frame but via different dynamics across brain regions and time.

New and Noteworthy
Models of visual-auditory integration posit that visual signals are eye-centered throughout the brain, while auditory signals are converted from head-centered to eye-centered coordinates. We show instead that both modalities largely employ hybrid reference frames: neither fully head- nor eye-centered. Across three hubs of the oculomotor network (intraparietal cortex, frontal eye field, and superior colliculus), visual and auditory signals evolve from hybrid to a common eye-centered format via different dynamics across brain areas and time.
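The quantitative comparison of reference frames described above can be illustrated with a toy sketch. This is not the authors' actual analysis; it is a minimal simulation of one common approach, in which a neuron's tuning curves, measured from several fixation positions, are correlated after expressing target location in eye- vs. head-centered coordinates. All parameters (tuning width, fixation positions, noise level) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

targets_head = np.arange(-24, 30, 6)   # target azimuths, head-centered (deg)
fixations = np.array([-12, 0, 12])     # initial fixation positions (deg)

def gaussian_tuning(x, center=0.0, width=12.0):
    """Toy Gaussian tuning curve over location x (deg)."""
    return np.exp(-0.5 * ((x - center) / width) ** 2)

# Simulate an eye-centered neuron: its firing depends on target location
# relative to the eye, i.e. target_head minus current fixation.
responses = np.array([gaussian_tuning(targets_head - f) for f in fixations])
responses += 0.05 * rng.standard_normal(responses.shape)  # measurement noise

def alignment_index(resp, targets, fixations, frame):
    """Mean pairwise correlation of tuning curves across fixations,
    after expressing target location in the chosen reference frame."""
    common = np.arange(-36, 37, 1)  # shared axis for comparing curves
    curves = []
    for r, f in zip(resp, fixations):
        x = targets - f if frame == "eye" else targets
        # Interpolate onto the common axis; NaN outside the sampled range.
        curves.append(np.interp(common, x, r, left=np.nan, right=np.nan))
    curves = np.array(curves)
    cors = []
    for i in range(len(curves)):
        for j in range(i + 1, len(curves)):
            valid = ~np.isnan(curves[i]) & ~np.isnan(curves[j])
            cors.append(np.corrcoef(curves[i][valid], curves[j][valid])[0, 1])
    return float(np.mean(cors))

eye_align = alignment_index(responses, targets_head, fixations, "eye")
head_align = alignment_index(responses, targets_head, fixations, "head")
print(f"eye-centered alignment:  {eye_align:.2f}")
print(f"head-centered alignment: {head_align:.2f}")
# For this simulated eye-centered cell, eye_align should exceed head_align;
# a hybrid cell would show intermediate alignment in both frames.
```

In this framing, a cell whose curves overlay well only in eye coordinates is classed as eye-centered, one that aligns only in head coordinates as head-centered, and one that aligns well (or poorly) in both as hybrid.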



2011 ◽  
Vol 106 (4) ◽  
pp. 1862-1874 ◽  
Author(s):  
Jan Churan ◽  
Daniel Guitton ◽  
Christopher C. Pack

Our perception of the positions of objects in our surroundings is surprisingly unaffected by movements of the eyes, head, and body. This suggests that the brain has a mechanism for maintaining perceptual stability, based either on the spatial relationships among visible objects or internal copies of its own motor commands. Strong evidence for the latter mechanism comes from the remapping of visual receptive fields that occurs around the time of a saccade. Remapping occurs when a single neuron responds to visual stimuli placed presaccadically in the spatial location that will be occupied by its receptive field after the completion of a saccade. Although evidence for remapping has been found in many brain areas, relatively little is known about how it interacts with sensory context. This interaction is important for understanding perceptual stability more generally, as the brain may rely on extraretinal signals or visual signals to different degrees in different contexts. Here, we have studied the interaction between visual stimulation and remapping by recording from single neurons in the superior colliculus of the macaque monkey, using several different visual stimulus conditions. We find that remapping responses are highly sensitive to low-level visual signals, with the overall luminance of the visual background exerting a particularly powerful influence. Specifically, although remapping was fairly common in complete darkness, such responses were usually decreased or abolished in the presence of modest background illumination. Thus the brain might make use of a strategy that emphasizes visual landmarks over extraretinal signals whenever the former are available.


1998 ◽  
Vol 17 (3) ◽  
pp. 157-162 ◽  
Author(s):  
Maxine C Lintern ◽  
Janet R Wetherell ◽  
Margaret E Smith

1. In brain areas of untreated guinea-pigs, the highest activity of acetylcholinesterase was seen in the striatum and cerebellum, followed by the midbrain, medulla-pons and cortex, with the lowest in the hippocampus. The activity in the diaphragm was sevenfold lower than in the hippocampus.
2. At 1 h after soman (27 mg/kg) administration, the activity of the enzyme was dramatically reduced in all tissues studied. In muscle, the three major molecular forms (A12, G4 and G1) showed a similar degree of inhibition and a similar rate of recovery, and activity had returned to normal by 7 days.
3. In the brain, soman inhibited the G4 form more than the G1 form. The hippocampus, cortex and midbrain showed the greatest reductions in enzyme activity. At 7 days the activity in the cortex, medulla-pons and striatum had recovered, but in the hippocampus, midbrain and cerebellum it was still inhibited.
4. Thus the effects of soman administration varied in severity and time course across the tissues studied. However, enzyme activity was still reduced in all tissues at 24 h, when the overt signs of poisoning had disappeared.


2019 ◽  
Vol 33 (1) ◽  
pp. 30-36 ◽  
Author(s):  
Victor Schmidbauer ◽  
Silvia Bonelli

Abstract
Epilepsy is frequently accompanied by severe cognitive side effects. Temporal lobe epilepsy (TLE), and even its successful surgical treatment, may affect cognitive function, in particular language as well as verbal and visual memory. Epilepsy arising from the temporal lobe can be controlled surgically in up to 70% of patients. The goal of epilepsy surgery is to remove the brain areas generating the seizures without causing or aggravating neuropsychological deficits. This requires accurate localization, during presurgical evaluation, of the brain areas generating the seizures (the “epileptogenic zone”) and of the areas responsible for motor and cognitive functions such as language and memory (“essential brain regions”). In the past decades, functional magnetic resonance imaging (fMRI) has been increasingly used to noninvasively lateralize and localize not only primary motor and somatosensory areas, but also brain areas involved in everyday language and memory processes. The imaging modality also shows potential for predicting the effects of temporal lobe resection on language and memory function. Together with other MRI modalities, cognitive fMRI is a promising tool for improving surgical strategies tailored to individual patients with regard to functional outcome, by defining the epileptic cerebral areas that need to be resected and the eloquent areas that need to be spared.
The aim of this review is to provide an overview of recent developments and practical recommendations for the clinical use of cognitive fMRI in TLE.


2019 ◽  
Vol 11 (2) ◽  
pp. 98
Author(s):  
Artur Jaschke

Music activates a wide array of brain areas involved in different functions such as the perception, processing and execution of music. Understanding musical processes in the brain has multiple implications for the neuro- and health sciences. Challenging the brain with a multisensory stimulus such as music activates responses beyond the auditory cortex of the temporal lobe. Other areas involved include the frontal lobes, the parietal lobes, areas of the limbic system such as the amygdala, hippocampus and thalamus, the cerebellum, and the brainstem. Nonetheless, there has been no attempt to summarize all brain areas involved in music into one overall encompassing map. This may well be because no thorough theory has been introduced that would provide an initial point of departure for creating such a map. Therefore, a thorough systematic review was conducted to identify all reported neural connections involved in the perception, processing and execution of music. Communication between the thalamic nuclei is the initial step in multisensory integration, which lies at the base of the neural networks proposed in this paper. Against this background, this manuscript introduces what is, to our knowledge, the first map of all brain regions involved in the perception, processing and execution of music. Consequently, placing thalamic multisensory integration at the core of this atlas allowed us to create a preliminary theory to explain the complexity of music-induced brain activation.


2004 ◽  
Vol 91 (3) ◽  
pp. 1381-1402 ◽  
Author(s):  
Marc A. Sommer ◽  
Robert H. Wurtz

Neuronal processing in cerebral cortex and signal transmission from cortex to brain stem have been studied extensively, but little is known about the numerous feedback pathways that ascend from brain stem to cortex. In this study, we characterized the signals conveyed through an ascending pathway coursing from the superior colliculus (SC) to the frontal eye field (FEF) via mediodorsal thalamus (MD). Using antidromic and orthodromic stimulation, we identified SC source neurons, MD relay neurons, and FEF recipient neurons of the pathway in Macaca mulatta. The monkeys performed oculomotor tasks, including delayed-saccade tasks, that permitted analysis of signals such as visual activity, delay activity, and presaccadic activity. We found that the SC sends all of these signals into the pathway with no output selectivity, i.e., the signals leaving the SC resembled those found generally within the SC. Visual activity arrived in FEF too late to contribute to short-latency visual responses there, and delay activity was largely filtered out in MD. Presaccadic activity, however, seemed critical because it traveled essentially unchanged from SC to FEF. Signal transmission in the pathway was fast (∼2 ms from SC to FEF) and topographically organized (SC neurons drove MD and FEF neurons having similarly eccentric visual and movement fields). Our analysis of identified neurons in one pathway from brain stem to frontal cortex thus demonstrates that multiple signals are sent from SC to FEF with presaccadic activity being prominent. We hypothesize that a major signal conveyed by the pathway is corollary discharge information about the vector of impending saccades.


2020 ◽  
Author(s):  
Xue Luo ◽  
Danrui Cai ◽  
Kejiong Shen ◽  
Qinqin Deng ◽  
Xinlan Lei ◽  
...  

Abstract
The looming stimulus-evoked flight response is an experimental paradigm for studying innate defensive behaviors. However, how the visual looming stimulus is transmitted from the retina to the brain remains poorly understood. Here, we report that superior colliculus (SC)-projecting retinal ganglion cells (RGCs) transmit the looming signal from the retina to the brain to mediate the looming-evoked flight behavior by releasing GABA. In the mouse retina, GABAergic RGCs project to many brain areas, including the SC. SC-projecting GABAergic RGCs (spgRGCs) are monosynaptically connected to the parvalbumin-positive SC neurons known to be required for the looming-evoked flight response. Optogenetic activation of spgRGCs triggers GABA-mediated inhibition in SC neurons. Ablation or silencing of spgRGCs compromises the looming-evoked flight response but not image-forming functions. Therefore, this study shows that spgRGCs control the looming-evoked flight response by regulating SC neurons via GABA, providing novel insight into the regulation of innate defensive behaviors.


2021 ◽  
Vol 9 ◽  
Author(s):  
AnnaCarolina Garza ◽  
Alice Aizza ◽  
Janchira K. Charoenworawat ◽  
Jessica A. Church

Your brain is always adjusting to the changing swirl of activities and interactions you have every day. Every time you accomplish a goal, you are exercising what are called the brain’s executive functions. These skills include resisting impulses, switching between tasks, and updating information in your memory. We asked whether these different skills relied on the same brain areas, and whether young people used the same brain areas as adults. We took pictures of kids’ and teens’ brains to see which areas of the brain they were using while they played three simple games related to these executive functions. We found that youth used similar brain regions to adults while playing the three games, and that many parts of the brain were used across all three games. These results help us understand how kids use their brains to be successful and how these skills develop.


2017 ◽  
Author(s):  
V. C. Caruso ◽  
D. S. Pages ◽  
M. A. Sommer ◽  
J. M. Groh

ABSTRACT
We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. We assessed how this neural computation unfolds across three interconnected structures: the frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior colliculus (SC). Single-unit activity was assessed in head-restrained monkeys performing visually guided saccades from different initial fixations. As previously shown, the receptive fields of most LIP/MIP neurons shifted to novel positions on the retina for each eye position, and these locations were not clearly related to each other in either eye- or head-centered coordinates (hybrid coordinates). In contrast, the receptive fields of most SC neurons were stable in eye-centered coordinates. In FEF, visual signals were intermediate between those patterns: around 60% were eye-centered, whereas the remainder showed changes in receptive field location, boundaries, or responsiveness that rendered the response patterns hybrid or occasionally head-centered. These results suggest that FEF may act as a transitional step in an evolution of coordinates between LIP/MIP and SC. The persistence across cortical areas of hybrid representations that do not provide unequivocal location labels in a consistent reference frame has implications for how these representations must be read out.

New and Noteworthy
How we perceive the world as stable using mobile retinas is poorly understood. We compared the stability of visual receptive fields across different fixation positions in three visuomotor regions. Irregular changes in receptive field position were ubiquitous in intraparietal cortex, evident but less common in the frontal eye fields, and negligible in the superior colliculus (SC), where receptive fields shifted reliably across fixations. Only the SC provides a stable labeled-line code for stimuli across saccades.

