Interplay between Primary Cortical Areas and Crossmodal Plasticity

2021 ◽  
Author(s):  
Christian Xerri ◽  
Yoh’i Zennou-Azogui

Perceptual representations are built through multisensory interactions underpinned by dense anatomical and functional neural networks that interconnect primary and associative cortical areas. There is compelling evidence that primary sensory cortical areas do not operate in isolation but play a role in early processes of multisensory integration. In this chapter, we first review previous and recent literature showing how multimodal interactions between primary cortices may contribute to refining perceptual representations. Second, we discuss findings providing evidence that, following peripheral damage to a sensory system, multimodal integration may promote sensory substitution in deprived cortical areas and favor compensatory plasticity in the spared sensory cortices.

2021 ◽  
Vol 15 ◽  
Author(s):  
Gabrielle Ewall ◽  
Samuel Parkins ◽  
Amy Lin ◽  
Yanis Jaoui ◽  
Hey-Kyoung Lee

Cortical areas are highly interconnected via both cortical and subcortical pathways, and primary sensory cortices are not isolated from this general structure. In primary sensory cortical areas, these pre-existing functional connections serve to provide contextual information for sensory processing and can mediate adaptation when a sensory modality is lost. Cross-modal plasticity in broad terms refers to widespread plasticity across the brain in response to losing a sensory modality, and largely involves two distinct changes: cross-modal recruitment and compensatory plasticity. The former involves recruitment of the deprived sensory area, which includes the deprived primary sensory cortex, for processing the remaining senses. Compensatory plasticity refers to plasticity in the remaining sensory areas, including the spared primary sensory cortices, to enhance the processing of their own sensory inputs. Here, we will summarize potential cellular plasticity mechanisms involved in cross-modal recruitment and compensatory plasticity, and review cortical and subcortical circuits to the primary sensory cortices which can mediate cross-modal plasticity upon loss of vision.


2019 ◽  
Author(s):  
John P McClure ◽  
Pierre-Olivier Polack

Multimodal sensory integration facilitates the generation of a unified and coherent perception of the environment. It is now well established that unimodal sensory perceptions, such as vision, are improved in multisensory contexts. While multimodal integration is primarily performed by dedicated multisensory brain regions such as the association cortices or the superior colliculus, recent studies have shown that multisensory interactions also occur in primary sensory cortices. In particular, sounds were shown to modulate the responses of neurons located in layers 2/3 (L2/3) of the mouse primary visual cortex (V1). Yet, the net effect of sound modulation at the V1 population level remained unclear. Here, we performed two-photon calcium imaging in awake mice to compare the representation of the orientation and the direction of drifting gratings by V1 L2/3 neurons in unimodal (visual only) or multimodal (audiovisual) conditions. We found that sound modulation depended on the tuning properties (orientation and direction selectivity) and response amplitudes of V1 L2/3 neurons. Sounds potentiated the responses of neurons that were highly tuned to the cue orientation and direction but weakly active in the unimodal context, following the principle of inverse effectiveness of multimodal integration. Moreover, sound suppressed the responses of neurons untuned for the orientation and/or the direction of the visual cue. Altogether, sound modulation improved the representation of the orientation and direction of the visual stimulus in V1 L2/3. That is, visual stimuli presented with auditory stimuli recruited a neuronal population better tuned to the visual stimulus orientation and direction than when presented alone.
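The orientation and direction selectivity mentioned above are conventionally quantified from a neuron's direction-tuning curve. As an illustrative sketch (not code from the paper, and the function name and sampling are assumptions), one common definition contrasts the response at the preferred direction with the responses at the opposite and orthogonal directions:

```python
# Illustrative (hypothetical helper, not the authors' analysis code):
# common selectivity indices computed from a direction-tuning curve
# sampled at 8 equally spaced drifting-grating directions.

def selectivity_indices(responses):
    """responses: dict mapping direction (degrees) -> mean response amplitude.
    Returns (OSI, DSI) under one standard definition:
      OSI = (R_pref - R_ortho) / (R_pref + R_ortho)
      DSI = (R_pref - R_null)  / (R_pref + R_null)
    """
    pref = max(responses, key=responses.get)          # preferred direction
    r_pref = responses[pref]
    r_null = responses[(pref + 180) % 360]            # opposite direction
    r_orth = 0.5 * (responses[(pref + 90) % 360]      # mean of the two
                    + responses[(pref + 270) % 360])  # orthogonal directions
    osi = (r_pref - r_orth) / (r_pref + r_orth)
    dsi = (r_pref - r_null) / (r_pref + r_null)
    return osi, dsi

# Example tuning curve (arbitrary units), preferred direction at 90 deg:
curve = {0: 1.0, 45: 2.0, 90: 8.0, 135: 2.5,
         180: 1.2, 225: 0.8, 270: 2.0, 315: 1.0}
osi, dsi = selectivity_indices(curve)
```

A "highly tuned" neuron in the sense used above would have indices near 1, while an "untuned" neuron's indices would be near 0.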


2010 ◽  
Vol 31 (10) ◽  
pp. 1772-1782 ◽  
Author(s):  
Tommi Raij ◽  
Jyrki Ahveninen ◽  
Fa-Hsuan Lin ◽  
Thomas Witzel ◽  
Iiro P. Jääskeläinen ◽  
...  

2012 ◽  
Vol 2012 ◽  
pp. 1-13 ◽  
Author(s):  
M. Alex Meredith ◽  
Brian L. Allman

Numerous investigations of cortical crossmodal plasticity, most often in congenital or early-deaf subjects, have indicated that secondary auditory cortical areas reorganize to exhibit visual responsiveness while the core auditory regions are largely spared. However, a recent study of adult-deafened ferrets demonstrated that core auditory cortex was reorganized by the somatosensory modality. Because adult animals have matured beyond their critical period of sensory development and plasticity, it was not known whether adult-deafening and early-deafening would generate the same crossmodal results. The present study used young, ototoxically-lesioned ferrets (n=3) that, after maturation (avg. = 173 days old), showed significant hearing deficits (avg. threshold = 72 dB SPL). Recordings from single-units (n=132) in core auditory cortex showed that 72% were activated by somatosensory stimulation (compared to 1% in hearing controls). In addition, tracer injection into early hearing-impaired core auditory cortex labeled essentially the same auditory cortical and thalamic projection sources as seen for injections in the hearing controls, indicating that the functional reorganization was not the result of new or latent projections to the cortex. These data, along with similar observations from adult-deafened and adult hearing-impaired animals, support the recently proposed brainstem theory for crossmodal plasticity induced by hearing loss.


2015 ◽  
Vol 28 (5-6) ◽  
pp. 559-579 ◽  
Author(s):  
Elisa Raffaella Ferrè ◽  
Patrick Haggard

No unimodal vestibular cortex has been identified in the human brain. Rather, vestibular inputs are strongly integrated with signals from other sensory modalities, such as vision, touch and proprioception. This convergence could reflect an important mechanism for maintaining a perception of the body, including individual body parts, relative to the rest of the environment. Neuroimaging, electrophysiological and psychophysical studies have shown evidence of multisensory interactions between vestibular and somatosensory signals. However, no convincing overall theoretical framework has been proposed for vestibular–somatosensory interactions, and it remains unclear whether such percepts are by-products of neural convergence or reflect functional multimodal integration. Here we review the current literature on vestibular–multisensory interactions in order to develop a framework for understanding the functions of such multimodal interaction. We propose that the target of vestibular–somatosensory interactions is a form of self-representation.


eLife ◽  
2019 ◽  
Vol 8 ◽  
Author(s):  
Anja Pflug ◽  
Florian Gompf ◽  
Muthuraman Muthuraman ◽  
Sergiu Groppa ◽  
Christian Alexander Kell

Rhythmic actions benefit from synchronization with external events. Auditory-paced finger tapping studies indicate that the two cerebral hemispheres preferentially control different rhythms. It is unclear whether left-lateralized processing of faster rhythms and right-lateralized processing of slower rhythms is based upon hemispheric timing differences that arise in the motor or sensory system, or whether the asymmetry results from lateralized sensorimotor interactions. We measured fMRI and MEG during symmetric finger tapping, in which fast tapping was defined as auditory-motor synchronization at 2.5 Hz. Slow tapping corresponded to tapping to every fourth auditory beat (0.625 Hz). We demonstrate that the left auditory cortex preferentially represents the relatively fast rhythm in an amplitude modulation of low beta oscillations, while the right auditory cortex additionally represents the internally generated slower rhythm. We show that coupling of auditory-motor beta oscillations supports building a metric structure. Our findings reveal a strong contribution of sensory cortices to hemispheric specialization in action control.
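The relationship between the two tapping conditions is simple arithmetic: a 2.5 Hz beat has a 400 ms inter-beat interval, and tapping on every fourth beat quarters the tap rate to 0.625 Hz. A minimal sketch (variable names are illustrative, not from the study) makes the two event streams explicit:

```python
# Illustrative timing of the two conditions described above.
BEAT_HZ = 2.5
IBI = 1.0 / BEAT_HZ                         # inter-beat interval: 0.4 s

# Fast condition: one tap per auditory beat (2.5 Hz).
fast_taps = [n * IBI for n in range(10)]
# Slow condition: one tap every fourth beat -> 4 * 0.4 s = 1.6 s apart.
slow_taps = [n * 4 * IBI for n in range(10)]
slow_rate = 1.0 / (4 * IBI)                 # 0.625 Hz, as in the study
```

The slow taps thus fall on a subset of the fast-tap times, which is what lets the internally generated 0.625 Hz rhythm be nested within the externally paced 2.5 Hz beat.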


Author(s):  
J. Kevin O’Regan

Cortical plasticity is often invoked to explain changes in the quality or location of experience observed in rewired animals, in sensory substitution, in extension of the body through tool use, and in the rubber hand illusion. However, this appeal to cortical plasticity may be misleading, because it suggests that the cortical areas that are plastic are themselves the loci of generation of experience. This would be an error, I claim, since cortical areas do not generate experience. Cortical areas participate in enabling the interaction of an agent with its environment, and the quality of this interaction constitutes the quality of experience. Thus it is not plasticity in itself, but the change in modes of interaction which plasticity allows, which gives rise to the change of experience observed in these studies.


2021 ◽  
pp. 1-16
Author(s):  
Heejung Jung ◽  
Tor D. Wager ◽  
R. McKell Carter

Functions in higher-order brain regions are the source of extensive debate. Although past trends have been to describe the brain—especially posterior cortical areas—in terms of a set of functional modules, a new emerging paradigm focuses on the integration of proximal functions. In this review, we synthesize emerging evidence that a variety of novel functions in the higher-order brain regions are due to convergence: convergence of macroscale gradients brings feature-rich representations into close proximity, presenting an opportunity for novel functions to arise. Using the temporoparietal junction (TPJ) as an example, we demonstrate that convergence is enabled via three properties of the brain: (1) hierarchical organization, (2) abstraction, and (3) equidistance. As gradients travel from primary sensory cortices to higher-order brain regions, information becomes abstracted and hierarchical, and eventually, gradients meet at a point maximally and equally distant from their sensory origins. This convergence, which produces multifaceted combinations, such as mentalizing another person's thought or projecting into a future space, parallels evolutionary and developmental characteristics in such regions, resulting in new cognitive and affective faculties.


2015 ◽  
Author(s):  
Shern Shiou Tan ◽  
Tomas Maul ◽  
Neil Mennie

Loss of vision is a severe impairment to the dominant sensory system. It often has a catastrophic effect upon the sufferer, with knock-on effects on their standard of living, their ability to support themselves, and their caregivers' lives. Research into visual impairments is multi-faceted, focusing on the causes of these debilitating conditions as well as on easing the daily lives of affected individuals. One approach is the use of sensory substitution devices. Our proposed system, Luminophonics, focuses on visual-to-auditory cross-modal information conversion. A visual-to-audio sensory substitution device is a system that obtains a continual stream of visual inputs and converts it into a corresponding auditory soundscape. Ultimately, such a device allows the visually impaired to visualize the surrounding environment simply by listening to the generated soundscape. Even though there is great potential for devices of this kind, public usage is still minimal (Loomis, 2010). To promote adoption by the visually impaired, the overall performance of these devices needs to be improved in terms of soundscape interpretability, information preservation, and listening comfort, amongst other factors. Luminophonics has developed three prototypes, which we have used to explore different ideas pertaining to visual-to-audio sensory substitution. In addition, one of the prototypes has been extended to include depth information using a time-of-flight camera. Previously, an automated measurement method was used to evaluate the performance of the three prototypes (Tan, 2013), covering their effectiveness in terms of interpretability and information preservation. The main purpose of the experiment reported here was to test the prototypes on human subjects in order to gain greater insight into how they perform in real-life situations.
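To make the visual-to-audio conversion concrete, here is a minimal, hypothetical sketch of the kind of mapping such devices use (not the actual Luminophonics algorithm; the function, parameters, and frequency range are assumptions): the image is scanned left to right over the duration of the soundscape, each pixel row is assigned a sine frequency (top rows high-pitched, bottom rows low), and pixel brightness sets that sine's amplitude.

```python
# Hypothetical sketch of a column-scan visual-to-audio mapping
# (illustrative only; not the Luminophonics implementation).
import math

def image_to_soundscape(image, duration=1.0, rate=8000,
                        f_low=200.0, f_high=4000.0):
    """image: 2D list of brightness values in [0, 1] (row 0 = top).
    Returns a list of mono audio samples for one left-to-right scan."""
    rows, cols = len(image), len(image[0])
    # Assign each row a frequency: top -> f_high, bottom -> f_low.
    freqs = [f_high - r * (f_high - f_low) / max(rows - 1, 1)
             for r in range(rows)]
    n_total = int(duration * rate)
    samples = []
    for n in range(n_total):
        t = n / rate
        col = min(n * cols // n_total, cols - 1)   # column under the scan line
        # Sum one sine per row, weighted by that pixel's brightness.
        s = sum(image[r][col] * math.sin(2 * math.pi * freqs[r] * t)
                for r in range(rows))
        samples.append(s / rows)                   # keep samples in [-1, 1]
    return samples
```

For example, `image_to_soundscape([[0.0, 1.0], [1.0, 0.0]])` produces a low tone while the scan line is over the left column and a high tone over the right column, so diagonal structure becomes a rising pitch over time.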

