Early Blindness Results in Developmental Plasticity for Auditory Motion Processing within Auditory and Occipital Cortex

Author(s): Fang Jiang, G. Christopher Stecker, Geoffrey M. Boynton, Ione Fine
2014, Vol 14 (13), pp. 4-4
Author(s): F. Jiang, G. C. Stecker, I. Fine
2007, Vol 45 (3), pp. 523-530
Author(s): A. Brooks, R. van der Zwan, A. Billard, B. Petreska, S. Clarke, et al.
2014, Vol 111 (1), pp. 112-127
Author(s): L. Thaler, J. L. Milne, S. R. Arnott, D. Kish, M. A. Goodale

We have shown in previous research (Thaler L, Arnott SR, Goodale MA. PLoS One 6: e20162, 2011) that motion processing through echolocation activates temporal-occipital cortex in blind echolocation experts. Here we investigated how the neural substrates of echo-motion are related to the neural substrates of auditory source-motion and visual-motion. Three blind echolocation experts and twelve sighted echolocation novices underwent functional MRI scanning while they listened to binaural recordings of moving or stationary echolocation or auditory source sounds located in either left or right space. Sighted participants' brain activity was also measured while they viewed moving or stationary visual stimuli. For each of the three modalities separately (echo, source, vision), we then identified motion-sensitive areas in temporal-occipital cortex and in the planum temporale. We then used a region of interest (ROI) analysis to investigate cross-modal responses, as well as laterality effects. In both sighted novices and blind experts, we found that temporal-occipital source-motion ROIs did not respond to echo-motion, and echo-motion ROIs did not respond to source-motion. This double dissociation was absent in planum temporale ROIs. Furthermore, temporal-occipital echo-motion ROIs in blind, but not sighted, participants showed evidence for a contralateral motion preference. Temporal-occipital source-motion ROIs did not show evidence for a contralateral preference in either blind or sighted participants. Our data suggest a functional segregation of processing of auditory source-motion and echo-motion in human temporal-occipital cortex. Furthermore, the data suggest that the echo-motion response in blind experts may represent a reorganization, rather than an exaggeration, of the response observed in sighted novices. It is possible that this reorganization involves the recruitment of "visual" cortical areas.


2013, Vol 109 (2), pp. 321-331
Author(s): David A. Magezi, Karin A. Buetler, Leila Chouiter, Jean-Marie Annoni, Lucas Spierer

Following prolonged exposure to adaptor sounds moving in a single direction, participants may perceive stationary-probe sounds as moving in the opposite direction [direction-selective auditory motion aftereffect (aMAE)] and be less sensitive to motion of any probe sounds that are actually moving (motion-sensitive aMAE). The neural mechanisms of aMAEs, and notably whether they reflect adaptation of direction-selective motion detectors as found in vision, are presently unknown; identifying them would provide critical insight into auditory motion processing. We measured human behavioral responses and auditory evoked potentials to probe sounds following four types of moving-adaptor sounds: leftward and rightward unidirectional, bidirectional, and stationary. Behavioral data replicated both direction-selective and motion-sensitive aMAEs. Electrical neuroimaging analyses of auditory evoked potentials to stationary probes revealed no significant difference in either global field power (GFP) or scalp topography between leftward and rightward conditions, suggesting that aMAEs are not based on adaptation of direction-selective motion detectors. By contrast, the bidirectional and stationary conditions differed significantly in the stationary-probe GFP at 200 ms poststimulus onset without concomitant topographic modulation, indicative of a difference in response strength between statistically indistinguishable intracranial generators. The magnitude of this GFP difference was positively correlated with the magnitude of the motion-sensitive aMAE, supporting the functional relevance of the neurophysiological measures. Electrical source estimations revealed that the GFP difference followed from a modulation of activity in predominantly right-hemisphere frontal-temporal-parietal brain regions previously implicated in auditory motion processing. Our collective results suggest that auditory motion processing relies on motion-sensitive but, in contrast to vision, non-direction-selective mechanisms.


2014, Vol 40 (3), pp. 265-272
Author(s): L. B. Shestopalova, E. A. Petropavlovskaia, S. Ph. Vaitulevich, N. I. Nikitin

2018
Author(s): Ceren Battal, Mohamed Rezk, Stefania Mattioni, Jyothirmayi Vadlamudi, Olivier Collignon

ABSTRACT: The ability to compute the location and direction of sounds is a crucial perceptual skill for interacting efficiently with dynamic environments. How the human brain implements spatial hearing, however, is poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to left, right, up, and down moving sounds, as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human planum temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were nevertheless significantly distinct. Altogether, our results demonstrate that hPT codes for both auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.
SIGNIFICANCE STATEMENT: In comparison to what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds important new light on how the location and direction of sounds are computed in the human auditory cortex, by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital hMT+/V5 region for computing visual motion.


Author(s): Alexandra A. Ludwig, Rudolf Rübsamen, Gerd J. Dörrscheidt, Sonja A. Kotz

2010, Vol 77 (3), pp. 328-329
Author(s): L. B. Shestopalova, E. A. Petropavlovskaia, S. Ph. Vaitulevich, Y. A. Vasilenko

2019, Vol 116 (20), pp. 10081-10086
Author(s): Elizabeth Huber, Fang Jiang, Ione Fine

Previous studies report that human middle temporal complex (hMT+) is sensitive to auditory motion in early-blind individuals. Here, we show that hMT+ also develops selectivity for auditory frequency after early blindness, and that this selectivity is maintained after sight recovery in adulthood. Frequency selectivity was assessed using both moving band-pass and stationary pure-tone stimuli. As expected, within primary auditory cortex, both moving and stationary stimuli successfully elicited frequency-selective responses, organized in a tonotopic map, for all subjects. In early-blind and sight-recovery subjects, we saw evidence for frequency selectivity within hMT+ for the auditory stimulus that contained motion. We did not find frequency-tuned responses within hMT+ when using the stationary stimulus in either early-blind or sight-recovery subjects. We saw no evidence for auditory frequency selectivity in hMT+ in sighted subjects using either stimulus. Thus, after early blindness, hMT+ can exhibit selectivity for auditory frequency. Remarkably, this auditory frequency tuning persists in two adult sight-recovery subjects, showing that, in these subjects, auditory frequency-tuned responses can coexist with visually driven responses in hMT+.

