Mechanisms of Sound Localization in Mammals

2010 · Vol 90 (3) · pp. 983-1012
Author(s): Benedikt Grothe, Michael Pecka, David McAlpine

The ability to determine the location of a sound source is fundamental to hearing. However, auditory space is not represented in any systematic manner on the basilar membrane of the cochlea, the sensory surface of the receptor organ for hearing. Understanding the means by which sensitivity to spatial cues is computed in central neurons can therefore contribute to our understanding of the basic nature of complex neural representations. We review recent evidence concerning the nature of the neural representation of auditory space in the mammalian brain and elaborate on recent advances in the understanding of mammalian subcortical processing of auditory spatial cues, in particular the brain mechanisms contributing to binaural hearing, that challenge the “textbook” account of sound localization.
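
The binaural cue central to this literature, the interaural time difference (ITD), can be illustrated with a minimal sketch (not from the review itself): recovering the delay between two ear signals from the peak of their cross-correlation, in the spirit of coincidence-detection models.

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference (seconds) between two
    ear signals by locating the peak of their cross-correlation."""
    corr = np.correlate(left, right, mode="full")
    lag = (len(right) - 1) - np.argmax(corr)  # samples by which right lags left
    return lag / fs

# Simulated example: a noise burst reaching the right ear ~0.5 ms later.
fs = 44100
rng = np.random.default_rng(0)
src = rng.standard_normal(4410)
delay = int(0.0005 * fs)  # 22 samples
left = np.concatenate([src, np.zeros(delay)])
right = np.concatenate([np.zeros(delay), src])
print(estimate_itd(left, right, fs))  # ≈ 0.0005 s (right ear lags)
```

Physiological ITDs are tiny (a few hundred microseconds in humans), which is why the neural mechanisms reviewed here are of such interest.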

2008 · Vol 364 (1515) · pp. 331-339
Author(s): Andrew J King

The visual and auditory systems frequently work together to facilitate the identification and localization of objects and events in the external world. Experience plays a critical role in establishing and maintaining congruent visual–auditory associations, so that the different sensory cues associated with targets that can be both seen and heard are synthesized appropriately. For stimulus location, visual information is normally more accurate and reliable and provides a reference for calibrating the perception of auditory space. During development, vision plays a key role in aligning neural representations of space in the brain, as revealed by the dramatic changes produced in auditory responses when visual inputs are altered, and is used throughout life to resolve short-term spatial conflicts between these modalities. However, accurate, and even supra-normal, auditory localization abilities can be achieved in the absence of vision, and the capacity of the mature brain to relearn to localize sound in the presence of substantially altered auditory spatial cues does not require visuomotor feedback. Thus, while vision is normally used to coordinate information across the senses, the neural circuits responsible for spatial hearing can be recalibrated in a vision-independent fashion. Nevertheless, early multisensory experience appears to be crucial for the emergence of an ability to match signals from different sensory modalities and therefore for the outcome of audiovisual-based rehabilitation of deaf patients in whom hearing has been restored by cochlear implantation.


2006 · Vol 95 (2) · pp. 783-790
Author(s): María Lucía Pérez, José Luis Peña

Spatial receptive fields of neurons in the auditory pathway of the barn owl result from sensitivity to combinations of interaural time differences (ITDs) and interaural level differences across stimulus frequency. Both the forebrain and the tectum of the owl contain such neurons. The neural pathways that lead to the forebrain and tectal representations of auditory space separate before the midbrain map of auditory space is synthesized. The first nuclei that belong exclusively to either the forebrain or the tectal pathway are the nucleus ovoidalis (Ov) and the external nucleus of the inferior colliculus (ICx), respectively. Both receive projections from the lateral shell subdivision of the inferior colliculus but are not interconnected. Previous studies indicate that the owl's tectal representation of auditory space differs from those found in the owl's forebrain and in the mammalian brain. We addressed the question of whether the computation of spatial cues is the same in both pathways by comparing the ITD tuning of Ov and ICx neurons. Unlike in ICx, the relationship between frequency and ITD tuning had not been studied in single Ov units. In contrast to the conspicuous frequency-independent ITD tuning of space-specific neurons in ICx, ITD selectivity varied with frequency in Ov. We also observed that the spatially tuned neurons of Ov respond to lower frequencies and are more broadly tuned to ITD than those in ICx. Thus, there are differences in the integration of frequency and ITD in the two sound-localization pathways. Thalamic neurons integrate spatial information not only within a broader frequency band but also across ITD channels.
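
The frequency-dependent ITD tuning reported for Ov is what phase-based coding predicts: a neuron tuned to a fixed interaural phase difference (IPD) has a best ITD that scales inversely with stimulus frequency, whereas a space-specific ICx neuron keeps the same best ITD at all frequencies. A small illustrative calculation (numbers hypothetical):

```python
import numpy as np

def best_itd_from_ipd(best_ipd_cycles, freqs_hz):
    """Best ITD (seconds) of a phase-coding neuron tuned to a fixed
    interaural phase difference, evaluated at each stimulus frequency."""
    return best_ipd_cycles / np.asarray(freqs_hz)

freqs = np.array([2000.0, 4000.0, 8000.0])
# A neuron tuned to an IPD of 0.25 cycles: its best ITD halves with each
# octave, whereas a frequency-independent ICx neuron would keep the same
# best ITD across all three frequencies.
print(best_itd_from_ipd(0.25, freqs) * 1e6)  # microseconds: 125, 62.5, 31.25
```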


2008 · Vol 20 (3) · pp. 603-635
Author(s): Murat Aytekin, Cynthia F. Moss, Jonathan Z. Simon

Sound localization is known to be a complex phenomenon, combining multisensory information processing, experience-dependent plasticity, and movement. Here we present a sensorimotor model that addresses the question of how an organism could learn to localize sound sources without any a priori neural representation of its head-related transfer function or prior experience with auditory spatial information. We demonstrate quantitatively that the experience of the sensory consequences of its voluntary motor actions allows an organism to learn the spatial location of any sound source. Using examples from humans and echolocating bats, our model shows that a naive organism can learn the auditory space based solely on acoustic inputs and their relation to motor states.
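
The general idea can be caricatured in a toy sketch (illustrative only, not the authors' actual model): an agent that knows nothing about its cue-to-angle mapping can still learn to localize, because the cue it hears depends only on the source angle relative to its self-generated head direction.

```python
import numpy as np

# Toy sensorimotor localization sketch. The binaural cue heard at head
# direction m for a source at azimuth s depends only on s - m; the agent
# never knows the cue function, it only records cues at self-generated
# motor states (head directions).
angles = np.linspace(-np.pi, np.pi, 360, endpoint=False)

def cue(rel_angle):
    # Hypothetical ILD-like acoustic cue, unknown to the agent.
    return np.sin(rel_angle) + 0.3 * np.sin(3 * rel_angle)

# Learning phase: with a single reference source (azimuth 0), the agent
# sweeps its head through all motor states and stores the cue profile.
template = cue(0.0 - angles)

def localize(source_azimuth):
    """Estimate a novel source's azimuth as the circular shift of the
    learned template that best matches the observed cue profile."""
    observed = cue(source_azimuth - angles)
    scores = [np.dot(observed, np.roll(template, k))
              for k in range(len(angles))]
    step = 2 * np.pi / len(angles)
    est = int(np.argmax(scores)) * step
    return (est + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)

print(round(np.degrees(localize(np.radians(40))), 1))  # → 40.0
```

No prior head-related transfer function is assumed; only the regularity that self-motion shifts the acoustic input is exploited, which is the paper's central point.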


2020 · pp. 258-296
Author(s): Gualtiero Piccinini

Neural representations are models of the organism and environment built by the nervous system. This chapter provides an account of representational role and content for both indicative and imperative representations. It also argues that, contrary to a mainstream assumption, representations are not merely theoretical posits. Instead, neural representations are observable and are routinely observed and manipulated by experimental neuroscientists in their laboratories. If a type of entity is observable or manipulable, then it exists. Therefore, neural representations are as real as neurons, action potentials, or any other experimentally established entities in our ontology.


2000 · Vol 83 (4) · pp. 2300-2314
Author(s): U. Koch, B. Grothe

To date, most physiological studies that investigated binaural auditory processing have addressed the topic rather exclusively in the context of sound localization. However, there is strong psychophysical evidence that binaural processing serves more than only sound localization. This raises the question of how binaural processing of spatial cues interacts with cues important for feature detection. The temporal structure of a sound is one such feature important for sound recognition. As a first approach, we investigated the influence of binaural cues on temporal processing in the mammalian auditory system. Here, we present evidence that binaural cues, namely interaural intensity differences (IIDs), have profound effects on filter properties for stimulus periodicity of auditory midbrain neurons in the echolocating big brown bat, Eptesicus fuscus. Our data indicate that these effects are partially due to changes in strength and timing of binaural inhibitory inputs. We measured filter characteristics for the periodicity (modulation frequency) of sinusoidally frequency-modulated sounds (SFM) under different binaural conditions. As criteria, we used 50% filter cutoff frequencies of modulation transfer functions based on discharge rate as well as synchronicity of discharge to the sound envelope. The binaural conditions were contralateral stimulation only, equal stimulation at both ears (IID = 0 dB), and stimulation more intense at the ipsilateral ear (IID = −20 or −30 dB). In 32% of neurons, the range of modulation frequencies the neurons responded to changed considerably between monaural and binaural (IID = 0 dB) stimulation. Moreover, in ∼50% of neurons the range of modulation frequencies was narrower when the ipsilateral ear was favored (IID = −20 dB) than with equal stimulation at both ears (IID = 0 dB). In ∼10% of the neurons, synchronization differed across binaural conditions. Blockade of the GABAergic or glycinergic inputs to the recorded cells revealed that inhibitory inputs were at least partially responsible for the observed changes in SFM filtering. In 25% of the neurons, drug application abolished those changes. Experiments using electronically introduced interaural time differences showed that the strength of ipsilaterally evoked inhibition increased with increasing modulation frequency in one third of the cells tested. Thus, glycinergic and GABAergic inhibition is at least one source of the observed interdependence between the temporal structure of a sound and spatial cues.
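
The rate-based 50% filter cutoff used as a criterion here can be sketched directly: find where the discharge rate falls to half its maximum, interpolating between tested modulation frequencies (the MTF values below are hypothetical).

```python
import numpy as np

def mtf_cutoff_50(mod_freqs, rates):
    """Upper 50% cutoff of a rate-based modulation transfer function:
    the modulation frequency (above the peak) at which the discharge
    rate first drops to half its maximum, linearly interpolated."""
    rates = np.asarray(rates, dtype=float)
    half = rates.max() / 2.0
    i_peak = int(np.argmax(rates))
    for i in range(i_peak, len(rates) - 1):
        if rates[i] >= half > rates[i + 1]:
            # Linear interpolation between the bracketing test frequencies.
            frac = (rates[i] - half) / (rates[i] - rates[i + 1])
            return mod_freqs[i] + frac * (mod_freqs[i + 1] - mod_freqs[i])
    return None  # rate never falls below half-maximum in the tested range

# Hypothetical low-pass MTF of a midbrain neuron (spikes/s vs SFM rate, Hz).
fm = [10, 20, 50, 100, 200, 400]
rates = [40, 42, 38, 30, 12, 3]
print(mtf_cutoff_50(fm, rates))  # → 150.0
```

Shifts of this cutoff under different IIDs are the kind of effect the study quantifies.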


Neuron · 2009 · Vol 62 (1) · pp. 123-134
Author(s): Sasha Devore, Antje Ihlefeld, Kenneth Hancock, Barbara Shinn-Cunningham, Bertrand Delgutte

2000 · Vol 83 (5) · pp. 2723-2739
Author(s): Gregg H. Recanzone, Darren C. Guard, Mimi L. Phan, Tien-I K. Su

Lesion studies have indicated that the auditory cortex is crucial for the perception of acoustic space, yet it remains unclear how cortical neurons participate in this perception. To investigate this, we studied the responses of single neurons in the primary auditory cortex (AI) and the caudomedial field (CM) of two monkeys while they performed a sound-localization task. Regression analysis indicated that the responses of ∼80% of neurons in both cortical areas were significantly correlated with the azimuth or elevation of the stimulus, or both; we term such neurons "spatially sensitive." The proportion of spatially sensitive neurons was greater for stimulus azimuth than for stimulus elevation, and elevation sensitivity was largely restricted to neurons tested with stimuli that the monkeys could also localize in elevation. Most neurons responded best to contralateral speaker locations, but we also encountered neurons that responded best to ipsilateral locations, as well as neurons whose greatest responses were restricted to a circumscribed region within the central 60° of frontal space. Comparing the spatially sensitive neurons with those that were not indicated that the two populations could not be distinguished by firing rate, rate/level functions, or topographic location within AI. Direct comparisons between the responses of individual neurons and the behaviorally measured sound-localization ability indicated that proportionally more neurons in CM than in AI had spatial sensitivity consistent with the behavioral performance. Pooling the responses across neurons strengthened the relationship between the neuronal and psychophysical data and indicated that responses pooled across relatively few CM neurons contain enough information to account for sound-localization ability. These data support the hypothesis that auditory space is processed serially from AI to CM in the primate cerebral cortex.
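
The regression criterion for "spatial sensitivity" amounts to asking whether stimulus azimuth (or elevation) predicts a neuron's firing rate. A toy version with simulated data (all numbers hypothetical, a contralateral-preferring neuron whose rate declines with azimuth):

```python
import numpy as np

rng = np.random.default_rng(1)
azimuth = np.repeat(np.arange(-90, 91, 30), 10)  # degrees, 10 trials each
# Hypothetical neuron: rate falls with azimuth (slope -0.1), plus noise.
rates = 20 - 0.1 * azimuth + rng.normal(0, 2, azimuth.size)

# Least-squares fit of rate on azimuth; R^2 quantifies spatial sensitivity.
X = np.column_stack([np.ones_like(azimuth, dtype=float), azimuth])
coef, *_ = np.linalg.lstsq(X, rates, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((rates - pred) ** 2) / np.sum((rates - rates.mean()) ** 2)
print(round(float(coef[1]), 3), round(float(r2), 2))  # slope ≈ -0.1, high R^2
```

In the study, a significant regression on azimuth or elevation (or both) classified a neuron as spatially sensitive; pooling such fits across neurons is what linked the CM population to behavioral performance.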


2020
Author(s): David Badre, Apoorva Bhandari, Haley Keglovits, Atsushi Kikumoto

Cognitive control allows us to think and behave flexibly based on our context and goals. At the heart of theories of cognitive control is a control representation that enables the same input to produce different outputs contingent on contextual factors. In this review, we focus on an important property of the control representation’s neural code: its representational dimensionality. Dimensionality of a neural representation balances a basic separability/generalizability trade-off in neural computation. We will discuss the implications of this trade-off for cognitive control. We will then briefly review current neuroscience findings regarding the dimensionality of control representations in the brain, particularly the prefrontal cortex. We conclude by highlighting open questions and crucial directions for future research.
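
One common way to quantify the dimensionality of a neural representation in this literature (not necessarily the specific measure the authors use) is the participation ratio of the covariance eigenvalue spectrum, PR = (Σλ)² / Σλ². A sketch with simulated population activity:

```python
import numpy as np

def participation_ratio(X):
    """Participation ratio of data matrix X (trials x neurons):
    (sum of covariance eigenvalues)^2 / (sum of squared eigenvalues).
    Ranges from 1 (one dominant dimension) up to the number of neurons."""
    evals = np.linalg.eigvalsh(np.cov(X, rowvar=False))
    return evals.sum() ** 2 / np.sum(evals ** 2)

rng = np.random.default_rng(0)
# Low-dimensional code: 50 neurons driven by 2 latent factors plus noise.
latents = rng.standard_normal((1000, 2))
weights = rng.standard_normal((2, 50))
low_d = latents @ weights + 0.1 * rng.standard_normal((1000, 50))
# High-dimensional code: 50 independent neurons.
high_d = rng.standard_normal((1000, 50))
print(participation_ratio(low_d) < participation_ratio(high_d))  # → True
```

The separability/generalizability trade-off maps onto this measure: higher-dimensional codes separate more input conditions, lower-dimensional codes generalize better across them.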

