Cortical auditory distance representation based on direct-to-reverberant energy ratio

2019 ◽  
Author(s):  
Norbert Kopco ◽  
Keerthi Kumar Doreswamy ◽  
Samantha Huang ◽  
Stephanie Rossi ◽  
Jyrki Ahveninen

Abstract

Auditory distance perception and its neuronal mechanisms are poorly understood, mainly because 1) it is difficult to separate distance processing from intensity processing, 2) multiple intensity-independent distance cues are often available, and 3) the cues are combined in a context-dependent way. A recent fMRI study identified a human auditory cortical area representing intensity-independent distance for sources presented along the interaural axis (Kopco et al., PNAS, 109, 11019-11024). For these sources, two intensity-independent cues are available: the interaural level difference (ILD) and the direct-to-reverberant energy ratio (DRR). Thus, the observed activations may have been driven not only by distance-related neuron populations but also by direction-encoding populations sensitive to ILD. Here, the paradigm from the previous study was used to examine DRR-based distance representation for sounds originating in front of the listener, where ILD is not available. In a virtual environment, we performed behavioral and fMRI experiments, combined with computational analyses, to identify the neural representation of distance based on DRR. The stimuli varied in distance (15-100 cm) while their received intensity was varied randomly and independently of distance. Behavioral performance showed that intensity-independent distance discrimination is accurate for frontal stimuli, even though it is worse than for lateral stimuli. fMRI activations for sounds varying in frontal distance, as compared to sounds varying only in intensity, increased bilaterally in the posterior banks of Heschl's gyri, the planum temporale, and posterior superior temporal gyrus regions.
Taken together, these results suggest that posterior human auditory cortex areas contain neuron populations that are sensitive to distance independent of intensity and of binaural cues relevant for directional hearing.

Highlights
- Posterior auditory cortices (AC) are sensitive to frontally presented distance cues
- These effects are independent of intensity- and direction-related binaural cues
- fMRI activations to frontal distance cues are found in the right and left AC
- The frontal reverberation-related auditory distance cues are behaviorally relevant
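The DRR cue central to these findings can be made concrete with a minimal sketch. The function name, the 2.5 ms direct window, and the peak-split convention below are common illustrative assumptions, not the study's analysis code: the room impulse response is split at the direct-path peak, and the energy before versus after the split defines the ratio.

```python
import numpy as np

def drr_db(rir, fs, direct_window_ms=2.5):
    """Direct-to-reverberant energy ratio (dB) from a room impulse response.

    Energy up to direct_window_ms after the direct-path peak counts as
    'direct'; the remaining tail counts as 'reverberant'. The window
    length is a common convention, not a value from the study above.
    """
    rir = np.asarray(rir, dtype=float)
    peak = int(np.argmax(np.abs(rir)))
    split = peak + int(round(direct_window_ms * 1e-3 * fs))
    e_direct = np.sum(rir[:split] ** 2)
    e_reverb = np.sum(rir[split:] ** 2)
    return 10.0 * np.log10(e_direct / e_reverb)
```

Because the reverberant tail stays roughly constant in a given room while the direct-path energy falls with source distance, DRR decreases as the source moves away, which is what makes it a usable intensity-independent distance cue.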

Mixed Reality ◽  
1999 ◽  
pp. 201-214 ◽  
Author(s):  
Jack M. Loomis ◽  
Roberta L. Klatzky ◽  
Reginald G. Golledge

2021 ◽  
Vol 12 (1) ◽  
pp. 348
Author(s):  
Vincent Martin ◽  
Isabelle Viaud-Delmon ◽  
Olivier Warusfel

Audio-only augmented reality consists of enhancing a real environment with virtual sound events. Seamless integration of the virtual events within the environment requires processing them with artificial spatialization and reverberation effects that simulate the acoustic properties of the room. However, in augmented reality, the visual and acoustic environment of the listener may not be fully known in advance. This study aims to gain insight into the acoustic cues (intensity and reverberation) that listeners use to form an auditory distance judgment, and to observe whether these strategies can be influenced by the listener's environment. To do so, we present a perceptual evaluation of two distance-rendering models informed by a measured Spatial Room Impulse Response. The rendering methods were chosen to create stimulus categories that differ in the availability and reproduction quality of the acoustic cues. The proposed models were evaluated in an online experiment with 108 participants, who were asked to judge the auditory distance of a stationary source. To evaluate the importance of environmental cues, participants had to describe the environment in which they were running the experiment, specifically the volume of the room and the distance to the wall they were facing. These context cues had a limited, but significant, influence on the perceived auditory distance.
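The two acoustic cues the study manipulates, intensity and reverberation, can be sketched with a toy distance renderer. This is an illustrative model under simple assumptions (inverse-distance gain for the direct path, a distance-independent reverberant mix), not the SRIR-informed renderers actually evaluated; all names and parameters are hypothetical.

```python
import numpy as np

def render_distance(dry, fs, distance_m, ref_m=1.0, reverb_ir=None):
    """Toy distance renderer.

    Scales the direct sound by the inverse-distance law (-6 dB per
    doubling of distance) and adds a distance-independent reverberant
    part, so that both intensity and DRR fall as distance grows.
    """
    direct = np.asarray(dry, dtype=float) * (ref_m / distance_m)
    if reverb_ir is None:
        return direct
    # Reverberant part: convolve the dry signal with a (toy) room tail.
    wet = np.convolve(dry, reverb_ir)[: len(dry)]
    return direct + wet
```

Keeping the wet part fixed while only the direct gain changes is what lets such a model decouple the intensity cue from the reverberation cue, the distinction at the heart of the perceptual evaluation above.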


Acta Acustica ◽  
2021 ◽  
Vol 5 ◽  
pp. 10
Author(s):  
Johannes M. Arend ◽  
Heinrich R. Liesefeld ◽  
Christoph Pörschmann

Nearby sound sources provide distinct binaural cues, mainly in the form of interaural level differences, which vary with respect to distance and azimuth. However, there is a long-standing controversy regarding whether humans can actually utilize binaural cues for distance estimation of nearby sources. Therefore, we conducted three experiments using non-individual binaural synthesis. In Experiment 1, subjects had to estimate the relative distance of loudness-normalized and non-normalized nearby sources in static and dynamic binaural rendering in a multi-stimulus comparison task under anechoic conditions. Loudness normalization was used as a plausible method to compensate for noticeable intensity differences between stimuli. With the employed loudness normalization, nominal distance did not significantly affect distance ratings for most conditions, despite the presence of non-individual binaural distance cues. In Experiment 2, subjects had to judge the relative distance between loudness-normalized sources in dynamic binaural rendering in a forced-choice task. Below-chance performance in this more sensitive task revealed that the employed loudness normalization strongly affected distance estimation. As this finding indicated a general issue with loudness normalization for studies on relative distance estimation, Experiment 3 directly tested the validity of loudness normalization and a frequently used amplitude normalization. Results showed that both normalization methods leave residual (incorrect) intensity cues, which subjects most likely used for relative distance estimation. The experiments revealed that both examined normalization methods have consequential drawbacks, which might in part explain conflicting findings in the literature regarding the effectiveness of binaural cues for relative distance estimation.
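The normalization pitfall described above can be illustrated with a common amplitude (RMS) normalization; the study's exact procedures are not reproduced here, and the function name and target level are assumptions. The key point is that equalizing broadband RMS does not equalize perceived loudness: two signals with the same RMS but different spectra (e.g. after distance-dependent near-field filtering) can still differ in loudness, leaving a residual intensity-like cue.

```python
import numpy as np

def rms_normalize(x, target_rms=0.1):
    """Amplitude (RMS) normalization, one frequently used method.

    Scales the signal so its broadband RMS matches target_rms.
    Note this only equalizes energy, not loudness: spectral
    differences between stimuli can still carry intensity-like cues.
    """
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(np.square(x)))
    return x * (target_rms / rms)
```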


2009 ◽  
Vol 101 (4) ◽  
pp. 1781-1799 ◽  
Author(s):  
Brian H. Scott ◽  
Brian J. Malone ◽  
Malcolm N. Semple

Neurons in auditory cortex of awake primates are selective for the spatial location of a sound source, yet the neural representation of the binaural cues that underlie this tuning remains undefined. We examined this representation in 283 single neurons across the low-frequency auditory core in alert macaques, trained to discriminate binaural cues for sound azimuth. In response to binaural beat stimuli, which mimic acoustic motion by modulating the relative phase of a tone at the two ears, these neurons robustly modulate their discharge rate to follow this directional cue. In accordance with prior studies, the preferred interaural phase difference (IPD) of these neurons typically corresponds to azimuthal locations contralateral to the recorded hemisphere. Whereas binaural beats evoke only transient discharges in anesthetized cortex, neurons in awake cortex respond throughout the IPD cycle. In this regard, responses are consistent with observations at earlier stations of the auditory pathway. Discharge rate is a band-pass function of the frequency of IPD modulation in most neurons (73%), but both discharge rate and temporal synchrony are independent of the direction of phase modulation. When subjected to a receiver operating characteristic (ROC) analysis, the responses of individual neurons are insufficient to account for the perceptual acuity of these macaques in an IPD discrimination task, suggesting the need for neural pooling at the cortical level.
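The binaural beat stimulus described above has a simple construction: a pure tone of frequency f in one ear and f + Δf in the other, so the interaural phase difference sweeps through a full cycle Δf times per second, mimicking smooth azimuthal motion. The sketch below illustrates this; the parameter values are illustrative assumptions, not those used in the recordings.

```python
import numpy as np

def binaural_beat(f_hz=500.0, beat_hz=1.0, dur_s=1.0, fs=48000):
    """Binaural beat stimulus as a (2, n_samples) stereo array.

    Left ear gets a tone at f_hz, right ear at f_hz + beat_hz, so the
    instantaneous interaural phase difference is 2*pi*beat_hz*t and
    cycles once per 1/beat_hz seconds.
    """
    t = np.arange(int(dur_s * fs)) / fs
    left = np.sin(2 * np.pi * f_hz * t)
    right = np.sin(2 * np.pi * (f_hz + beat_hz) * t)
    return np.stack([left, right])
```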


2006 ◽  
Vol 18 (11) ◽  
pp. 1877-1888 ◽  
Author(s):  
Michael A. Kraut ◽  
Jeffery A. Pitcock ◽  
Vince Calhoun ◽  
Juan Li ◽  
Thomas Freeman ◽  
...  

The neural interface between sensory perception and memory is a central issue in neuroscience, particularly initial memory organization following perceptual analyses. We used functional magnetic resonance imaging to identify anatomic regions extracting initial auditory semantic memory information related to environmental sounds. Two distinct anatomic foci were detected in the right superior temporal gyrus when subjects identified sounds representing either animals or threatening items. Threatening animal stimuli elicited signal changes in both foci, suggesting a distributed neural representation. Our results demonstrate both category- and feature-specific responses to nonverbal sounds in early stages of extracting semantic memory information from these sounds. This organization allows these category-feature detection nodes to extract early semantic memory information for efficient processing of transient sound stimuli. Neural regions selective for threatening sounds are similar to those of nonhuman primates, demonstrating that semantic memory organization for basic biological/survival primitives is present across species.


Perception ◽  
10.1068/p7153 ◽  
2012 ◽  
Vol 41 (2) ◽  
pp. 175-192 ◽  
Author(s):  
Esteban R Calcagno ◽  
Ezequiel L Abregú ◽  
Manuel C Eguía ◽  
Ramiro Vergara

In humans, multisensory interaction is an important strategy for improving the detection of stimuli of different nature and reducing the variability of response. It is known that the presence of visual information affects auditory perception in the horizontal plane (azimuth), but few studies have examined the influence of vision on auditory distance perception. In general, the data obtained from these studies are contradictory and do not completely define how visual cues affect the apparent distance of a sound source. Here, psychophysical experiments on auditory distance perception in humans were performed, including and excluding visual cues. The results show that the apparent distance of the source is affected by the presence of visual information and that subjects can store in memory a representation of the environment that later improves distance perception.

