Integration of locomotion and auditory signals in the mouse inferior colliculus

eLife ◽  
2020 ◽  
Vol 9 ◽  
Author(s):  
Yoonsun Yang ◽  
Joonyeol Lee ◽  
Gunsoo Kim

The inferior colliculus (IC) is the major midbrain auditory integration center, where virtually all ascending auditory inputs converge. Although the IC has been extensively studied for sound processing, little is known about the neural activity of the IC in moving subjects, even though movement frequently accompanies natural hearing conditions. Here, by recording neural activity in walking mice, we show that the activity of IC neurons is strongly modulated by locomotion, even in the absence of sound stimuli. Similar modulation was also found in hearing-impaired mice, demonstrating that IC neurons receive non-auditory, locomotion-related neural signals. Sound-evoked activity was attenuated during locomotion, and this attenuation increased frequency selectivity across the neuronal population, while maintaining preferred frequencies. Our results suggest that during behavior, integrating movement-related and auditory information is an essential aspect of sound processing in the IC.



Author(s):  
Laura Hurley

The inferior colliculus (IC) receives prominent projections from centralized neuromodulatory systems. These systems include extra-auditory clusters of cholinergic, dopaminergic, noradrenergic, and serotonergic neurons. Although these modulatory sites are not explicitly part of the auditory system, they receive projections from primary auditory regions and are responsive to acoustic stimuli. This bidirectional influence suggests the existence of auditory-modulatory feedback loops. A characteristic of neuromodulatory centers is that they integrate inputs from anatomically widespread and functionally diverse sets of brain regions. This connectivity gives neuromodulatory systems the potential to import information into the auditory system on situational variables that accompany acoustic stimuli, such as context, internal state, or experience. Once released, neuromodulators functionally reconfigure auditory circuitry through a variety of receptors expressed by auditory neurons. In addition to shaping ascending auditory information, neuromodulation within the IC influences behaviors that arise subcortically, such as prepulse inhibition of the startle response. Neuromodulatory systems therefore provide a route for integrative behavioral information to access auditory processing from its earliest levels.


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Soren Wainio-Theberge ◽  
Annemarie Wolff ◽  
Georg Northoff

Abstract Spontaneous neural activity fluctuations have been shown to influence trial-by-trial variation in perceptual, cognitive, and behavioral outcomes. However, the complex electrophysiological mechanisms by which these fluctuations shape stimulus-evoked neural activity remain largely to be explored. Employing a large-scale magnetoencephalographic dataset and an electroencephalographic replication dataset, we investigate the relationship between spontaneous and evoked neural activity across a range of electrophysiological variables. We observe that for high-frequency activity, high pre-stimulus amplitudes lead to greater evoked desynchronization, while for low frequencies, high pre-stimulus amplitudes induce larger degrees of event-related synchronization. We further decompose electrophysiological power into oscillatory and scale-free components, demonstrating different patterns of spontaneous-evoked correlation for each component. Finally, we find correlations between spontaneous and evoked time-domain electrophysiological signals. Overall, we demonstrate that the dynamics of multiple electrophysiological variables exhibit distinct relationships between their spontaneous and evoked activity, a result which carries implications for experimental design and analysis in non-invasive electrophysiology.
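
The oscillatory versus scale-free decomposition can be illustrated with a common spectral-parameterization idea: fit a 1/f-like line to log power versus log frequency and treat the residual as the oscillatory part. The following is a minimal numpy sketch of that idea, not the authors' exact pipeline; the synthetic spectrum and all parameter values are purely illustrative:

```python
import numpy as np

def split_scale_free(freqs, power):
    """Fit a 1/f-like (log-log linear) background to a power spectrum;
    the residual is taken as the oscillatory component. A simple sketch of
    spectral parameterization, not the authors' exact method."""
    lf, lp = np.log10(freqs), np.log10(power)
    slope, intercept = np.polyfit(lf, lp, 1)
    scale_free = 10 ** (intercept + slope * lf)
    oscillatory = power - scale_free
    return scale_free, oscillatory

# Synthetic spectrum: 1/f background plus an alpha-band (10 Hz) bump.
freqs = np.linspace(2, 40, 200)
power = 10.0 / freqs + 0.5 * np.exp(-((freqs - 10) ** 2) / 2)
background, osc = split_scale_free(freqs, power)
# The oscillatory residual peaks near the 10 Hz bump.
```

Separating the two components matters here because, as the abstract notes, the oscillatory and scale-free parts show different spontaneous-evoked correlation patterns.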


2021 ◽  
Author(s):  
Kaosu Matsumori ◽  
Kazuki Iijima ◽  
Yukihito Yomogida ◽  
Kenji Matsumoto

Aggregating welfare across individuals to reach collective decisions is one of the most fundamental problems in our society. Interpersonal comparison of utility is pivotal and inevitable for welfare aggregation, because if each person's utility is not interpersonally comparable, there is no rational aggregation procedure that simultaneously satisfies even some very mild conditions for validity (Arrow's impossibility theorem). However, scientific methods for interpersonal comparison of utility have thus far not been available. Here, we have developed a method for interpersonal comparison of utility based on brain signals, by measuring the neural activity of participants performing gambling tasks. We found that activity in the medial frontal region was correlated with changes in expected utility, and that, for the same amount of money, the activity evoked was larger for participants with lower household incomes than for those with higher household incomes. Furthermore, we found that the ratio of neural signals from lower-income participants to those of higher-income participants coincided with estimates of their psychological pleasure by "impartial spectators", i.e. disinterested third-party participants satisfying specific conditions. Finally, we derived a decision rule based on aggregated welfare from our experimental data, and confirmed that it was applicable to a distribution problem. These findings suggest that our proposed method for interpersonal comparison of utility enables scientifically reasonable welfare aggregation by escaping from Arrow's impossibility and has implications for the fair distribution of economic goods. Our method can be further applied for evidence-based policy making in nations that use cost-benefit analyses or optimal taxation theory for policy evaluation.
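
As a toy illustration of the aggregation logic: if the neural measurements indicate that the same monetary gain evokes twice the response in a lower-income participant, a utilitarian aggregate weights that person's gains accordingly. The sketch below is hypothetical and linear in money, a simplification of the expected-utility framework in the study; all names and numbers are invented for illustration:

```python
def aggregate_welfare(allocations, utility_weights):
    """Utilitarian aggregate: each person's monetary gain scaled by an
    interpersonally comparable utility weight (hypothetically derived from
    the ratio of neural responses to the same amount of money)."""
    return sum(money * weight for money, weight in zip(allocations, utility_weights))

# Invented example: the same gain evokes twice the neural response in the
# lower-income participant, giving weights 2.0 versus 1.0.
weights = [2.0, 1.0]
plan_equal = [50, 50]        # split a fixed sum evenly
plan_targeted = [80, 20]     # allocate more to the lower-income person
welfare_equal = aggregate_welfare(plan_equal, weights)        # 150.0
welfare_targeted = aggregate_welfare(plan_targeted, weights)  # 180.0
```

Under these invented weights the targeted plan yields the higher aggregate, which mirrors the paper's point: once utilities are interpersonally comparable, such distribution problems have a well-defined answer.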


2021 ◽  
Author(s):  
Alice Gomez ◽  
Guillaume Lio ◽  
Manuela Costa ◽  
Angela Sirigu ◽  
Caroline Demily

Abstract Williams syndrome (WS) and Autism Spectrum Disorders (ASD) are psychiatric conditions associated with atypical but opposite face-to-face interaction patterns: WS patients stare excessively at others, while ASD individuals avoid eye contact. Whether these behaviors result from dissociable visual processes within the occipito-temporal pathways is unknown. Using high-density electroencephalography, multivariate pattern classification, and group blind source separation, we searched for face-related neural signals that could best discriminate WS (N = 14), ASD (N = 14), and neurotypical (N = 14) populations. We found two peaks in neurotypical participants: the first at 170 ms, an early signal known to be implicated in low-level face features; the second at 260 ms, a late component implicated in decoding salient social cues from faces. The late 260 ms signal varied as a function of the distance of the eyes in the face stimulus from the viewer's fovea: it was strongest when the eyes were projected on the fovea and weakest when they were projected in the retinal periphery. Remarkably, each component was distinctly impaired or preserved in WS and ASD. In WS, the 170 ms signal was only weakly decodable, probably reflecting these patients' relatively poor processing of face morphology, while the late eye-sensitive 260 ms component was highly significant. The reverse pattern was observed in ASD participants, who showed neurotypical-like early 170 ms evoked activity but an impaired late 260 ms evoked signal. Our study reveals a dissociation between WS and ASD patients and points to different neural origins for their social impairments.
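
Time-resolved multivariate decoding of the kind used here can be sketched with a toy nearest-centroid classifier applied independently at each time point. Real MVPA pipelines use cross-validated linear classifiers, so this is only a schematic stand-in; the array sizes and synthetic data are invented for illustration:

```python
import numpy as np

def timepoint_decoding_accuracy(train_a, train_b, test_a, test_b):
    """Nearest-centroid decoding at each time point (a toy stand-in for
    multivariate pattern classification). Arrays: (trials, channels, times)."""
    n_times = train_a.shape[2]
    acc = np.zeros(n_times)
    for t in range(n_times):
        ca = train_a[:, :, t].mean(axis=0)  # class-A centroid at this time point
        cb = train_b[:, :, t].mean(axis=0)  # class-B centroid
        hits, total = 0, 0
        for trial in test_a[:, :, t]:
            hits += np.linalg.norm(trial - ca) < np.linalg.norm(trial - cb)
            total += 1
        for trial in test_b[:, :, t]:
            hits += np.linalg.norm(trial - cb) < np.linalg.norm(trial - ca)
            total += 1
        acc[t] = hits / total
    return acc

# Synthetic illustration: two conditions separable only at time index 3.
rng = np.random.default_rng(0)
shape = (20, 8, 5)  # trials, channels, time points
train_a, test_a = rng.normal(size=shape), rng.normal(size=shape)
train_b, test_b = rng.normal(size=shape), rng.normal(size=shape)
for arr in (train_b, test_b):
    arr[:, :, 3] += 3.0  # condition difference appears only at t = 3
acc = timepoint_decoding_accuracy(train_a, train_b, test_a, test_b)
```

Decoding accuracy peaking at particular latencies (here, the synthetic t = 3) is what identifies components such as the 170 ms and 260 ms signals in the study.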


Author(s):  
Cheng Lyu ◽  
L.F. Abbott ◽  
Gaby Maimon

Abstract Many behavioral tasks require the manipulation of mathematical vectors, but, outside of computational models [1–8], it is not known how brains perform vector operations. Here we show how the Drosophila central complex, a region implicated in goal-directed navigation [8–14], performs vector arithmetic. First, we describe neural signals in the fan-shaped body that explicitly track a fly's allocentric traveling direction, that is, the traveling direction in reference to external cues. Past work has identified neurons in Drosophila [12,15–17] and mammals [18,19] that track allocentric heading (e.g., head-direction cells), but these new signals illuminate how the sense of space is properly updated when traveling and heading angles differ. We then characterize a neuronal circuit that rotates, scales, and adds four vectors related to the fly's egocentric traveling direction (the traveling angle referenced to the body axis) to compute the allocentric traveling direction. Each two-dimensional vector is explicitly represented by a sinusoidal activity pattern across a distinct neuronal population, with the sinusoid's amplitude representing the vector's length and its phase representing the vector's angle. The principles of this circuit, which performs an egocentric-to-allocentric coordinate transformation, may generalize to other brains and to domains beyond navigation where vector operations or reference-frame transformations are required.
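
The sinusoidal population code described above is phasor arithmetic: a vector of length r and angle θ appears as activity r·cos(φ − θ) across neurons with preferred phases φ, and summing two such activity patterns sums the encoded vectors. A minimal numpy sketch (the population size and function names are illustrative, not from the paper):

```python
import numpy as np

N = 16  # illustrative population size; each neuron has a preferred phase
phases = 2 * np.pi * np.arange(N) / N

def encode(length, angle):
    """Sinusoidal population code: amplitude encodes vector length, peak its angle."""
    return length * np.cos(phases - angle)

def decode(activity):
    """Recover (length, angle) from the population sinusoid via its first Fourier mode."""
    z = (activity @ np.exp(1j * phases)) * 2 / N
    return np.abs(z), np.angle(z)

# Summing two population patterns adds the encoded vectors (phasor addition):
a = encode(1.0, 0.0)          # unit vector at angle 0
b = encode(1.0, np.pi / 2)    # unit vector at angle 90 degrees
length, angle = decode(a + b)
# length ~ sqrt(2), angle ~ 45 degrees: the vector sum
```

Rotating a vector shifts the sinusoid's phase and scaling it changes the amplitude, which is why a circuit operating on such activity patterns can implement rotation, scaling, and addition.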


1980 ◽  
Vol 23 (3) ◽  
pp. 603-613 ◽  
Author(s):  
Robert H. Margolis ◽  
Seth M. Goldberg

Auditory frequency selectivity was inferred from measurements of the detectability of tonal signals as a function of the cutoff frequency of a low-pass computer-generated noise masker. In Experiment I the effect of small changes in signal-to-noise ratio on inferred auditory frequency selectivity was studied. In Experiment II, frequency selectivity was determined for five normal-hearing subjects and four subjects with sensorineural hearing loss due to presbycusis. Critical ratios (signal-to-noise ratio at masked threshold) also were determined in Experiment II. The results of Experiment I indicate that the low-pass masking experiment provides a stable estimate of the width, but not the position, of the critical masking band. Experiment II revealed elevated critical ratios for three of the four presbycusic subjects. Some hearing-impaired subjects appeared to have normal frequency selectivity despite elevated critical ratios. Other presbycusic subjects demonstrated impaired auditory frequency selectivity. The results suggest that critical ratio and critical masking band data are free to vary independently in hearing-impaired subjects.
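
For reference, the critical ratio has a simple operational definition: the tone level at masked threshold minus the spectrum level of the masking noise, in dB. Under Fletcher's classical equal-power assumption it also yields a bandwidth estimate. A small sketch with illustrative levels (not data from this study):

```python
def critical_ratio_db(signal_level_db, noise_spectrum_level_db):
    """Critical ratio: tone level at masked threshold minus the masker's
    spectrum level (dB re 1 Hz bandwidth)."""
    return signal_level_db - noise_spectrum_level_db

def equivalent_band_hz(cr_db):
    """Fletcher's equal-power estimate: the noise bandwidth whose total power
    equals the tone power at masked threshold."""
    return 10 ** (cr_db / 10)

# Illustrative values: a tone masked at 64 dB SPL in noise with a 40 dB/Hz spectrum level.
cr = critical_ratio_db(64.0, 40.0)   # 24 dB
band = equivalent_band_hz(cr)        # about 251 Hz
```

The study's finding that critical ratios and critical masking bands vary independently in hearing-impaired listeners means the equal-power shortcut above cannot be relied on clinically: an elevated critical ratio need not imply a widened masking band.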


eLife ◽  
2015 ◽  
Vol 4 ◽  
Author(s):  
Sarah G Leinwand ◽  
Claire J Yang ◽  
Daphne Bazopoulou ◽  
Nikos Chronis ◽  
Jagan Srinivasan ◽  
...  

Chemosensory neurons extract information about chemical cues from the environment. How is the activity in these sensory neurons transformed into behavior? Using Caenorhabditis elegans, we map a novel sensory neuron circuit motif that encodes odor concentration. Primary neurons, AWCON and AWA, directly detect the food odor benzaldehyde (BZ) and release insulin-like peptides and acetylcholine, respectively, which are required for odor-evoked responses in secondary neurons, ASEL and AWB. Consistently, both primary and secondary neurons are required for BZ attraction. Unexpectedly, this combinatorial code is altered in aged animals: odor-evoked activity in secondary, but not primary, olfactory neurons is reduced. Moreover, experimental manipulations increasing neurotransmission from primary neurons rescue aging-associated neuronal deficits. Finally, we correlate the odor responsiveness of aged animals with their lifespan. Together, these results show how odors are encoded by primary and secondary neurons and suggest reduced neurotransmission as a novel mechanism driving aging-associated sensory neural activity and behavioral declines.


2013 ◽  
Vol 24 (04) ◽  
pp. 258-273 ◽  
Author(s):  
Ken W. Grant ◽  
Therese C. Walden

Background: Traditional audiometric measures, such as pure-tone thresholds or unaided word recognition in quiet, appear to be of marginal use in predicting speech understanding by hearing-impaired (HI) individuals in background noise with or without amplification. Suprathreshold measures of auditory function (tolerance of noise, temporal and frequency resolution) appear to contribute more to success with amplification and may describe more effectively the distortion component of hearing. However, these measures are not typically measured clinically. When combined with measures of audibility, suprathreshold measures of auditory distortion may provide a much more complete understanding of speech deficits in noise by HI individuals.
Purpose: The primary goal of this study was to investigate the relationship among measures of speech recognition in noise, frequency selectivity, temporal acuity, modulation masking release, and informational masking in adult and elderly patients with sensorineural hearing loss to determine whether peripheral distortion for suprathreshold sounds contributes to the varied outcomes experienced by patients with sensorineural hearing loss listening to speech in noise.
Research Design: A correlational study.
Study Sample: Twenty-seven patients with sensorineural hearing loss and four adults with normal hearing were enrolled in the study.
Data Collection and Analysis: The data were collected in a sound-attenuated test booth. For speech testing, subjects' verbal responses were scored by the experimenter and entered into a custom computer program. For frequency selectivity and temporal acuity measures, subject responses were recorded via a touch screen. Simple correlations, step-wise multiple linear regression analyses, and a repeated-measures analysis of variance were performed.
Results: Results showed that the signal-to-noise ratio (SNR) loss could only be partially predicted by a listener's thresholds or audibility measures such as the Speech Intelligibility Index (SII). Correlations between SII and SNR loss were higher using the Hearing-in-Noise Test (HINT) than the Quick Speech-in-Noise test (QSIN), with the SII accounting for 71% of the variance in SNR loss for the HINT but only 49% for the QSIN. However, listener age and the addition of suprathreshold measures improved the prediction of SNR loss using the QSIN, accounting for nearly 71% of the variance.
Conclusions: Two standard clinical speech-in-noise tests, QSIN and HINT, were used in this study to obtain a measure of SNR loss. When administered clinically, the QSIN appears to be less redundant with hearing thresholds than the HINT and is a better indicator of a patient's suprathreshold deficit and its impact on understanding speech in noise. Additional factors related to aging, spectral resolution, and, to a lesser extent, temporal resolution improved the ability to predict SNR loss measured with the QSIN. For the HINT, a listener's audibility and age were the only two significant factors. For both the QSIN and HINT, roughly 25–30% of the variance in individual differences in SNR loss (i.e., the dB difference in SNR between an individual HI listener and a control group of NH listeners at a specified performance level, usually 50% word or sentence recognition) remained unexplained, suggesting the need for additional measures of suprathreshold acuity (e.g., sensitivity to temporal fine structure) or cognitive function (e.g., memory and attention) to further improve the ability to understand individual variability in SNR loss.
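
Variance-accounted-for figures like those above come from the R² of least-squares regression, with predictors added stepwise. Below is a minimal numpy sketch of computing R² for nested predictor sets; the simulated SII, age, and SNR-loss values are synthetic stand-ins, not the study's data:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - resid.var() / y.var()

# Synthetic stand-in: SNR loss driven by audibility (SII) plus age, plus noise.
rng = np.random.default_rng(1)
sii = rng.uniform(0.3, 1.0, 27)       # 27 listeners, as in the study sample
age = rng.uniform(40, 85, 27)
snr_loss = 12 * (1 - sii) + 0.05 * (age - 60) + rng.normal(0, 0.5, 27)
r2_sii = r_squared(sii[:, None], snr_loss)                    # audibility alone
r2_both = r_squared(np.column_stack([sii, age]), snr_loss)    # audibility + age
```

Adding a predictor can never lower the in-sample R², which is why stepwise procedures also test whether each added term is statistically significant rather than relying on R² alone.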

