Age-related dissociation of sensory and decision-based auditory motion processing

Author(s):  
Alexandra A. Ludwig ◽  
Rudolf Rübsamen ◽  
Gerd J. Dörrscheidt ◽  
Sonja A. Kotz
2007 ◽  
Vol 45 (3) ◽  
pp. 523-530 ◽  
Author(s):  
A. Brooks ◽  
R. van der Zwan ◽  
A. Billard ◽  
B. Petreska ◽  
S. Clarke ◽  
...  

2020 ◽  
Vol 14 ◽  
Author(s):  
Yuko Sugita ◽  
Haruka Yamamoto ◽  
Yamato Maeda ◽  
Takahisa Furukawa

The decline in visual function due to normal aging impacts various aspects of our daily lives. Previous reports suggest that the aging retina exhibits mislocalization of photoreceptor terminals and reduced amplitudes of scotopic and photopic electroretinogram (ERG) responses in mice. These abnormalities are thought to contribute to age-related visual impairment; however, the extent to which visual function is impaired by aging at the organismal level is unclear. In the present study, we focused on age-related changes in the optokinetic responses (OKRs) in visual processing. Specifically, we investigated the initial and late phases of OKRs in young adult (2–3 months old) and aging mice (21–24 months old). The initial phase was evaluated by measuring the open-loop eye velocity of OKRs elicited by sinusoidal grating patterns of various spatial frequencies (SFs) moving at various temporal frequencies (TFs) for 0.5 s. The aging mice exhibited initial OKRs with a spatiotemporal frequency tuning slightly different from that of young adult mice. The late-phase OKRs were investigated by measuring the slow-phase velocity of the optokinetic nystagmus evoked by sinusoidal gratings of various spatiotemporal frequencies moving for 30 s. We found that the optimal SF and TF in normal aging mice were both reduced compared with those in young adult mice. In addition, we measured the OKRs of 4.1G-null (4.1G–/–) mice, in which mislocalization of photoreceptor terminals is observed even at the young adult stage. The late-phase OKR was significantly impaired in 4.1G–/– mice, with significantly reduced optimal SF and TF compared with control mice. These OKR abnormalities resemble those found in normal aging mice, suggesting that 4.1G–/– mice can serve as a useful model for studying aging of the retinal tissue and declining visual function.
Taken together, the current study demonstrates that normal aging deteriorates visual motion processing in both the initial and late phases of OKRs. Moreover, it implies that the visual function abnormalities in normal aging mice are at least partly due to mislocalization of photoreceptor synapses.
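The drifting sinusoidal gratings used in such OKR measurements are parameterized by spatial frequency (cycles/deg) and temporal frequency (Hz), with drift speed given by TF/SF. A minimal sketch of the stimulus luminance profile (the parameter values here are illustrative, not the study's exact values):

```python
import numpy as np

def drifting_grating(x_deg, t_s, sf_cpd, tf_hz, contrast=1.0):
    """Luminance (0..1) of a 1-D sinusoidal grating drifting over time.

    x_deg : spatial positions in degrees of visual angle
    sf_cpd: spatial frequency in cycles per degree
    tf_hz : temporal frequency in Hz (drift speed = tf_hz / sf_cpd deg/s)
    """
    phase = 2 * np.pi * (sf_cpd * x_deg - tf_hz * t_s)
    return 0.5 + 0.5 * contrast * np.sin(phase)

x = np.linspace(0, 10, 1000)  # 10 deg of visual field
frame0 = drifting_grating(x, 0.0, sf_cpd=0.125, tf_hz=1.5)
# drift speed = 1.5 / 0.125 = 12 deg/s for these illustrative parameters
```

Sweeping `sf_cpd` and `tf_hz` over a grid while recording eye velocity yields the spatiotemporal tuning surface from which the optimal SF and TF are read off.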


2013 ◽  
Vol 109 (2) ◽  
pp. 321-331 ◽  
Author(s):  
David A. Magezi ◽  
Karin A. Buetler ◽  
Leila Chouiter ◽  
Jean-Marie Annoni ◽  
Lucas Spierer

Following prolonged exposure to adaptor sounds moving in a single direction, participants may perceive stationary-probe sounds as moving in the opposite direction [direction-selective auditory motion aftereffect (aMAE)] and be less sensitive to motion of any probe sounds that are actually moving (motion-sensitive aMAE). The neural mechanisms of aMAEs, and notably whether they are due to adaptation of direction-selective motion detectors, as found in vision, are presently unknown; identifying them would provide critical insight into auditory motion processing. We measured human behavioral responses and auditory evoked potentials to probe sounds following four types of moving-adaptor sounds: leftward and rightward unidirectional, bidirectional, and stationary. Behavioral data replicated both direction-selective and motion-sensitive aMAEs. Electrical neuroimaging analyses of auditory evoked potentials to stationary probes revealed no significant difference in either global field power (GFP) or scalp topography between leftward and rightward conditions, suggesting that aMAEs are not based on adaptation of direction-selective motion detectors. By contrast, the bidirectional and stationary conditions differed significantly in the stationary-probe GFP at 200 ms poststimulus onset without concomitant topographic modulation, indicative of a difference in the response strength between statistically indistinguishable intracranial generators. The magnitude of this GFP difference was positively correlated with the magnitude of the motion-sensitive aMAE, supporting the functional relevance of the neurophysiological measures. Electrical source estimations revealed that the GFP difference followed from a modulation of activity in predominantly right hemisphere frontal-temporal-parietal brain regions previously implicated in auditory motion processing. 
Our collective results suggest that auditory motion processing relies on motion-sensitive, but, in contrast to vision, non-direction-selective mechanisms.
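Global field power (GFP), used above to quantify response strength, is the spatial standard deviation of the average-referenced scalp potential across electrodes at each time point. A minimal sketch on simulated data (electrode count and values are illustrative, not the study's):

```python
import numpy as np

def global_field_power(eeg):
    """GFP: spatial standard deviation across electrodes at each sample.

    eeg: array of shape (n_electrodes, n_samples) of scalp potentials.
    """
    # Re-reference to the average across electrodes, then take the
    # root-mean-square deviation at every time point.
    avg_ref = eeg - eeg.mean(axis=0, keepdims=True)
    return np.sqrt((avg_ref ** 2).mean(axis=0))

# Simulated example: 64 electrodes, 500 time samples.
rng = np.random.default_rng(0)
eeg = rng.normal(size=(64, 500))
gfp = global_field_power(eeg)
print(gfp.shape)  # (500,)
```

Because GFP collapses the electrode dimension, two conditions can differ in GFP (response strength) while their scalp topographies, and hence the configuration of underlying generators, remain statistically indistinguishable, which is the pattern reported above.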


2014 ◽  
Vol 14 (13) ◽  
pp. 4-4 ◽  
Author(s):  
F. Jiang ◽  
G. C. Stecker ◽  
I. Fine

2014 ◽  
Vol 40 (3) ◽  
pp. 265-272 ◽  
Author(s):  
L. B. Shestopalova ◽  
E. A. Petropavlovskaia ◽  
S. Ph. Vaitulevich ◽  
N. I. Nikitin

2018 ◽  
Author(s):  
Ceren Battal ◽  
Mohamed Rezk ◽  
Stefania Mattioni ◽  
Jyothirmayi Vadlamudi ◽  
Olivier Collignon

ABSTRACT
The ability to compute the location and direction of sounds is a crucial perceptual skill for efficiently interacting with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to left, right, up, and down moving sounds as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human Planum Temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were significantly distinct. Altogether, our results demonstrate that hPT codes for auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.
SIGNIFICANCE STATEMENT
In comparison to what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human Planum Temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex by showing that those two computations rely on partially shared neural codes. 
Furthermore, our results show that the neural representation of moving sounds in hPT follows a “preferred axis of motion” organization, reminiscent of the coding mechanisms typically observed in the occipital hMT+/V5 region for computing visual motion.
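Cross-condition decoding of the kind invoked above (train a classifier on patterns from one condition, test it on patterns from another) can be illustrated with a minimal nearest-centroid sketch on synthetic voxel patterns. All data and shapes here are simulated for illustration; this is not the study's analysis pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

def nearest_centroid_fit(X, y):
    """Return the class labels and their mean patterns (centroids)."""
    labels = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in labels])
    return labels, centroids

def nearest_centroid_predict(labels, centroids, X):
    # Assign each pattern to the closest centroid (Euclidean distance).
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return labels[d.argmin(axis=1)]

# Synthetic patterns: 2 classes (e.g. left vs. right), 50 voxels.
n, v = 40, 50
signal = rng.normal(size=(2, v))          # shared pattern geometry
y = np.repeat([0, 1], n // 2)
X_train = signal[y] + rng.normal(scale=0.5, size=(n, v))  # "motion" condition
X_test = signal[y] + rng.normal(scale=0.5, size=(n, v))   # "location" condition

labels, cents = nearest_centroid_fit(X_train, y)
acc = (nearest_centroid_predict(labels, cents, X_test) == y).mean()
# acc well above the 0.5 chance level: the two conditions share geometry
```

Above-chance cross-condition accuracy is evidence that the two conditions evoke partially overlapping pattern geometries, which is the logic behind the shared-code claim in the abstract.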


2010 ◽  
Vol 77 (3) ◽  
pp. 328-329 ◽  
Author(s):  
L.B. Shestopalova ◽  
E.A. Petropavlovskaia ◽  
S.Ph. Vaitulevich ◽  
Y.A. Vasilenko

2001 ◽  
Vol 85 (1) ◽  
pp. 23-33 ◽  
Author(s):  
Neil J. Ingham ◽  
Heledd C. Hart ◽  
David McAlpine

We examined responses from 91 single neurons in the inferior colliculus (IC) of anesthetized guinea pigs to auditory apparent motion in the free field. Apparent motion was generated by presenting 100-ms tone bursts, separated by 50-ms silent intervals, at consecutive speaker positions in an array of 11 speakers, positioned in an arc ±112.5° around midline. Most neurons demonstrated discrete spatial receptive fields (SRFs) to apparent motion in the clockwise and anti-clockwise directions. However, SRFs showed marked differences for apparent motion in opposite directions. In virtually all neurons, mean best azimuthal positions for SRFs to opposite directions occurred at earlier positions in the motion sweep, producing receptive fields to the two directions of motion that only partially overlapped. Despite this, overall spike counts to the two directions were similar for equivalent angular velocities. Responses of 28 neurons were recorded to stimuli with different duration silent intervals between speaker presentations, mimicking different apparent angular velocities. Increasing the stimulus off time increased neuronal discharge rates, particularly at later portions of the apparent motion sweep, and reduced the differences in the SRFs to opposite motion directions. Consequently, SRFs to both directions broadened and converged with decreasing motion velocity. This expansion was most obvious on the outgoing side of each SRF. Responses of 11 neurons were recorded to short (90°) partially overlapping apparent motion sweeps centered at different spatial positions. Nonoverlapping response profiles were recorded in 9 of the 11 neurons tested and confirmed that responses at each speaker position were dependent on the preceding response history. Together these data are consistent with the suggestion that a mechanism of adaptation of excitation contributes to the apparent sensitivity of IC neurons to auditory motion cues. 
In addition, the data indicate that the sequential activation of an array of speakers to produce apparent auditory motion may not be an optimal stimulus paradigm to separate the temporal and spatial aspects of auditory motion processing.
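The stimulus geometry described above (11 speakers spanning ±112.5°, 100-ms tone bursts separated by silent intervals) fixes a simple relation between the off time and the apparent angular velocity. A minimal sketch using only the figures given in the abstract:

```python
import numpy as np

N_SPEAKERS = 11
ARC_DEG = 225.0   # speakers span -112.5 deg to +112.5 deg
TONE_MS = 100.0   # tone-burst duration at each speaker

# Azimuths of the speaker array, equally spaced across the arc.
azimuths = np.linspace(-112.5, 112.5, N_SPEAKERS)
spacing = ARC_DEG / (N_SPEAKERS - 1)  # 22.5 deg between adjacent speakers

def apparent_velocity(off_ms):
    """Apparent angular velocity (deg/s) for a given silent interval."""
    return spacing / ((TONE_MS + off_ms) / 1000.0)

print(apparent_velocity(50.0))  # 150.0 deg/s for the 50-ms interval used
```

Lengthening the silent interval slows the sweep without changing the tone bursts themselves, which is why off time serves as the velocity manipulation; it is also why the paradigm entangles temporal and spatial factors, as the abstract's closing caveat notes.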


2003 ◽  
Vol 14 (4) ◽  
pp. 357-361 ◽  
Author(s):  
Jean Vroomen ◽  
Beatrice de Gelder

In this study, we show that the contingent auditory motion aftereffect is strongly influenced by visual motion information. During an induction phase, participants listened to rightward-moving sounds with falling pitch alternated with leftward-moving sounds with rising pitch (or vice versa). Auditory aftereffects (i.e., a shift in the psychometric function for unimodal auditory motion perception) were bigger when a visual stimulus moved in the same direction as the sound than when no visual stimulus was presented. When the visual stimulus moved in the opposite direction, aftereffects were reversed and thus became contingent upon visual motion. When visual motion was combined with a stationary sound, no aftereffect was observed. These findings indicate that there are strong perceptual links between the visual and auditory motion-processing systems.
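The "shift in the psychometric function" used above to quantify the aftereffect can be pictured as a displacement of the point of subjective equality (PSE) of a logistic psychometric curve. A minimal sketch (the logistic form and all parameter values are illustrative assumptions, not the study's fitted model):

```python
import numpy as np

def psychometric(x, pse, slope):
    """Probability of a 'rightward' response at stimulus level x.

    pse  : point of subjective equality (the 50% point)
    slope: steepness of the logistic
    """
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

x = np.linspace(-10, 10, 201)  # e.g. signed auditory motion strength
baseline = psychometric(x, pse=0.0, slope=0.8)
adapted = psychometric(x, pse=2.0, slope=0.8)  # aftereffect shifts the PSE

# Estimate the shift as the displacement of the 50% crossing point.
pse_shift = (x[np.argmin(np.abs(adapted - 0.5))]
             - x[np.argmin(np.abs(baseline - 0.5))])
print(round(pse_shift, 1))  # 2.0
```

A rightward-moving adaptor shifts the curve so that physically stationary sounds are judged as moving leftward; reversal of the shift's sign under opposing visual motion is what makes the aftereffect contingent on vision.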

