The Roles of Exosomes in Visual and Auditory Systems

Author(s):  
Pei Jiang ◽  
Shasha Zhang ◽  
Cheng Cheng ◽  
Song Gao ◽  
Mingliang Tang ◽  
...  
2019 ◽  
Vol 4 (6) ◽  
pp. 1418-1422
Author(s):  
Bre Myers ◽  
J. Andrew Dundas

Purpose: The primary aim of this article is to provide a brief review of the literature on the effects of noise exposure on the vestibular and balance control systems. Although the deleterious effects of noise on the auditory system are widely known and remain an active area of research, much less is known about the effects of noise on the peripheral vestibular system. Audiologists with a working knowledge of how the two systems interact and overlap are better prepared to provide comprehensive care, as assessment of both the auditory and vestibular systems has been within audiologists' scope of practice since 1992. Method: A narrative review summarizes salient findings from the archival literature. Results: Temporary and permanent effects on vestibular system function have been documented in multiple studies. Hearing conservation, vestibular impairment, and fall risk reduction may be more intimately related than previously considered. Conclusions: A full appreciation of both the vestibular and auditory systems is necessary to address the growing and aging noise-exposed population. More cross-system studies are needed to further define the complex relationship between the auditory and vestibular systems and to improve comprehensive patient care.


2020 ◽  
Vol 44 (1) ◽  
pp. 35-50
Author(s):  
Anna Barth ◽  
Leif Karlstrom ◽  
Benjamin K. Holtzman ◽  
Arthur Paté ◽  
Avinash Nayak

Abstract: Sonification of time series data in natural science has gained increasing attention as an observational and educational tool. Sound is a direct representation for oscillatory data, but for most phenomena, less direct representational methods are necessary. Coupled with animated visual representations of the same data, the visual and auditory systems can work together to identify complex patterns quickly. We developed a multivariate data sonification and visualization approach to explore and convey patterns in a complex dynamic system, Lone Star Geyser in Yellowstone National Park. This geyser has erupted regularly for at least 100 years, with remarkable consistency in the interval between eruptions (three hours) but with significant variations in smaller scale patterns between each eruptive cycle. From a scientific standpoint, the ability to hear structures evolving over time in multiparameter data permits the rapid identification of relationships that might otherwise be overlooked or require significant processing to find. The human auditory system is adept at physical interpretation of call-and-response or causality in polyphonic sounds. Methods developed here for oscillatory and nonstationary data have great potential as scientific observational and educational tools, for data-driven composition with scientific and artistic intent, and towards the development of machine learning tools for pattern identification in complex data.
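The parameter-mapping approach the abstract describes can be illustrated in miniature. The sketch below is not the authors' pipeline; it uses a hypothetical data trace and maps each value linearly onto a pitch range, synthesizing a short sine-wave segment per value.

```python
# Parameter-mapping sonification sketch (illustrative only, not the authors'
# actual method). Each value of a hypothetical time series sets the frequency
# of a short sine-wave segment; concatenating the segments yields the audio.
import math

def sonify(series, f_lo=220.0, f_hi=880.0, sr=8000, seg_dur=0.05):
    """Map each value in `series` linearly onto [f_lo, f_hi] Hz and
    return a list of audio samples (floats in [-1, 1])."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1.0          # avoid division by zero for flat data
    n_seg = int(sr * seg_dur)        # samples per data point
    samples = []
    for v in series:
        freq = f_lo + (v - lo) / span * (f_hi - f_lo)
        for n in range(n_seg):
            samples.append(math.sin(2 * math.pi * freq * n / sr))
    return samples

# Hypothetical geyser-cycle trace: rising then falling activity maps to pitch
trace = [0.1, 0.4, 0.9, 0.6, 0.2]
audio = sonify(trace)
```

In a real application the sample list would be written to a WAV file (e.g. via Python's `wave` module) and, as in the study, paired with an animated visualization of the same trace.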


1998 ◽  
Vol 21 (2) ◽  
pp. 241-259 ◽  
Author(s):  
Harvey M. Sussman ◽  
David Fruchter ◽  
Jon Hilbert ◽  
Joseph Sirosh

Neuroethological investigations of mammalian and avian auditory systems have documented species-specific specializations for processing complex acoustic signals that could, if viewed in abstract terms, have an intriguing and striking relevance for human speech sound categorization and representation. Each species forms biologically relevant categories based on combinatorial analysis of information-bearing parameters within the complex input signal. This target article uses known neural models from the mustached bat and barn owl to develop, by analogy, a conceptualization of human processing of consonant plus vowel sequences that offers a partial solution to the noninvariance dilemma – the nontransparent relationship between the acoustic waveform and the phonetic segment. Critical input sound parameters used to establish species-specific categories in the mustached bat and barn owl exhibit high correlation and linearity due to physical laws. A cue long known to be relevant to the perception of stop place of articulation is the second formant (F2) transition. This article presents an empirical phenomenon – the locus equations – that describes the relationship between the F2 of a vowel and the F2 measured at the onset of a consonant-vowel (CV) transition. These variables, F2 onset and F2 vowel within a given place category, are consistently and robustly linearly correlated across diverse speakers and languages, and even under perturbation conditions as imposed by bite blocks. A functional role for this category-level extreme correlation and linearity (the “orderly output constraint”) is hypothesized based on the notion of an evolutionarily conserved auditory-processing strategy. High correlation and linearity between critical parameters in the speech signal that help to cue place of articulation categories might have evolved to satisfy a preadaptation by mammalian auditory systems for representing tightly correlated, linearly related components of acoustic signals.
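A locus equation is simply an ordinary least-squares line fit of F2 onset against F2 vowel within one place-of-articulation category. The sketch below shows the computation on hypothetical formant values (the numbers are invented for illustration, not data from the article).

```python
# Locus-equation sketch: fit f2_onset = slope * f2_vowel + intercept by
# ordinary least squares. The formant values below are hypothetical.
def locus_equation(f2_vowel, f2_onset):
    """Return (slope, intercept) of the least-squares line through the
    (F2 vowel, F2 onset) pairs for one place category."""
    n = len(f2_vowel)
    mx = sum(f2_vowel) / n
    my = sum(f2_onset) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(f2_vowel, f2_onset))
    sxx = sum((x - mx) ** 2 for x in f2_vowel)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical alveolar-stop tokens: F2 of the vowel and F2 at CV onset, in Hz
vowels = [900, 1200, 1500, 1800, 2100]
onsets = [1350, 1500, 1650, 1800, 1950]
slope, intercept = locus_equation(vowels, onsets)
```

The article's claim is that, across speakers and languages, such fits within a place category are strikingly linear (high R²), with slope and intercept characterizing the category.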


2015 ◽  
Vol 46 (2) ◽  
pp. 441-449 ◽  
Author(s):  
Miguel F. Gago ◽  
Vítor Fernandes ◽  
Jaime Ferreira ◽  
Darya Yelshyna ◽  
Hélder David Silva ◽  
...  

1994 ◽  
Vol 57 (4) ◽  
pp. 127-130 ◽  
Author(s):  
Tal Jarus

The use of Morse code in rehabilitation applications is usually taught by visual or auditory methods. Yet, people experienced in Morse code use in land-line and radio telegraphy suggest that encoding and decoding rates can be enhanced through primary reliance on auditory methods for mastering the code. This study investigated the best way to learn Morse code. Sixty healthy adults with no prior knowledge of Morse code, ages 18 to 30 years, participated. Subjects were randomly divided into three groups to learn Morse code through three different methods: a visual chart reference method, an auditory method using computer software, and a combined method. After the practice period, encoding rate and accuracy were tested using a handwriting test. One-way analysis of variance was used for each of the two measurements: time and error. Subjects in the combined method group were significantly faster than subjects in the visual method group, and made significantly fewer errors than subjects in the auditory method group. Therefore, if both speed and accuracy of conveyance are important, it appears that learning through both the visual and the auditory systems best allows subjects to internalise the code as a language. These conclusions apply not only to teaching clients but also to clinicians mastering Morse code themselves.
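For readers unfamiliar with the code being taught, the encoding task in the study reduces to a simple letter-to-symbol lookup. The sketch below is an illustrative encoder, not the training software used in the study.

```python
# Minimal Morse encoder (illustrative; not the study's training software).
# International Morse for the letters A-Z; ' / ' separates words.
MORSE = {'A': '.-',   'B': '-...', 'C': '-.-.', 'D': '-..',  'E': '.',
         'F': '..-.', 'G': '--.',  'H': '....', 'I': '..',   'J': '.---',
         'K': '-.-',  'L': '.-..', 'M': '--',   'N': '-.',   'O': '---',
         'P': '.--.', 'Q': '--.-', 'R': '.-.',  'S': '...',  'T': '-',
         'U': '..-',  'V': '...-', 'W': '.--',  'X': '-..-', 'Y': '-.--',
         'Z': '--..'}

def encode(text):
    """Encode letters to Morse, one space between letters, ' / ' between words."""
    words = text.upper().split()
    return ' / '.join(' '.join(MORSE[c] for c in w if c in MORSE) for w in words)

encode("sos")  # '... --- ...'
```

In the auditory condition of the study, each dot and dash would be rendered as a short or long tone rather than read from a chart; the combined condition pairs the two.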


2011 ◽  
Vol 106 (1) ◽  
pp. 4-14 ◽  
Author(s):  
R. Michael Burger ◽  
Iwao Fukui ◽  
Harunori Ohmori ◽  
Edwin W. Rubel

Interaural time differences (ITDs) are the primary cue animals, including humans, use to localize low-frequency sounds. In vertebrate auditory systems, dedicated ITD processing neural circuitry performs an exacting task, the discrimination of microsecond differences in stimulus arrival time at the two ears by coincidence-detecting neurons. These neurons modulate responses over their entire dynamic range to sounds differing in ITD by mere hundreds of microseconds. The well-understood function of this circuitry in birds has provided a fruitful system to investigate how inhibition contributes to neural computation at the synaptic, cellular, and systems level. Our recent studies in the chicken have made significant progress in bringing together many of these findings to provide a cohesive picture of inhibitory function.
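The coincidence-detection computation the abstract describes can be caricatured as finding the internal delay that best aligns the two ear signals, i.e. the lag maximizing their cross-correlation (a Jeffress-style model in miniature). The sketch below is illustrative only; the signals and delays are hypothetical, not data from the study.

```python
# Coincidence-detection sketch for ITD estimation (illustrative; a toy
# cross-correlation model, not the chicken circuitry described above).
def estimate_itd(left, right, max_lag):
    """Return the lag (in samples) maximizing the cross-correlation of the
    two ear signals; positive means the right-ear signal lags the left."""
    best_lag, best_score = 0, float('-inf')
    n = len(left)
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i + lag]
                    for i in range(n)
                    if 0 <= i + lag < n)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A click reaching the left ear 3 samples before the right
left = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
right = [0, 0, 0, 0, 0, 1, 0, 0, 0, 0]
itd = estimate_itd(left, right, max_lag=5)  # itd == 3
```

Real ITD circuits perform this comparison with arrays of coincidence-detecting neurons tuned to different internal delays, sharpened (as the abstract notes) by precisely timed inhibition.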


Author(s):  
Yun Doo Chung ◽  
Jeongmi Lee

Hearing in invertebrates has evolved independently as an adaptation to avoid predators or to mediate intraspecific communication. Although many invertebrate groups are able to respond to sound stimuli, insects are the only group in which hearing is widely used. Therefore, we focus here on the auditory systems of some well-known insect models. The ability to perceive sound in insects presumably appeared relatively recently in evolution, and as a result of independent evolution, diverse types of hearing organs have evolved in insects. Here we introduce the basic features of insect ears and the mechanisms through which sound stimuli are converted into neuronal electric signals. We also summarize our current understanding of the neural processing of auditory information, including tonotopy, sound localization, and pattern recognition.


2008 ◽  
pp. 227-282 ◽  
Author(s):  
Whitlow W.L. Au ◽  
Mardi C. Hastings

PEDIATRICS ◽  
1973 ◽  
Vol 51 (1) ◽  
pp. 152-152
Author(s):  
Warren C. Gregory

I agree with Kenny et al.1 on only one thing: the history, neurologic exam, and electroencephalogram have little to contribute to the evaluation of dyslexia. Medical evaluation is all-important in the early identification of dyslexic children and in the unmasking of school failures who come to their physician with psychosomatic, physical, or emotional complaints, or simply as school failures. The physician who handles such a problem should not only have expert knowledge of the reasons for school failure but should also have an approach to the evaluation that will identify the strengths and weaknesses of the visual and auditory systems as they relate to language functioning.

