Auditory Influences on Visual Temporal Rate Perception

2003 ◽  
Vol 89 (2) ◽  
pp. 1078-1093 ◽  
Author(s):  
Gregg H. Recanzone

Visual stimuli are known to influence the perception of auditory stimuli in spatial tasks, giving rise to the ventriloquism effect. These influences can persist in the absence of visual input following a period of exposure to spatially disparate auditory and visual stimuli, a phenomenon termed the ventriloquism aftereffect. It has been speculated that the visual dominance over audition in spatial tasks is due to the superior spatial acuity of vision compared with audition. If that is the case, then the auditory system should dominate visual perception in a manner analogous to the ventriloquism effect and aftereffect if one uses a task in which the auditory system has superior acuity. To test this prediction, the interactions of visual and auditory stimuli were measured in a temporally based task in normal human subjects. The results show that the auditory system has a pronounced influence on visual temporal rate perception. This influence was independent of the spatial location, spectral bandwidth, and intensity of the auditory stimulus. The influence was, however, strongly dependent on the disparity in temporal rate between the two stimulus modalities. Further, aftereffects were observed following approximately 20 min of exposure to temporally disparate auditory and visual stimuli. These results show that the auditory system can strongly influence visual perception and are consistent with the idea that bimodal sensory conflicts are dominated by the sensory system with the greater acuity for the stimulus parameter being discriminated.

1954 ◽  
Vol 100 (419) ◽  
pp. 462-477 ◽  
Author(s):  
K. R. L. Hall ◽  
E. Stride

A number of studies on reaction time (R.T.) latency to visual and auditory stimuli in psychotic patients have been reported since the first investigations on the personal equation were carried out. The general trends of the work up to 1943 are well summarized by Hunt (1944), while Granger's (1953) review of “Personality and visual perception” contains a summary of the studies on R.T. to visual stimuli.


1999 ◽  
Vol 11 (2) ◽  
pp. 206-213 ◽  
Author(s):  
Tracy L. Taylor ◽  
Raymond M. Klein ◽  
Douglas P. Munoz

Relative to when a fixated stimulus remains visible, saccadic latencies are facilitated when a fixated stimulus is extinguished simultaneously with or prior to the appearance of an eccentric auditory, visual, or combined visual-auditory target. In a study of nine human subjects, we determined whether such facilitation (the “gap effect”) occurs equivalently for the disappearance of fixated auditory stimuli and fixated visual stimuli. In the present study, a fixated auditory (noise) stimulus remained present (overlap) or else was extinguished simultaneously with (step) or 200 msec prior to (gap) the appearance of a visual, auditory (tone), or combined visual-auditory target 10° to the left or right of fixation. The results demonstrated equivalent facilitatory effects due to the disappearance of fixated auditory and visual stimuli and are consistent with the presumed role of the superior colliculus in the gap effect.


2012 ◽  
Vol 25 (0) ◽  
pp. 24 ◽ 
Author(s):  
Roberto Cecere ◽  
Benjamin De Haas ◽  
Harriett Cullen ◽  
Jon Driver ◽  
Vincenzo Romei

There is converging evidence that the duration of an auditory event can affect the perceived duration of a co-occurring visual event. When a brief visual stimulus is accompanied by a longer auditory stimulus, the perceived visual duration stretches. If this reflects a genuine sustaining of visual stimulus perception, it should result in enhanced perception of non-temporal visual stimulus qualities. To test this hypothesis, in a temporal two-alternative forced-choice task, 28 participants were asked to indicate whether a short (∼24 ms), peri-threshold visual stimulus was presented in the first or the second of two consecutive displays. Each display was accompanied by a sound of equal or longer duration (36, 48, 60, 72, 84, 96, 190 ms) than the visual stimulus. As a control condition, visual stimuli of different durations (matching the auditory stimulus durations) were presented alone. We predicted that visual detection would improve as a function of sound duration. Moreover, if the expected cross-modal effect reflects sustained visual perception, it should correlate positively with the improvement observed for genuinely longer visual stimuli. Results showed that detection sensitivity (d′) for the 24 ms visual stimulus was significantly enhanced when it was paired with longer auditory stimuli ranging from 60 to 96 ms in duration. Visual detection performance dropped to baseline levels with 190 ms sounds. Crucially, the enhancement for auditory durations of 60–96 ms correlated significantly with the d′ enhancement for visual stimuli lasting 60–96 ms in the control condition. We conclude that the duration of a co-occurring auditory stimulus not only influences the perceived duration of a visual stimulus but also reflects a genuine sustaining of visual perception.
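The detection-sensitivity measure d′ reported above comes from signal detection theory. As a minimal illustrative sketch (not the authors' analysis code; the function names are hypothetical), d′ can be computed from hit and false-alarm rates in a yes/no task, or from proportion correct in a two-alternative forced-choice task, via the inverse normal CDF available in the Python standard library:

```python
from statistics import NormalDist

_z = NormalDist().inv_cdf  # inverse cumulative normal (z-transform)

def dprime_yes_no(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity d' for a yes/no detection task: z(H) - z(FA)."""
    return _z(hit_rate) - _z(fa_rate)

def dprime_2afc(p_correct: float) -> float:
    """Sensitivity d' for a two-alternative forced-choice task:
    d' = sqrt(2) * z(proportion correct)."""
    return 2 ** 0.5 * _z(p_correct)
```

Chance performance (proportion correct 0.5, or equal hit and false-alarm rates) gives d′ = 0, and sensitivity grows as accuracy rises, so an increase in d′ for the same 24 ms stimulus indicates a genuine detection benefit rather than a shift in response bias.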


1978 ◽  
Vol 44 (4) ◽  
pp. 447-458 ◽  
Author(s):  
Douglas S Goodin ◽  
Kenneth C Squires ◽  
Beverley H Henderson ◽  
Arnold Starr

Perception ◽  
10.1068/p5849 ◽  
2007 ◽  
Vol 36 (10) ◽  
pp. 1507-1512 ◽  
Author(s):  
Kerstin Königs ◽  
Jonas Knöll ◽  
Frank Bremmer

Previous studies have shown that the perceived location of visual stimuli briefly flashed during smooth pursuit, saccades, or optokinetic nystagmus (OKN) is not veridical. We investigated whether these mislocalisations can also be observed for brief auditory stimuli presented during OKN. Experiments were carried out in a lightproof sound-attenuated chamber. Participants performed eye movements elicited by visual stimuli. An auditory target (white noise) was presented for 5 ms. Our data clearly indicate that auditory targets are mislocalised during reflexive eye movements. OKN induces a shift of perceived location in the direction of the slow eye movement, and this shift is modulated in the temporal vicinity of the fast phase. The mislocalisation is stronger for look- as compared to stare-nystagmus. The size and temporal pattern of the observed mislocalisation differ from those found for visual targets. This suggests that different neural mechanisms are at play to integrate oculomotor signals with information on the spatial location of visual as opposed to auditory stimuli.


1973 ◽  
Vol 49 (2) ◽  
pp. 499 ◽  
Author(s):  
C. Umiltà ◽  
G. Rizzolatti ◽  
C.A. Marzi ◽  
G. Zamboni ◽  
C. Franzini ◽  
...  

2021 ◽  
Vol 3 (2) ◽  
pp. 95-102 ◽ 
Author(s):  
Ediwarman Ediwarman ◽  
Syafrizal Syafrizal ◽  
John Pahamzah

This paper examined the perception of speech using audio-visual stimuli and replicas for students of Sultan Ageng Tirtayasa University. The research addressed face-to-face conversation, i.e., speech perceived by the ears and eyes. The prerequisites for audio-visual perception of speech were studied in detail using perceptually ambiguous sine-wave replicas of natural speech as auditory stimuli. When the subjects were unaware that the auditory stimuli were speech, they showed only a negligible integration of auditory and visual stimuli. Once the same subjects learned to perceive the same auditory stimuli as speech, they integrated auditory and visual stimuli in the same way as for natural speech. These results suggest a special mode of perception of multisensory speech.


2014 ◽  
Vol 27 (3-4) ◽  
pp. 173-188 ◽  
Author(s):  
Stefania S. Moro ◽  
Laurence R. Harris ◽  
Jennifer K. E. Steeves

People with one eye show altered sensory processing. Such changes might reflect a central re-weighting of sensory information that could affect how multisensory cues are integrated. We assessed whether people who lost an eye early in life differ from controls with respect to audiovisual integration. To quantify the relative weightings assigned to each sensory system, participants were asked to spatially localize audiovisual events whose auditory and visual components were spatially disparate; such events have previously been shown to be optimally combined and perceptually fused with respect to location in a normal population. There was no difference between people with one eye and controls in the variability of localizing unimodal visual and auditory targets. People with one eye did, however, demonstrate slower reaction times when localizing visual stimuli compared with auditory stimuli, and were slower than binocular and eye-patched control groups. When localizing bimodal targets, the weightings assigned to each sensory modality in both people with one eye and controls were predictable from their unimodal performance, in accordance with Maximum Likelihood Estimation, and all three groups localized the bimodal targets faster than with vision alone. Despite their longer response times to visual stimuli, people with one eye appear to integrate the auditory and visual components of multisensory events optimally when determining spatial location.
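The Maximum Likelihood Estimation model invoked above predicts that each modality is weighted by its inverse variance (its reliability), and that the fused estimate is less variable than either cue alone. A minimal sketch of this standard cue-combination model, assuming Gaussian unimodal noise (illustrative only; the function name and numbers are hypothetical, not the study's code or data):

```python
def mle_fuse(est_v: float, sigma_v: float, est_a: float, sigma_a: float):
    """Optimally combine visual and auditory location estimates.

    Each cue's weight is its inverse variance, normalized so the
    weights sum to one; the fused estimate is the weighted mean.
    """
    w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)
    w_a = 1.0 - w_v
    fused = w_v * est_v + w_a * est_a
    # The fused standard deviation is below both unimodal sigmas.
    fused_sigma = (sigma_v**2 * sigma_a**2 / (sigma_v**2 + sigma_a**2)) ** 0.5
    return fused, fused_sigma

# Example: vision reports 10 deg (sigma 2), audition 0 deg (sigma 1);
# the more reliable auditory cue pulls the fused estimate toward itself.
loc, sigma = mle_fuse(10.0, 2.0, 0.0, 1.0)
```

In this framework, "predictable from unimodal performance" means that the bimodal weights and variance observed experimentally match those computed from each participant's own unimodal sigmas.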


1990 ◽  
Vol 63 (3) ◽  
pp. 439-446 ◽  
Author(s):  
J. L. Taylor ◽  
D. I. McCloskey

1. Visual stimuli were presented to normal human subjects to test simple and more complex voluntary motor responses. Large and small visual stimuli were presented. In some trials, the small stimulus was followed 50 ms later by the large stimulus, so that the small stimulus was not perceived; this is the phenomenon of "backward masking." 2. Although subjects were not able to detect the masked visual stimulus on forced-choice testing, they performed motor reaction-time (RT) tasks in response to it. The RTs for responses to the masked stimulus were the same as those for responses to the easily perceived, nonmasked stimulus. 3. This result confirms and extends the findings of Fehrer and Biederman and was demonstrated with both simple and more complex motor responses. 4. Discussion of the findings focuses on their implications for motor control, particularly with respect to the preprogramming of voluntary movement.

