Temporal integration and the microstructure of the detection threshold curve

1979 ◽  
Vol 66 (S1) ◽  
pp. S8-S8
Author(s):  
Marion F. Cohen
Perception ◽  
1997 ◽  
Vol 26 (1_suppl) ◽  
pp. 216-216 ◽  
Author(s):  
H T Kukkonen ◽  
J Rovamo

In computer-generated spatiotemporal noise every stimulus frame contains a new static noise sample. The spectral density of white spatiotemporal noise is calculated by multiplying the squared rms contrast of noise by the product of the noise check area and the exposure duration of each noise check. When the exposure duration of each noise check is gradually increased, the spectral density of spatiotemporal noise increases, reaching its maximum when noise becomes static. In static spatial noise both stimulus and noise checks have the same duration. The signal-to-noise ratio is known to be constant at detection threshold. Detection thresholds should thus increase in proportion to the spectral density of spatiotemporal noise, which increases with the duration of the noise checks. We measured detection thresholds for stationary cosine gratings embedded in spatiotemporal noise. The exposure duration of the noise checks was increased from one frame duration to the total exposure duration of the stimulus grating. Noise was thus gradually transformed from spatiotemporal to static spatial noise. The contrast energy threshold increased in proportion to the spectral density of spatiotemporal noise up to a noise check duration found to be equal to the integration time for the stimulus grating without noise. After this, energy thresholds remained constant in spite of the increase in the spectral density of spatiotemporal noise. This suggests that the masking effect of spatiotemporal noise increases with the duration of noise checks up to the critical duration marking the saturation of the temporal integration of the signal.
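The spectral-density calculation and the saturation behaviour described above can be sketched as follows. This is a minimal illustration, not the authors' code: the function names, the proportionality constant `k`, and the use of `min()` to model saturation at the integration time are assumptions for the sketch; only the formula (squared rms contrast × check area × check duration) and the saturation finding come from the abstract.

```python
def spectral_density(rms_contrast, check_area, check_duration):
    """Spectral density of white spatiotemporal noise, per the abstract:
    squared rms contrast times noise-check area times check duration."""
    return rms_contrast ** 2 * check_area * check_duration


def energy_threshold(k, rms_contrast, check_area, check_duration, integration_time):
    """Predicted contrast energy threshold at a constant signal-to-noise
    ratio. Per the abstract's finding, the threshold grows with noise-check
    duration only up to the grating's temporal integration time, then stays
    constant; min() models that saturation. k is a hypothetical
    proportionality constant (illustrative only)."""
    effective_duration = min(check_duration, integration_time)
    return k * spectral_density(rms_contrast, check_area, effective_duration)
```

For example, with an integration time of 100 ms, doubling the check duration from 20 ms to 40 ms doubles the predicted threshold, while increasing it from 100 ms to 500 ms (static noise) changes nothing.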


1984 ◽  
Vol 27 (2) ◽  
pp. 252-256 ◽  
Author(s):  
Joseph W. Hall ◽  
Elizabeth J. Wood

Frequency discrimination for 500- and 2000-Hz pure tones at durations of 5, 10, 20, 50, and 200 ms was determined for 10 normal-hearing and 10 cochlear-impaired listeners. Listeners from both groups demonstrated monotonic increases in frequency difference limens as stimulus duration decreased. The functions of the hearing-impaired listeners were parallel to those of the normal-hearing listeners for stimulus durations between 10 and 200 ms, but the overall performance of the hearing-impaired group was poorer than that of the normal-hearing group. The functions of many of the cochlear-impaired subjects were less steep than normal for the shortest durations tested (between 5 and 10 ms). There appeared to be no relation between temporal integration for frequency discrimination and temporal integration for detection threshold. The results are discussed in terms of processes of temporal integration and frequency selectivity.


1979 ◽  
Vol 65 (S1) ◽  
pp. S55-S55 ◽  
Author(s):  
Marion F. Cohen ◽  
Linda P. Schubert

2004 ◽  
Author(s):  
Kosuke Sawa ◽  
Kenneth Leising ◽  
Aaron P. Blaisdell

1968 ◽  
Vol 73 (3, Pt.1) ◽  
pp. 268-272 ◽  
Author(s):  
Robert D. Hare
