Pigeons’ short-term memory for temporal and visual stimuli in delayed matching-to-sample

1990 ◽  
Vol 18 (1) ◽  
pp. 23-28 ◽  
Author(s):  
Robin L. Bowers ◽  
Ralph W. Richards
1999 ◽  
Vol 16 (3) ◽  
pp. 449-459 ◽  
Author(s):  
CATHERINE TALLON-BAUDRY ◽  
ANDREAS KREITER ◽  
OLIVIER BERTRAND

In a visual delayed matching-to-sample task, compared to a control condition, we had previously identified different components of the human EEG that could reflect the rehearsal of an object representation in short-term memory (Tallon-Baudry et al., 1998). These components were induced oscillatory activities in the gamma (24–60 Hz) and beta (15–20 Hz) bands, peaking during the delay at occipital and frontal electrodes, and two negativities in the evoked potentials. Sustained activities (lasting until the end of the delay) are more likely to reflect the continuous rehearsal process in memory than transient activities (ending before the end of the delay). Nevertheless, since the delay duration we used in our previous experiment was fixed and rather short, it was difficult to discriminate between sustained and transient components. Here we used the same delayed matching-to-sample task, but with variable delay durations. The same oscillatory components in the gamma and beta bands were observed again during the delay. The only components that showed a sustained time course compatible with a memory rehearsal process were the occipital gamma and frontal beta induced activities. These two activities slowly decreased with increasing delay duration, while the performance of the subjects decreased in parallel. No sustained response could be found in the evoked potentials. These results support the hypothesis that object representations in visual short-term memory consist of oscillating synchronized cell assemblies.
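The band definitions above (gamma 24–60 Hz, beta 15–20 Hz) come straight from the abstract; the authors used wavelet-based time-frequency analysis to isolate induced activity. As a minimal illustration only, the sketch below estimates power in those two bands with a plain FFT. The sampling rate and the test signal are assumptions, not values from the study.

```python
import numpy as np

FS = 500  # Hz, assumed sampling rate (not stated in the abstract)

def band_power(signal, fs, lo, hi):
    """Mean spectral power within [lo, hi] Hz via the FFT.
    A simplified sketch, not the wavelet measure used by the authors."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

# Synthetic 1-s trace dominated by a 40 Hz ("gamma") oscillation.
t = np.arange(0, 1, 1 / FS)
sig = np.sin(2 * np.pi * 40 * t)

gamma = band_power(sig, FS, 24, 60)  # gamma band from the abstract
beta = band_power(sig, FS, 15, 20)   # beta band from the abstract
print(gamma > beta)  # True: power concentrates in the gamma band
```

Comparing such band-power estimates across delay durations is the kind of contrast the study uses to separate sustained from transient components.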


1975 ◽  
Vol 37 (1) ◽  
pp. 203-207 ◽  
Author(s):  
David M. Grilly

Short-term memory for visual stimuli was tested in 17 chimpanzees, 7 females and 10 males, with the delayed matching-to-sample technique. Females exhibited a statistically significant superiority in matching accuracy over an extended period of time and under at least two different retention intervals. The results were consistent with those obtained with rhesus monkeys on similar tasks. The possibility that this difference was attributable to factors other than experience was suggested.


2021 ◽  
Vol 11 (9) ◽  
pp. 1206 ◽  
Author(s):  
Erika Almadori ◽  
Serena Mastroberardino ◽  
Fabiano Botta ◽  
Riccardo Brunetti ◽  
Juan Lupiáñez ◽  
...  

Object sounds can enhance the attentional selection and perceptual processing of semantically related visual stimuli. However, it is currently unknown whether crossmodal semantic congruence also affects the post-perceptual stages of information processing, such as short-term memory (STM), and whether this effect is modulated by the object's consistency with the background visual scene. In two experiments, participants viewed everyday visual scenes for 500 ms while listening to an object sound, which could either be semantically related to the object that served as the STM target at retrieval or not. This defined crossmodal semantically cued vs. uncued targets. The target was either in- or out-of-context with respect to the background visual scene. After a maintenance period of 2000 ms, the target was presented in isolation against a neutral background, in either the same or a different spatial position as in the original scene. Participants judged whether the object's position was the same or different and then provided a confidence judgment concerning the certainty of their response. The results revealed greater accuracy when judging the spatial position of targets paired with a semantically congruent object sound at encoding. This crossmodal facilitatory effect was modulated by whether the target object was in- or out-of-context with respect to the background scene, with out-of-context targets reducing the facilitatory effect of object sounds. Overall, these findings suggest that the presence of the object sound at encoding facilitated the selection and processing of the semantically related visual stimuli, but this effect depended on the semantic configuration of the visual scene.
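The design described above crosses three factors: sound cueing (cued vs. uncued), scene context (in- vs. out-of-context), and probe position (same vs. different), with the 500 ms encoding and 2000 ms maintenance timings given in the abstract. A minimal sketch of that trial structure follows; the full factorial crossing and the number of repeats per cell are assumptions for illustration, not details from the paper.

```python
import itertools
import random

SCENE_MS = 500         # scene + object sound at encoding (from the abstract)
MAINTENANCE_MS = 2000  # blank maintenance period (from the abstract)

def make_trials(n_repeats=4, seed=0):
    """Cross the three factors described in the abstract.
    n_repeats per cell is a hypothetical choice."""
    factors = itertools.product(
        ("cued", "uncued"),                  # sound semantically related to target or not
        ("in_context", "out_of_context"),    # target consistency with the scene
        ("same", "different"),               # probe position at retrieval
    )
    trials = [
        {"cueing": c, "context": ctx, "position": pos,
         "scene_ms": SCENE_MS, "maintenance_ms": MAINTENANCE_MS}
        for c, ctx, pos in factors
        for _ in range(n_repeats)
    ]
    random.Random(seed).shuffle(trials)
    return trials

trials = make_trials()
print(len(trials))  # 2 * 2 * 2 * 4 = 32
```

Balancing the factors this way lets the cueing effect be compared separately for in-context and out-of-context targets, which is the contrast the reported modulation rests on.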


2008 ◽  
Vol 28 (1) ◽  
pp. 99-99 ◽  
Author(s):  
R. Sapkota ◽  
S. Pardhan ◽  
A. Tavassoli ◽  
I. Van Der Linde
