Implicit learning of non‐native speech stimuli.

2009 ◽  
Vol 125 (4) ◽  
pp. 2763-2763 ◽  
Author(s):  
Eleni L. Vlahou ◽  
Athanassios Protopapas ◽  
Aaron Seitz

2016 ◽  
Vol 139 (5) ◽  
pp. EL161-EL166
Author(s):  
Shuting Huo ◽  
Sha Tao ◽  
Wenjing Wang ◽  
Mingshuang Li ◽  
Qi Dong ◽  
...  

2001 ◽  
Vol 44 (6) ◽  
pp. 1189-1200 ◽  
Author(s):  
Akiko Hayashi ◽  
Yuji Tamekawa ◽  
Shigeru Kiritani

The developmental change in auditory preferences for speech stimuli was investigated in Japanese infants aged 4–14 months. We conducted three experiments using two speech pairs in the head-turn preference procedure. Infant-directed (ID) speech and adult-directed (AD) speech stimuli were used in a longitudinal study (Experiment 1) and a cross-sectional study (Experiment 2). Native (Japanese) and non-native (English) speech stimuli were used in a cross-sectional study (Experiment 3). In all experiments, infants demonstrated a developmental change in their listening preference. For the ID/AD speech pair used in Experiments 1 and 2, infants showed a U-shaped developmental shift with three developmental stages: Stage 1, in which very young infants tend to prefer ID speech over AD speech; Stage 2, in which the preference for ID speech decreases temporarily; and Stage 3, in which older infants again show a consistent preference for ID speech. For the native/non-native speech pair, there was a tendency toward an increased preference for native speech over non-native speech, although infants did not demonstrate a U-shaped pattern. The difference in developmental pattern between the two types of speech pairs is discussed.


1998 ◽  
Vol 41 (3) ◽  
pp. 538-548 ◽  
Author(s):  
Sean C. Huckins ◽  
Christopher W. Turner ◽  
Karen A. Doherty ◽  
Michael M. Fonte ◽  
Nikolaus M. Szeverenyi

Functional Magnetic Resonance Imaging (fMRI) holds exciting potential as a research and clinical tool for exploring the human auditory system. This noninvasive technique allows the measurement of discrete changes in cerebral cortical blood flow in response to sensory stimuli, allowing determination of precise neuroanatomical locations of the underlying brain parenchymal activity. Application of fMRI in auditory research, however, has been limited. One problem is that fMRI utilizing echo-planar imaging technology (EPI) generates intense noise that could potentially affect the results of auditory experiments. Also, issues relating to the reliability of fMRI for listeners with normal hearing need to be resolved before this technique can be used to study listeners with hearing loss. This preliminary study examines the feasibility of using fMRI in auditory research by performing a simple set of experiments to test the reliability of scanning parameters with a higher resolution and signal-to-noise ratio than presently reported in the literature. We used consonant-vowel (CV) speech stimuli to investigate whether we could observe reproducible and consistent changes in cortical blood flow in listeners during a single scanning session, across more than one scanning session, and in more than one listener. In addition, we wanted to determine whether there were differences between CV speech and nonspeech complex stimuli across listeners. Our study shows reproducibility within and across listeners for CV speech stimuli. Results were reproducible for CV speech stimuli within fMRI scanning sessions for 5 out of 9 listeners and across fMRI scanning sessions for 6 out of 8 listeners. Nonspeech complex stimuli elicited activity in 4 out of 9 individuals tested.


2010 ◽  
Vol 24 (2) ◽  
pp. 91-101 ◽  
Author(s):  
Juliana Yordanova ◽  
Rolf Verleger ◽  
Ullrich Wagner ◽  
Vasil Kolev

The objective of the present study was to evaluate patterns of implicit processing in a task where the acquisition of explicit and implicit knowledge occurs simultaneously. The number reduction task (NRT) was used because it has two levels of organization, overt and covert, where the covert level of processing is associated with implicit associative and implicit procedural learning. One aim was to compare these two types of implicit processes in the NRT when sleep was or was not introduced between initial formation of task representations and subsequent NRT processing. To assess the effects of different sleep stages, two sleep groups (early- and late-night groups) were used in which initial training on the task was separated from subsequent retest by 3 h of predominantly slow-wave sleep (SWS) or rapid eye movement (REM) sleep. In two no-sleep groups, no interval was introduced between initial and subsequent NRT performance. A second aim was to evaluate the interaction between procedural and associative implicit learning in the NRT. Implicit associative learning was measured by the difference between the speed of responses that could or could not be predicted by the covert abstract regularity of the task. Implicit procedural on-line learning was measured by the practice-based increase in speed of performance with time on task. Major results indicated that late-night sleep produced a substantial facilitation of implicit associations without modifying individual ability for explicit knowledge generation or for procedural on-line learning. This was evidenced by the higher rate of subjects who gained implicit knowledge of the abstract task structure in the late-night group relative to the early-night and no-sleep groups. Independently of sleep, gain of implicit associative knowledge was accompanied by a relative slowing of responses to unpredictable items, suggesting reciprocal interactions between associative and motor procedural processes within the implicit system.
These observations provide evidence for the separability and interactions of different patterns of processing within implicit memory.
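The abstract above does not spell out the NRT rules, but the task's overt and covert levels can be illustrated with a short sketch based on the version commonly described in the sleep-and-insight literature; the digit set, string length, and mirror regularity shown here are assumptions drawn from that literature, not from this abstract.

```python
# Hedged sketch of the Number Reduction Task (NRT).
# Assumed conventions (not stated in the abstract): stimuli are strings of
# the digits 1, 4, 9; the overt rule is applied pairwise; stimuli are
# constructed so that the last three responses mirror the preceding three,
# making the final response predictable -- the covert regularity.
DIGITS = {1, 4, 9}

def nrt_rule(a, b):
    """Overt pairwise rule: if the two digits are the same, respond with
    that digit; if they differ, respond with the third digit of the set."""
    if a == b:
        return a
    return (DIGITS - {a, b}).pop()

def nrt_responses(stimulus):
    """Process the string left to right: the first response compares digits
    1 and 2; each subsequent response compares the previous response with
    the next digit. The final response is the trial's solution."""
    responses = []
    prev = stimulus[0]
    for d in stimulus[1:]:
        prev = nrt_rule(prev, d)
        responses.append(prev)
    return responses

# Example trial: responses 5-7 mirror responses 2-4, so the final response
# equals the second one -- the regularity implicit learning exploits.
trial = [1, 1, 4, 4, 9, 4, 9, 4]
resp = nrt_responses(trial)
print(resp)            # → [1, 9, 1, 4, 4, 1, 9]
print(resp[-1] == resp[1])  # → True
```

On this construction, "predictable" responses are those covered by the mirror regularity, and the implicit-associative measure described above corresponds to the reaction-time difference between these and the unpredictable early responses.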


2004 ◽  
Vol 49 (6) ◽  
pp. 717-719
Author(s):  
Carol A. Seger

2003 ◽  
Author(s):  
Ivan K. Ash ◽  
Timothy J. Nokes

2011 ◽  
Author(s):  
R. Sterczyński ◽  
M. Roczniewska ◽  
A. Poplawska
