Stephanie Seneff (1987). A model for the transduction stage of auditory speech processing. Vol 82 (S1), p. S83.
Hyojin Park, Christoph Kayser, Gregor Thut, Joachim Gross (2016). eLife, Vol 5.

During continuous speech, lip movements provide visual temporal signals that facilitate speech processing. Here, using MEG, we directly investigated how these visual signals interact with rhythmic brain activity in participants listening to and seeing the speaker. First, we measured coherence between oscillatory brain activity and the speaker's lip movements and demonstrated significant entrainment in visual cortex. We then used partial coherence to remove contributions of the coherent auditory speech signal from the lip-brain coherence. Comparing this synchronization between different attention conditions revealed that attending to visual speech enhances the coherence between activity in visual cortex and the speaker's lips. Further, we identified a significant partial coherence between left motor cortex and lip movements, and this partial coherence directly predicted comprehension accuracy. Our results emphasize the importance of visually entrained and attention-modulated rhythmic brain activity for the enhancement of audiovisual speech processing.
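The abstract's key quantities are spectral coherence (brain-lip entrainment) and partial coherence (the same coupling after removing the contribution of the auditory speech signal). The sketch below is not the authors' MEG pipeline; it is a minimal illustration, assuming Welch cross-spectral estimates and illustrative signal names (brain, lip, audio), of how these two measures can be computed.

```python
# Minimal sketch of coherence and partial coherence between a brain signal,
# a lip-movement signal, and the auditory speech envelope (illustrative only).
import numpy as np
from scipy.signal import csd

def cross_spectra(brain, lip, audio, fs, nperseg=1024):
    """Welch auto- and cross-spectral densities for the three signals."""
    f, S_bb = csd(brain, brain, fs=fs, nperseg=nperseg)
    _, S_ll = csd(lip, lip, fs=fs, nperseg=nperseg)
    _, S_aa = csd(audio, audio, fs=fs, nperseg=nperseg)
    _, S_bl = csd(brain, lip, fs=fs, nperseg=nperseg)
    _, S_ba = csd(brain, audio, fs=fs, nperseg=nperseg)
    _, S_la = csd(lip, audio, fs=fs, nperseg=nperseg)
    return f, S_bb, S_ll, S_aa, S_bl, S_ba, S_la

def coherence(S_xy, S_xx, S_yy):
    """Ordinary magnitude-squared coherence between x and y."""
    return np.abs(S_xy) ** 2 / (S_xx.real * S_yy.real)

def partial_coherence(S_xy, S_xx, S_yy, S_xz, S_yz, S_zz):
    """Coherence between x and y after removing the linear contribution of z."""
    S_xy_z = S_xy - S_xz * np.conj(S_yz) / S_zz      # partial cross-spectrum
    S_xx_z = S_xx - np.abs(S_xz) ** 2 / S_zz         # partial auto-spectra
    S_yy_z = S_yy - np.abs(S_yz) ** 2 / S_zz
    return np.abs(S_xy_z) ** 2 / (S_xx_z.real * S_yy_z.real)

# Toy usage with simulated ~4 Hz signals (fs and coupling weights are assumptions).
fs = 250.0
t = np.arange(0, 60, 1 / fs)
lip = np.sin(2 * np.pi * 4 * t) + 0.5 * np.random.randn(t.size)
audio = np.sin(2 * np.pi * 4 * t + 0.3) + 0.5 * np.random.randn(t.size)
brain = 0.6 * lip + 0.4 * audio + np.random.randn(t.size)

f, S_bb, S_ll, S_aa, S_bl, S_ba, S_la = cross_spectra(brain, lip, audio, fs)
coh_bl = coherence(S_bl, S_bb, S_ll)                              # brain-lip coherence
pcoh_bl = partial_coherence(S_bl, S_bb, S_ll, S_ba, S_la, S_aa)   # audio contribution removed
```

In this toy example, coh_bl reflects both the direct lip coupling and the shared auditory drive, while pcoh_bl is reduced toward the lip-specific coupling, mirroring the logic of removing the coherent auditory speech signal from the lip-brain coherence.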


B Mohr, S Heim, F Pulvermüller, B Rockstroh (2001). Vol 52 (1-2), pp. 69-78.
