Measuring lexical access during sentence processing

1980 ◽  
Vol 28 (1) ◽  
pp. 1-8 ◽  
Author(s):  
Michelle A. Blank

1978 ◽  
Vol 6 (6) ◽  
pp. 644-652 ◽  
Author(s):  
Michelle A. Blank ◽  
Donald J. Foss

2000 ◽  
Vol 23 (3) ◽  
pp. 333-334
Author(s):  
Frédéric Isel

Norris, McQueen & Cutler demonstrated that feedback is never necessary during lexical access and proposed a new autonomous model, the Merge model, which takes into account the known behavioral data on word recognition. For sentence processing, recent event-related brain potential (ERP) data suggest that interactions can occur, but only after an initial autonomous stage of processing. Thus, at this level too, there is no evidence in favor of feedback.


2019 ◽  
Vol 62 (2) ◽  
pp. 367-386 ◽  
Author(s):  
Amy Kemp ◽  
David Eddins ◽  
Rahul Shrivastav ◽  
Amanda Hampton Wray

Purpose: Improving the ability to listen efficiently in noisy environments is a critical goal for hearing rehabilitation. However, understanding of the impact of difficult listening conditions on language processing is limited. The current study evaluated the neural processes underlying semantic processing in challenging listening conditions.

Method: Thirty adults with normal hearing completed an auditory sentence processing task in 4-talker babble. Event-related brain potentials were elicited by the final word in high- or low-context sentences, where the final word was either highly expected or not expected, followed by a 4-alternative forced-choice response with longer (1,000 ms), middle (700 ms), or shorter (400 ms) response time deadlines (RTDs).

Results: Behavioral accuracy was reduced, and reaction times were faster, for shorter RTDs. N400 amplitudes, reflecting ease of lexical access, were larger when elicited by target words in low-context sentences followed by shorter compared with longer RTDs.

Conclusions: These results reveal that more neural resources are allocated to semantic processing/lexical access when listening difficulty increases. Differences between RTDs may reflect increased attentional allocation for shorter RTDs. These findings suggest that situational listening demands can affect the cognitive resources engaged in language processing, which could significantly impact listener experiences across environments.


2017 ◽  
Vol 25 (3) ◽  
pp. 1225 ◽  
Author(s):  
Renê Forster ◽  
Letícia Maria Sicuro Corrêa

This paper investigates the possibility of an effect of contextual information during the processing of sentences containing subject relative clauses (SRCs) and object relative clauses (ORCs) in Brazilian Portuguese. The predictions from one-stage models and from syntax-oriented approaches to sentence processing are outlined. An eye-tracking experiment is reported in which SRCs and ORCs were presented preceded by narrative contexts that could favor either a subject or an object relative clause analysis. The results suggest that ORCs are harder to process than SRCs, no matter what discourse contexts they are inserted in. The contextual effect obtained here can be ascribed to pre-syntactic priming, i.e., a priming effect that arises during lexical access. The possibility of pre- and post-syntactic contextual effects in the processing of RCs is discussed.


2014 ◽  
Vol 4 (2) ◽  
pp. 167-191 ◽  
Author(s):  
A. Kate Miller

This study considers the role of lexical access in the activation and maintenance of referents interacting with syntactic computations during the online processing of wh-dependencies in second-language French by beginning (N = 39), low intermediate (N = 40), and high intermediate (N = 35) learners. Two computer-paced reading tasks involving concurrent picture classification were designed to investigate trace reactivation during sentence processing: The first task targeted sentences that contained indirect object relative clauses, whereas the second task involved indirect object cleft sentences. Response time profiles for sentences containing English-French cognates as antecedents were compared with those for sentences with noncognate vocabulary. All learner participants produced differing response patterns for cognate and noncognate items. Intermediate learners’ response patterns were consistent with trace reactivation for cognate items only; noncognate items induced inhibitions or erratic response patterns. Additionally, a (French-English bilingual) native speaker control group (N = 35) showed the predicted response pattern with the noncognate items only. These findings indicate that the role of lexical access in sentence processing merits further consideration.


Author(s):  
Susan A. Duffy ◽  
John M. Henderson ◽  
Robin K. Morris

1979 ◽  
Vol 7 (5) ◽  
pp. 346-353 ◽  
Author(s):  
Donald J. Foss ◽  
Randolph K. Cirilo ◽  
Michelle A. Blank

Author(s):  
Margreet Vogelzang ◽  
Christiane M. Thiel ◽  
Stephanie Rosemann ◽  
Jochem W. Rieger ◽  
Esther Ruigendijk

Purpose: Adults with mild-to-moderate age-related hearing loss typically exhibit issues with speech understanding, but their processing of syntactically complex sentences is not well understood. We test the hypothesis that the difficulties listeners with hearing loss have with the comprehension and processing of syntactically complex sentences are due to the processing of degraded input interfering with the successful processing of complex sentences.

Method: We performed a neuroimaging study with a sentence comprehension task, varying sentence complexity (through subject–object order and verb–argument order) and cognitive demands (presence or absence of a secondary task) within subjects. Groups of older subjects with hearing loss (n = 20) and age-matched normal-hearing controls (n = 20) were tested.

Results: The comprehension data show effects of syntactic complexity and hearing ability, with normal-hearing controls outperforming listeners with hearing loss, seemingly more so on syntactically complex sentences. The secondary task did not influence off-line comprehension. The imaging data show effects of group, sentence complexity, and task, with listeners with hearing loss showing decreased activation in typical speech processing areas, such as the inferior frontal gyrus and superior temporal gyrus. No interactions between group, sentence complexity, and task were found in the neuroimaging data.

Conclusions: The results suggest that listeners with hearing loss process speech differently from their normal-hearing peers, possibly due to the increased demands of processing degraded auditory input. Increased cognitive demands by means of a secondary visual shape processing task influenced neural sentence processing, but no evidence was found that they did so differently for listeners with hearing loss and normal-hearing listeners.

