Attentional orienting to social and nonsocial cues in early deaf adults.

2015 ◽  
Vol 41 (6) ◽  
pp. 1758-1771 ◽  
Author(s):  
Benedetta Heimler ◽  
Wieske van Zoest ◽  
Francesca Baruffaldi ◽  
Pasquale Rinaldi ◽  
Maria Cristina Caselli ◽  
...  
2008 ◽  
Vol 20 (5) ◽  
pp. 879-891 ◽  
Author(s):  
Christine M. Tipper ◽  
Todd C. Handy ◽  
Barry Giesbrecht ◽  
Alan Kingstone

This study examines whether orienting attention to biologically based social cues engages neural mechanisms distinct from those engaged by orienting to nonbiologically based nonsocial cues. Participants viewed a perceptually ambiguous stimulus presented centrally while performing a target detection task. By having participants alternate between viewing this stimulus as an eye in profile or an arrowhead, we were able to directly compare the neural mechanisms of attentional orienting to social and nonsocial cues while holding the physical stimulus constant. The functional magnetic resonance imaging results indicated that attentional orienting to both eye gaze and arrow cues engaged extensive dorsal and ventral fronto-parietal networks. Eye gaze cues, however, more vigorously engaged two regions in the ventral frontal cortex associated with attentional reorienting to salient or meaningful stimuli, as well as lateral occipital regions. An event-related potential study demonstrated that this enhanced occipital response was attributable to a higher-amplitude sensory gain effect for targets appearing at locations cued by eye gaze than for those cued by an arrowhead. These results endorse the hypothesis that differences in attention to social and nonsocial cues are quantitative rather than qualitative, running counter to current models that assume enhanced processing for social stimuli reflects the involvement of a unique network of brain regions. An intriguing implication of the present study is the possibility that our ability to orient volitionally and reflexively to socially irrelevant stimuli, including arrowheads, may have arisen as a useful by-product of a system that developed first, and foremost, to promote social orienting to stimuli that are biologically relevant.


2019 ◽  
Vol 45 (Supplement_2) ◽  
pp. S249-S250
Author(s):  
Lauren Catalano ◽  
Warren Szewczyk ◽  
Sydney Ghazarian ◽  
James Lopez ◽  
Michael Green ◽  
...  

2020 ◽  
Vol 63 (7) ◽  
pp. 2245-2254 ◽  
Author(s):  
Jianrong Wang ◽  
Yumeng Zhu ◽  
Yu Chen ◽  
Abdilbar Mamat ◽  
Mei Yu ◽  
...  

Purpose The primary purpose of this study was to explore the audiovisual speech perception strategies adopted by normal-hearing and deaf people in processing familiar and unfamiliar languages. Our primary hypothesis was that they would adopt different perception strategies due to different sensory experiences at an early age, limitations of the physical device, and the developmental gap of language, among other factors. Method Thirty normal-hearing adults and 33 prelingually deaf adults participated in the study. They were asked to perform judgment and listening tasks while watching videos of a Uygur–Mandarin bilingual speaker in a familiar language (Standard Chinese) or an unfamiliar language (Modern Uygur) while their eye movements were recorded by eye-tracking technology. Results Task had a slight influence on the distribution of selective attention, whereas subject and language had significant influences. Specifically, the normal-hearing and the deaf participants mainly gazed at the speaker's eyes and mouth, respectively; moreover, while the normal-hearing participants stared longer at the speaker's mouth when confronted with the unfamiliar language Modern Uygur, the deaf participants did not change their attention allocation pattern when perceiving the two languages. Conclusions Normal-hearing and deaf adults adopt different audiovisual speech perception strategies: Normal-hearing adults mainly look at the eyes, and deaf adults mainly look at the mouth. Additionally, language and task can also modulate the speech perception strategy.


2017 ◽  
Vol 76 (2) ◽  
pp. 71-79 ◽  
Author(s):  
Hélène Maire ◽  
Renaud Brochard ◽  
Jean-Luc Kop ◽  
Vivien Dioux ◽  
Daniel Zagar

Abstract. This study measured the effect of emotional states on lexical decision task performance and investigated which underlying components (physiological, attentional orienting, executive, lexical, and/or strategic) are affected. We did this by assessing participants’ performance on a lexical decision task, which they completed before and after an emotional state induction task. The sequence effect, usually produced when participants repeat a task, was significantly smaller in participants who had received one of the three emotion inductions (happiness, sadness, embarrassment) than in control group participants (neutral induction). Using the diffusion model (Ratcliff, 1978) to resolve the data into meaningful parameters that correspond to specific psychological components, we found that emotion induction only modulated the parameter reflecting the physiological and/or attentional orienting components, whereas the executive, lexical, and strategic components were not altered. These results suggest that emotional states have an impact on the low-level mechanisms underlying mental chronometric tasks.
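For readers unfamiliar with the diffusion model cited above (Ratcliff, 1978), the following is a minimal illustrative sketch of the standard drift-diffusion equations; the symbols used are the conventional ones from the diffusion-model literature, not taken from this abstract:

```latex
% Drift-diffusion model (Ratcliff, 1978), illustrative sketch.
% Noisy evidence X(t) accumulates from starting point z toward
% two decision boundaries at 0 and a:
dX(t) = v\,dt + s\,dW(t), \qquad X(0) = z, \quad 0 < z < a.
% A response is produced when X(t) first reaches a (e.g., "word")
% or 0 (e.g., "nonword"). The observed response time is
\mathrm{RT} = t_{\mathrm{decision}} + T_{er},
% where v is the drift rate (evidence quality), a the boundary
% separation (response caution), s the diffusion noise, and
% T_{er} the non-decision time (stimulus encoding plus motor output).
```

Fitting such parameters to response-time distributions is what lets studies like the one above attribute an effect to, for example, non-decision (physiological/orienting) components rather than to decision-stage components.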


2008 ◽  
Author(s):  
Kaitlin Laidlaw ◽  
Sara Stevens ◽  
Jim McAuliffe ◽  
Jay Pratt

1994 ◽  
Author(s):  
Marcia Grabowecky ◽  
Lynn C. Robertson ◽  
Anne Treisman

2013 ◽  
Author(s):  
Marcus N. Morrisey ◽  
M. D. Rutherford ◽  
Catherine L. Reed ◽  
Daniel N. McIntosh
