Eye Movements and Behavioural Responses to Gaze‐Contingent Expressive Faces in Typically Developing Infants and Infant Siblings

2020 ◽  
Author(s):  
Jolie R. Keemink ◽  
Lauren Jenner ◽  
Jonathan E. Prunty ◽  
Nicky Wood ◽  
David J. Kelly
2019 ◽  
Author(s):  
Mahsa Barzy ◽  
Heather Jane Ferguson ◽  
David Williams

Social-communication is profoundly impaired among autistic individuals. Difficulties representing others’ mental states have been linked to modulations of gaze and speech, which have also been shown to be impaired in autism. Despite these observed impairments in ‘real-world’ communicative settings, research has mostly focused on lab-based experiments, where the language is highly structured. In a pre-registered experiment, we recorded eye movements and verbal responses while adults (N = 50) engaged in a real-life conversation. The conversation topic related either to the self, a familiar other, or an unfamiliar other (e.g. "Tell me who is your/your mother’s/Marina’s favourite celebrity and why?"). Results replicated previous work, showing reduced attention to socially relevant information among autistic participants (i.e. less time looking at the experimenter’s face, and more time looking around the background), compared to typically developing controls. Importantly, perspective modulated social attention in both groups; talking about an unfamiliar other reduced attention to potentially distracting or resource-demanding social information, and increased looks to the non-social background. Social attention did not differ between self and familiar-other contexts, reflecting greater shared knowledge for familiar/similar others. Autistic participants spent more time looking at the background when talking about an unfamiliar other vs. themselves. Future research should investigate the cognitive mechanisms underlying this effect.


Author(s):  
Dzmitry A. Kaliukhovich ◽  
Nikolay V. Manyakov ◽  
Abigail Bangerter ◽  
Seth Ness ◽  
Andrew Skalkin ◽  
...  

Abstract Participants with autism spectrum disorder (ASD) (n = 121, mean [SD] age: 14.6 [8.0] years) and typically developing (TD) controls (n = 40, 16.4 [13.3] years) were presented with a series of videos representing biological motion on one side of a computer monitor screen and non-biological motion on the other, while their eye movements were recorded. As predicted, participants with ASD spent less overall time looking at presented stimuli than TD participants (P < 10⁻³) and showed less preference for biological motion (P < 10⁻⁵). Participants with ASD also had greater average latencies than TD participants of the first fixation on both biological (P < 0.01) and non-biological motion (P < 0.02). Findings suggest that individuals with ASD differ from TD individuals on multiple properties of eye movements and biological motion preference.


2019 ◽  
Vol 26 (6) ◽  
pp. 557-566 ◽  
Author(s):  
Nasrin Mohammadhasani ◽  
Tindara Caprì ◽  
Andrea Nucita ◽  
Giancarlo Iannizzotto ◽  
Rosa Angela Fabio

Abstract Objective: Several studies agree on the link between attention and eye movements during reading, and it is well established that attention and working memory (WM) interact. A question that could be addressed to better understand these relationships is: to what extent can an attention deficit affect eye movements and, consequently, remembering a word? The main aims of the present study were (1) to compare visual patterns between children with Attention Deficit Hyperactivity Disorder (ADHD) and typically developing (TD) children during a visual task on word stimuli; (2) to examine WM accuracy for the word stimuli; and (3) to compare the dynamics of the visual scan path in both groups. Method: A total of 49 children with ADHD, age and sex matched with 32 TD children, were recruited. We used eye-tracking technology in which the Word Memory Test was implemented. To characterise the scan path of participants, two measures were used: the ordered direction of reading and the entropy index. Results: The ADHD group showed poorer WM than the TD group. They did not follow a typical scan path across the words compared with TD children; instead, their visual scanning was discontinuous, uncoordinated, and chaotic. The ADHD group also showed a higher entropy index across the four categories of saccades than the TD group. Conclusions: The findings are discussed along two lines: the relationship between an atypical visual scan path and WM, and the training implications of redirecting the dynamics of the visual scan path in ADHD to improve WM.


2019 ◽  
Vol 50 (2) ◽  
pp. 500-512
Author(s):  
Li Zhang ◽  
Guoli Yan ◽  
Li Zhou ◽  
Zebo Lan ◽  
Valerie Benson

Abstract The current study examined eye movement control in autistic (ASD) children. Simple targets were presented in isolation, or synchronously with central, parafoveal, or peripheral distractors. Sixteen children with ASD (47–81 months) and nineteen age- and IQ-matched typically developing children were instructed to look to the target as accurately and quickly as possible. Both groups showed high proportions (40%) of saccadic errors towards parafoveal and peripheral distractors. For correctly executed eye movements to the targets, centrally presented distractors produced the longest latencies (time taken to initiate eye movements), followed by the parafoveal and peripheral distractor conditions. Central distractors had a greater effect in the ASD group, providing evidence for potentially atypical voluntary attentional control in ASD children.


2009 ◽  
Vol 6 (3) ◽  
pp. 375-378 ◽  
Author(s):  
Terje Falck-Ytter

Does a dysfunction in the mirror neuron system (MNS) underlie the social symptoms defining autism spectrum disorder (ASD)? Research suggests that the MNS matches observed actions to motor plans for similar actions, and that these motor plans include directions for predictive eye movements when observing goal-directed actions. Thus, one important question is whether children with ASD use predictive eye movements in action observation. Young children with ASD as well as typically developing children and adults were shown videos in which an actor performed object-directed actions (human agent condition). Children with ASD were also shown control videos showing objects moving by themselves (self-propelled condition). Gaze was measured using a corneal reflection technique. Children with ASD and typically developing individuals used strikingly similar goal-directed eye movements when observing others’ actions in the human agent condition. Gaze was reactive in the self-propelled condition, suggesting that prediction is linked to seeing a hand–object interaction. This study does not support the view that ASD is characterized by a global dysfunction in the MNS.


2013 ◽  
Vol 56 (2) ◽  
pp. 567-576 ◽  
Author(s):  
Danielle Droucker ◽  
Suzanne Curtin ◽  
Athena Vouloumanos

Purpose In this study, the authors aimed to examine whether biases for infant-directed (ID) speech and faces differ between infant siblings of children with autism spectrum disorder (ASD) (SIBS-A) and infant siblings of typically developing children (SIBS-TD), and whether speech and face biases predict language outcomes and risk group membership. Method Thirty-six infants were tested at ages 6, 8, 12, and 18 months. Infants heard 2 ID and 2 adult-directed (AD) speech passages paired with either a checkerboard or a face. The authors assessed expressive language at 12 and 18 months and general functioning at 12 months using the Mullen Scales of Early Learning (Mullen, 1995). Results Both infant groups preferred ID to AD speech and preferred faces to checkerboards. SIBS-TD demonstrated higher expressive language at 18 months than did SIBS-A, a finding that correlated with preferences for ID speech at 12 months. Although both groups looked longer to face stimuli than to the checkerboard, the magnitude of the preference was smaller in SIBS-A and predicted expressive vocabulary at 18 months in this group. Infants' preference for faces contributed to risk-group membership in a logistic regression analysis. Conclusion Infants at heightened risk of ASD differ from typically developing infants in their preferences for ID speech and faces, which may underlie deficits in later language development and social communication.


2013 ◽  
Vol 2013 ◽  
pp. 1-5 ◽  
Author(s):  
Maria Pia Bucci ◽  
Catherine Doyen ◽  
Yves Contenjean ◽  
Kelley Kaye

The aim of the study was to explore the effect of eye movements (saccades and pursuits) on postural stability in children with autism versus typically developing children of comparable age. Postural stability was recorded with a platform (Techno Concept) in seven children with autism (mean age: 6 ± 0.8 years) while fixating a target or making saccade or pursuit eye movements. Data were compared to those of seven age-matched typically developing children. Surface area and mean speed of the center of pressure (CoP) were measured. Autistic children (AC) were less stable than typically developing (TD) children, in both simple and dual-task conditions. Performing a dual task thus affects AC and TD children in different ways. AC stability is not improved during saccades or pursuit eye movements in the dual-task condition; in contrast, saccades significantly improve postural stability in TD children. The postural instability observed in AC during both simple and dual tasks supports the hypothesis that such children have deficits in cerebellar functions.


Autism ◽  
2020 ◽  
Vol 24 (8) ◽  
pp. 2153-2165
Author(s):  
Mahsa Barzy ◽  
Heather J Ferguson ◽  
David M Williams

Social-communication is profoundly impaired among autistic individuals. Difficulties representing others’ mental states have been linked to modulations of gaze and speech, which have also been shown to be impaired in autism. Despite these observed impairments in ‘real-world’ communicative settings, research has mostly focused on lab-based experiments, where the language is highly structured. In a pre-registered experiment, we recorded eye movements and verbal responses while adults ( N = 50) engaged in a real-life conversation. Using a novel approach, we also manipulated the perspective that participants adopted by asking them questions that were related to the self, a familiar other, or an unfamiliar other. Results replicated previous work, showing reduced attention to socially relevant information among autistic participants (i.e. less time looking at the experimenter’s face and more time looking around the background), compared to typically developing controls. Importantly, perspective modulated social attention in both groups; talking about an unfamiliar other reduced attention to potentially distracting or resource-demanding social information and increased looks to non-social background. Social attention did not differ between self and familiar other contexts, reflecting greater shared knowledge for familiar/similar others. Autistic participants spent more time looking at the background when talking about an unfamiliar other versus themselves. Future research should investigate the developmental trajectory of this effect and the cognitive mechanisms underlying it. Lay abstract Previous lab-based studies suggest that autistic individuals are less attentive to social aspects of their environment. In our study, we recorded the eye movements of autistic and typically developing adults while they engaged in a real-life social interaction with a partner. 
Results showed that autistic adults were less likely than typically developing adults to look at the experimenter’s face, and instead were more likely to look at the background. Moreover, the perspective that was adopted in the conversation (talking about self versus others) modulated the patterns of eye movements in autistic and non-autistic adults. Overall, people spent less time looking at their conversation partner’s eyes and face, and more time looking at the background, when talking about an unfamiliar other compared to when talking about themselves. This pattern was magnified among autistic adults. We conclude that allocating attention to social information during conversation is cognitively effortful, but this can be mitigated when the conversation topic is familiar.


2019 ◽  
Vol 72 (8) ◽  
pp. 1913-1925 ◽  
Author(s):  
Hassan Mansour ◽  
Gustav Kuhn

Experimental psychologists frequently present participants with social stimuli (videos or pictures) and measure behavioural responses. Such designs are problematic in that they remove the potential for social interaction and inadvertently restrict our eyes’ multifaceted nature as a tool both to perceive and to communicate with others. The aim of this study was to develop a new paradigm within which we can easily and reliably measure the influence of top-down processes (belief), social activity (talking and listening), and possible clinical traits (gaze anxiety and social interaction difficulties) on gaze behaviours. Participants were engaged in a “real” or pre-recorded Skype conversation. Findings suggest that participants who believed they were engaging in a real conversation spent less time looking at the speaker’s eyes, but no differences were found for dwell time on the whole face. Within our non-clinical sample, higher levels of gaze anxiety resulted in reduced dwell time on the whole face but not the eyes, whereas social interaction difficulties produced reduced dwell time on the eyes only. Finally, talking consistently produced reduced dwell time on both the whole face and the eyes, regardless of any other conditions.

