Eye Movements in Judgements of Facial Expressions

Perception ◽  
1997 ◽  
Vol 26 (1_suppl) ◽  
pp. 297-297
Author(s):  
Y Osada ◽  
Y Nagasaka ◽  
R Yamazaki

We recorded eye movements by the method of corneal reflection while ten subjects viewed schematic faces drawn with lines. Each subject viewed different emotional faces: happy, angry, sad, disgusted, interested, frightened, and surprised. We measured the subjects' judgements in terms of percentage ‘correct’ and reaction time. Schematic faces were composed of the face outline contours and of the brow, eyes, nose, and mouth, which could all be modified to produce particular expressions. By masking parts of the face, we examined which features would have the greatest effects on judgements of emotion. Subjects always made a saccade to the eyes and fixated them, even when the eyes were not important for the judgement. They also made a saccade to the centre of the face and fixated it even when only the mouth was presented. The presentation of the brow alone decreased the correct rate for the ‘surprised’ expression but played an important role in the ‘sad’ judgement. The ‘angry’ judgement depended significantly on the brow and mouth. The eyes contributed greatly to the ‘disgusted’ judgement. These results suggest that the judgement of facial expressions of emotion can be strongly affected by each part of the schematic face. The concentration of saccades on the centre of the face suggests that the ‘configuration balance’ of the face is also likely to be important.

Perception ◽  
2021 ◽  
pp. 030100662110270
Author(s):  
Kennon M. Sheldon ◽  
Ryan Goffredi ◽  
Mike Corcoran

Facial expressions of emotion have important communicative functions. It is likely that mask-wearing during pandemics disrupts these functions, especially for expressions defined by activity in the lower half of the face. We tested this by asking participants to rate both Duchenne smiles (DSs; defined by the mouth and eyes) and non-Duchenne or “social” smiles (SSs; defined by the mouth alone), within masked and unmasked target faces. As hypothesized, masked SSs were rated much lower in “a pleasant social smile” and much higher in “a merely neutral expression,” compared with unmasked SSs. Essentially, masked SSs became nonsmiles. Masked DSs were still rated as very happy and pleasant, although significantly less so than unmasked DSs. Masked DSs and SSs were both rated as displaying more disgust than the unmasked versions.


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotions is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure, except that they were instructed to indicate whether (i.e., Yes/No response) the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with the additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Instead, the recognition of facial emotions relied on the integration of a complex set of facial cues.


2018 ◽  
Vol 71 (7) ◽  
pp. 1512-1525
Author(s):  
Fatima M Felisberti

This study investigated whether adults’ ability to attribute emotions to brief facial expressions (microexpressions) is associated with family-related environmental factors (FrFs) such as one’s number of siblings (Experiment 1), attachment style (Experiment 2), or perceived parental authority style (Experiment 3). Participants’ accuracy and reaction time (RT) in recognizing anger, contempt, disgust, fear, happiness, and sadness from facial microexpressions (exposure: 100 ms) were measured with a six-alternative forced-choice (6AFC) computerised method. Participants’ attachment style and the authority style of their parents were assessed using questionnaires. The findings revealed that up to 13% of the variance in participants’ responses could be explained by FrFs, with modest to moderate effect sizes. Microexpressions linked to signs of hostility or threat (i.e., contempt and fear) were decoded faster and/or more accurately by adults with few or no siblings or with a fearful attachment. Conversely, participants who recalled their fathers as authoritarian were worse at recognising contempt and fear than participants who perceived them as permissive or authoritative. The findings suggest that early FrFs may still be involved in the fine-tuning of responses to signs of contextual danger when the time for cognitive processing of facial expressions is severely restricted.


2021 ◽  
Author(s):  
Louisa Kulke ◽  
Lena Brümmer ◽  
Arezoo Pooresmaeili ◽  
Annekathrin Schacht

In everyday life, faces with emotional expressions quickly attract attention and eye-movements. To study the neural mechanisms of such emotion-driven attention by means of event-related brain potentials (ERPs), tasks that employ covert shifts of attention are commonly used, in which participants need to inhibit natural eye-movements towards stimuli. It remains, however, unclear how shifts of attention to emotional faces with and without eye-movements differ from each other. The current preregistered study aimed to investigate neural differences between covert and overt emotion-driven attention. We combined eye-tracking with measurements of ERPs to compare shifts of attention to faces with happy, angry, or neutral expressions when eye-movements were either executed (Go conditions) or withheld (No-go conditions). Happy and angry faces led to larger EPN amplitudes, shorter latencies of the P1 component, and faster saccades, suggesting that emotional expressions significantly affected shifts of attention. Several ERP components (N170, EPN, LPC) were augmented in amplitude when attention was shifted with an eye-movement, indicating enhanced neural processing of faces when eye-movements had to be executed together with a reallocation of attention. However, the modulation of ERPs by facial expressions did not differ between the Go and No-go conditions, suggesting that emotional content enhances both covert and overt shifts of attention. In summary, our results indicate that overt and covert attention shifts differ but are comparably affected by emotional content.


Author(s):  
Maida Koso-Drljević ◽  
Meri Miličević

The aim of the study was to test two assumptions about the lateralization of the processing of emotional facial expressions, namely the assumption of right-hemisphere dominance and the valence assumption, and to examine the influence of the gender of the presented stimulus (chimera) and of depression as an emotional state of the participants. The sample consisted of 83 female students, with an average age of 20 years. Participants completed the Task of Recognizing Emotional Facial Expressions on a computer and then the DASS-21 Depression subscale. The results of the study partially confirmed the valence assumption for the dependent variable of response accuracy. Participants recognized sadness more accurately than happiness when it was presented on the left side of the face, which is consistent with the valence hypothesis, according to which the right hemisphere is responsible for recognizing negative emotions. However, for the right side of the face, participants recognized sadness and happiness equally accurately, which is not consistent with the valence hypothesis. The main effect of the gender of the chimera was statistically significant for response accuracy: recognition accuracy was higher for male chimeras than for female ones. For the dependent variable of reaction time, a statistically significant negative correlation was obtained between the sides of the face (left and right) and the score on the depression subscale: the higher the score on the depression subscale, the slower (longer) the reaction time to the presented chimera, on both the left and the right.


2016 ◽  
Vol 33 (S1) ◽  
pp. S370-S371
Author(s):  
M. Rocha ◽  
S. Soares ◽  
S. Silva ◽  
N. Madeira ◽  
C. Silva

Introduction: Alexithymia is a multifactorial personality trait observed in several mental disorders, especially those with poor social functioning. Although it has been proposed that difficulties in interpersonal interactions in highly alexithymic individuals may stem from their reduced ability to express and recognize facial expressions, this remains controversial.
Aim: In everyday life, faces displaying emotions are dynamic, although most studies have relied on static stimuli. The aim of this study was to investigate whether individuals with high levels of alexithymia differed from a control group in the categorization of emotional faces presented dynamically. Given the highly dynamic nature of facial displays in real life, we used morphed videos depicting faces changing in 1% steps from neutral to angry, disgusted, or happy faces, with a video presentation of 35 seconds.
Method: Sixty participants (27 males and 33 females) were divided into high-alexithymia (HA) and low-alexithymia (LA) groups using the Toronto Alexithymia Scale (TAS-20). Participants were instructed to watch the face change from neutral to an emotion and to press a key as soon as they could categorize the emotion expressed in the face.
Results: The results revealed an interaction between alexithymia and emotion, showing that HA participants, compared to LA participants, were less accurate at categorizing angry faces.
Disclosure of interest: The authors have not supplied their declaration of competing interest.


2018 ◽  
Vol 122 (4) ◽  
pp. 1432-1448 ◽  
Author(s):  
Charlott Maria Bodenschatz ◽  
Anette Kersting ◽  
Thomas Suslow

Orientation of gaze toward specific regions of the face, such as the eyes or the mouth, helps to correctly identify the underlying emotion. The present eye-tracking study investigates whether facial features diagnostic of specific emotional facial expressions are processed preferentially, even when presented outside of subjective awareness. Eye movements of 73 healthy individuals were recorded while they completed an affective priming task. Primes (pictures of happy, neutral, sad, angry, and fearful facial expressions) were presented for 50 ms with forward and backward masking. Participants had to evaluate subsequently presented neutral faces. Results of an awareness check indicated that participants were subjectively unaware of the emotional primes. No affective priming effects were observed, but briefly presented emotional facial expressions elicited early eye movements toward diagnostic regions of the face. Participants oriented their gaze more rapidly to the eye region of the neutral mask after a fearful facial expression. After a happy facial expression, participants oriented their gaze more rapidly to the mouth region of the neutral mask. Moreover, participants dwelled longest on the eye region after a fearful facial expression, and the dwell time on the mouth region was longest for happy facial expressions. Our findings support the idea that briefly presented fearful and happy facial expressions trigger an automatic mechanism that is sensitive to the distribution of relevant facial features and facilitates the orientation of gaze toward them.


1998 ◽  
Vol 9 (4) ◽  
pp. 270-276 ◽  
Author(s):  
Kari Edwards

Results of studies reported here indicate that humans are attuned to temporal cues in facial expressions of emotion. The experimental task required subjects to reproduce the actual progression of a target person's spontaneous expression (i.e., onset to offset) from a scrambled set of photographs. Each photograph depicted a segment of the expression that corresponded to approximately 67 ms in real time. Results of two experiments indicated that (a) individuals could detect extremely subtle dynamic cues in a facial expression and could utilize these cues to reproduce the proper temporal progression of the display at above-chance levels of accuracy; (b) women performed significantly better than men on the task designed to assess this ability; (c) individuals were most sensitive to the temporal characteristics of the early stages of an expression; and (d) accuracy was inversely related to the amount of time allotted for the task. The latter finding may reflect the relative involvement of (error-prone) cognitively mediated or strategic processes in what is normally a relatively automatic, nonconscious process.


2004 ◽  
Vol 15 (1-2) ◽  
pp. 23-34 ◽  
Author(s):  
Manas K. Mandal ◽  
Nalini Ambady

Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.


2017 ◽  
Vol 41 (S1) ◽  
pp. S109-S110 ◽  
Author(s):  
F.L. Osório ◽  
A.D. Sabino ◽  
C.M. Camargo

Introduction: Proper recognition of facial expressions of emotion is crucial for human social relationships. Impairments in the capacity to process facial information may play an important role in the etiology and maintenance of certain mental disorders, especially music performance anxiety (MPA).
Objective: To assess the recognition of facial expressions of emotion in musicians compared to a group of subjects from the general population, also considering the presence or absence of MPA.
Methods: One hundred and fifty amateur and/or professional musicians who regularly take part in public performances (GM) and 150 subjects from the general population (GP) completed a facial emotion recognition task and were assessed in terms of accuracy and reaction time. The group of musicians was subdivided into subjects with and without MPA indicators. Data were analyzed using Student's t test (P < 0.05) in the Statistical Package for the Social Sciences.
Results: GM were less accurate and had longer reaction times in the recognition of facial happiness (P < 0.001; effect size: 0.25–0.44) compared to GP. Musicians with MPA had still lower accuracy in the recognition of happiness, as well as longer reaction times for emotions as a whole (P < 0.04; effect size: 0.32–0.40), compared to musicians without MPA.
Conclusion: The poorer performance of musicians in the recognition of happiness suggests difficulty recognizing indicators of social approval, which may negatively affect performance through increased anxiety and negative thoughts that can favor the onset of MPA.
Disclosure of interest: The authors have not supplied their declaration of competing interest.
