Associations between Cognitive Concepts of Self and Emotional Facial Expressions with an Emphasis on Emotion Awareness

Psych ◽  
2021 ◽  
Vol 3 (2) ◽  
pp. 48-60
Author(s):  
Peter Walla ◽  
Aimee Mavratzakis

Recognising our own and others’ emotions is vital for healthy social development. The aim of the current study was to determine how emotions related to the self or to another influence behavioural expressions of emotion. Facial electromyography (EMG) was used to record spontaneous facial muscle activity in nineteen participants while they passively viewed negative, positive and neutral emotional pictures during three blocks of referential instructions. Each participant imagined themself, another person or no one experiencing the emotional scenario, with the priming words “You”, “Him” or “None” presented before each picture for the respective block of instructions. Emotion awareness (EA) was also recorded using the TAS-20 alexithymia questionnaire. Corrugator supercilii (cs) muscle activity increased significantly between 500 and 1000 ms post stimulus onset during negative and neutral picture presentations, regardless of ownership. Independent of emotion, cs activity was greatest during the “no one” task and lowest during the “self” task from less than 250 to 1000 ms. Interestingly, the degree of cs activation during referential tasks was further modulated by EA. Low EA corresponded to significantly stronger cs activity overall compared with high EA, and this effect was even more pronounced during the “no one” task. The findings suggest that cognitive processes related to the perception of emotion ownership can influence spontaneous facial muscle activity, but that a greater degree of integration between higher cognitive and lower affective levels of information may interrupt or suppress these behavioural expressions of emotion.
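The windowed EMG analysis described above can be sketched as follows. This is a minimal illustration on synthetic data: the sampling rate, baseline window, and rectification step are assumptions, since the abstract does not describe the study's preprocessing.

```python
import numpy as np

FS = 1000  # assumed sampling rate in Hz (not stated in the abstract)

def window_mean(emg, onset_s, start_ms, end_ms, fs=FS):
    """Mean rectified EMG in a post-stimulus window (bounds in ms after
    onset), corrected against a 100 ms pre-stimulus baseline."""
    onset = int(onset_s * fs)
    baseline = np.abs(emg[onset - int(0.1 * fs):onset]).mean()
    lo = onset + int(start_ms / 1000 * fs)
    hi = onset + int(end_ms / 1000 * fs)
    return np.abs(emg[lo:hi]).mean() - baseline

# Toy signal: flat noise with a corrugator-like burst 500-1000 ms after
# a stimulus "onset" at 2.0 s.
rng = np.random.default_rng(0)
sig = rng.normal(0, 0.05, 4000)
sig[2500:3000] += 0.5
print(window_mean(sig, 2.0, 500, 1000) > window_mean(sig, 2.0, 0, 500))
```

In the study, per-condition means from windows like these would then be compared across the "You"/"Him"/"None" referential blocks.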

2017 ◽  
Author(s):  
Peter Robert Cannon ◽  
Bei Li ◽  
John M. Grigor

Hedonic responses to foods are often measured using subjective liking rating scales. This is problematic because food behaviours are complex and single measurement points taken after tasting cannot capture an individual’s dynamic affective state. To address this limitation, techniques have been developed to sample subjective affective responses during oral processing, such as temporal dominance of emotion. These methods are also limited because they interrupt natural behaviours associated with food and oral processing. The present research investigates the potential use of electromyography as a means to predict subjective liking ratings using affective facial muscle activity recorded at different phases of oral processing while tasting liquids. Using linear mixed models, muscle activity recorded while emptying into the mouth, swirling, and thinking about the taste of bitter and sweet liquid solutions was used to predict subjective liking ratings. These mixed models demonstrate that, during different phases of the tasting, zygomaticus major activity predicted increased liking, whereas corrugator supercilii and levator labii superioris activity predicted decreased liking. The change in liking ratings predicted by each muscle varied depending on whether participants were emptying, swirling, or thinking about the taste. We conclude that facial muscle activity is a valuable measure of affective responses during dynamic food behaviours.
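The regression logic above can be illustrated with a toy fixed-effects version. The study fitted linear mixed models with participant random effects; this numpy sketch omits those and uses synthetic data, so the coefficients and variable names are illustrative only.

```python
import numpy as np

# Synthetic stand-ins for muscle activity per trial: zygomaticus (zm),
# corrugator (cs), levator labii (ll), and a liking rating built so that
# zm raises liking while cs and ll lower it, as in the abstract.
rng = np.random.default_rng(1)
n = 200
zm = rng.normal(size=n)
cs = rng.normal(size=n)
ll = rng.normal(size=n)
liking = 2.0 * zm - 1.5 * cs - 1.0 * ll + rng.normal(0, 0.1, n)

# Ordinary least squares in place of the paper's mixed models.
X = np.column_stack([np.ones(n), zm, cs, ll])
beta, *_ = np.linalg.lstsq(X, liking, rcond=None)
print(np.round(beta[1:], 1))
```

In a full analysis, separate (or interacted) models per tasting phase would capture the phase-dependent changes the abstract reports.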


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Chun-Ting Hsu ◽  
Wataru Sato ◽  
Sakiko Yoshikawa

Abstract Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry. However, such highly controlled experimental procedures may lack the vividness of real-life social interactions. This study incorporated a live image relay system that delivered models’ real-time performance of positive (smiling) and negative (frowning) dynamic facial expressions or their prerecorded videos to participants. We measured subjective ratings of valence and arousal and facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles. Subjective ratings showed that live facial expressions were rated as higher in valence and more arousing than the corresponding videos in the positive emotion conditions. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models’ positive facial expressions. The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.


2014 ◽  
pp. 7-23
Author(s):  
Michela Balconi ◽  
Giovanni Lecci ◽  
Verdiana Trapletti

The present paper explored the relationship between emotional facial response and electromyographic modulation in children when they observe facial expressions of emotions. Facial responsiveness (evaluated by arousal and valence ratings) and psychophysiological correlates (facial electromyography, EMG) were analyzed while children looked at six facial expressions of emotions (happiness, anger, fear, sadness, surprise and disgust). For the EMG measure, corrugator and zygomaticus muscle activity was monitored in response to the different emotion types. ANOVAs showed differences in both EMG and facial response across subjects as a function of the different emotions. Specifically, some emotions (such as happiness, anger and fear) were well expressed by all subjects in terms of high arousal, whereas others (such as sadness) elicited lower arousal. Zygomaticus activity increased mainly for happiness, whereas corrugator activity increased mainly for anger, fear and surprise. More generally, EMG and facial behavior were highly correlated with each other, showing a "mirror" effect with respect to the observed faces.


2018 ◽  
Author(s):  
Louisa Kulke ◽  
Dennis Feyerabend ◽  
Annekathrin Schacht

Human faces express emotions, informing others about their affective states. In order to measure expressions of emotion, facial electromyography (EMG) has been widely used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from video recordings of human faces. However, its validity and comparability to EMG measures are unclear. The aim of the current study was to compare the Affectiva Affdex emotion recognition software by iMotions with EMG measurements of the zygomaticus major and corrugator supercilii muscles, concerning its ability to identify happy, angry and neutral faces. Twenty participants imitated these facial expressions while videos and EMG were recorded. Happy and angry expressions were detected by both the software and by EMG above chance, while neutral expressions were more often falsely identified as negative by EMG compared with the software. Overall, EMG and software values correlated highly. In conclusion, Affectiva Affdex software can identify emotions and its results are comparable to EMG findings.
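The "values correlated highly" comparison boils down to correlating frame-wise software scores with EMG amplitude. A toy sketch with synthetic series (the study's actual Affdex output format and EMG preprocessing are not given in the abstract):

```python
import numpy as np

# Synthetic stand-ins: rectified EMG amplitude rising over a recording,
# and a software score that tracks it noisily.
rng = np.random.default_rng(3)
emg = np.abs(rng.normal(0, 1, 300)) + np.linspace(0, 2, 300)
software = 0.8 * emg + rng.normal(0, 0.3, 300)

# Pearson correlation as the agreement measure.
r = np.corrcoef(emg, software)[0, 1]
print(r > 0.7)
```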


Author(s):  
Xia Fang ◽  
Disa Sauter ◽  
Marc Heerdink ◽  
Gerben van Kleef

There is a growing consensus that culture influences the perception of facial expressions of emotion. However, little is known about whether and how culture shapes the production of emotional facial expressions, and even less so about whether culture differentially shapes the production of posed versus spontaneous expressions. Drawing on prior work on cultural differences in emotional communication, we tested the prediction that people from the Netherlands (a historically heterogeneous culture where people are prone to low-context communication) produce facial expressions that are more distinct across emotions compared to people from China (a historically homogeneous culture where people are prone to high-context communication). Furthermore, we examined whether the degree of distinctiveness varies across posed and spontaneous expressions. Dutch and Chinese participants were instructed to either pose facial expressions of anger and disgust, or to share autobiographical events that elicited spontaneous expressions of anger or disgust. Using the complementary approaches of supervised machine learning and information-theoretic analysis of facial muscle movements, we show that posed and spontaneous facial expressions of anger and disgust were more distinct when produced by Dutch compared to Chinese participants. These findings shed new light on the role of culture in emotional communication by demonstrating, for the first time, effects on the distinctiveness of production of facial expressions.
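The distinctiveness logic above (more separable anger/disgust patterns yield higher classifier performance) can be sketched with a leave-one-out nearest-centroid classifier. The data here are synthetic stand-ins, not the study's facial muscle movements, and nearest-centroid is a stand-in for whatever supervised learner the authors used.

```python
import numpy as np

rng = np.random.default_rng(2)

def accuracy(sep):
    """Leave-one-out nearest-centroid accuracy for two emotion classes
    whose mean feature patterns lie `sep` apart per dimension."""
    X = np.vstack([rng.normal(0, 1, (40, 5)), rng.normal(sep, 1, (40, 5))])
    y = np.array([0] * 40 + [1] * 40)
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        hits += (np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0)) == y[i]
    return hits / len(y)

# More distinct expression patterns -> higher classification accuracy.
print(accuracy(2.0) > accuracy(0.1))
```

On the study's framing, the Dutch group would correspond to the higher-separation case and the Chinese group to the lower-separation case.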


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Alexandre C. Fernandes ◽  
Teresa Garcia-Marques

Abstract Time perception relies on the motor system. It involves core brain regions of this system, including those associated with feelings generated from sensorimotor states. Perceptual timing is also distorted when movement occurs during timing tasks, possibly by interfering with sensorimotor afferent feedback. However, it is unknown whether the perception of time is an active process associated with specific patterns of muscle activity. We explored this idea based on the phenomenon of electromyographic gradients, which consists of the dynamic increase of muscle activity during cognitive tasks that require sustained attention, a critical function in perceptual timing. We aimed to determine whether dynamic facial muscle activity indexes the subjective representation of time. We asked participants to judge stimulus durations (varying in familiarity) while we monitored the time course of the activity of the zygomaticus-major and corrugator-supercilii muscles, both associated with cognitive and affective feelings. The dynamic electromyographic activity in corrugator-supercilii over time reflected objective time, and this relationship predicted subjective judgments of duration. Furthermore, the zygomaticus-major muscle signaled the bias that familiarity introduces in duration judgments. This suggests that subjective duration could be an embodiment process based on motor information changing over time and its associated feelings.
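The electromyographic-gradient idea reduces to muscle activity drifting upward over an interval, so that its level carries information about elapsed time. A minimal sketch on synthetic data (slope and noise levels are illustrative only):

```python
import numpy as np

# Synthetic corrugator trace over a 5 s stimulus: a rising gradient
# plus noise, fitted with a first-order polynomial.
rng = np.random.default_rng(4)
t = np.linspace(0, 5, 500)
cs = 0.3 * t + rng.normal(0, 0.2, t.size)
slope, intercept = np.polyfit(t, cs, 1)
print(slope > 0)  # a positive gradient tracks objective duration
```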


2013 ◽  
Vol 16 ◽  
Author(s):  
Luis Aguado ◽  
Francisco J. Román ◽  
Sonia Rodríguez ◽  
Teresa Diéguez-Risco ◽  
Verónica Romero-Ferreiro ◽  
...  

Abstract The possibility that facial expressions of emotion change the affective valence of faces through associative learning was explored using facial electromyography (EMG). In Experiment 1, EMG activity was registered while the participants (N = 57) viewed sequences of neutral faces (Stimulus 1 or S1) changing to either a happy or an angry expression (Stimulus 2 or S2). As a consequence of learning, participants who showed patterning of facial responses in the presence of angry and happy faces, that is, higher Corrugator Supercilii (CS) activity in the presence of angry faces and higher Zygomaticus Major (ZM) activity in the presence of happy faces, also showed a similar pattern when viewing the corresponding S1 faces. Explicit evaluations made by an independent sample of participants (Experiment 2) showed that evaluation of S1 faces was changed according to the emotional expression with which they had been associated. These results are consistent with an interpretation of rapid facial reactions to faces as affective responses that reflect the valence of the stimulus and that are sensitive to learned changes in the affective meaning of faces.


2009 ◽  
Vol 26 (5) ◽  
pp. 475-488 ◽  
Author(s):  
Steven R. Livingstone ◽  
William Forde Thompson ◽  
Frank A. Russo

Facial expressions are used in music performance to communicate structural and emotional intentions. Exposure to emotional facial expressions also may lead to subtle facial movements that mirror those expressions. Seven participants were recorded with motion capture as they watched and imitated phrases of emotional singing. Four different participants were recorded using facial electromyography (EMG) while performing the same task. Participants saw and heard recordings of musical phrases sung with happy, sad, and neutral emotional connotations. They then imitated the target stimulus, paying close attention to the emotion expressed. Facial expressions were monitored during four epochs: (a) during the target; (b) prior to their imitation; (c) during their imitation; and (d) after their imitation. Expressive activity was observed in all epochs, implicating a role of facial expressions in the perception, planning, production, and post-production of emotional singing.


2021 ◽  
Author(s):  
Katlyn Peck

When individuals are presented with emotional facial expressions they spontaneously react with brief, distinct facial movements that ‘mimic’ the presented faces. While the effects of facial mimicry on emotional perception and social bonding have been well documented, the role of facial attractiveness on the elicitation of facial mimicry is unknown. We hypothesized that facial mimicry would increase with more attractive faces. Facial movements were recorded with electromyography upon presentation of averaged and original stimuli while ratings of attractiveness and intensity were obtained. In line with existing findings, emotionally congruent responses were observed in relevant facial muscle regions. Unexpectedly, the strength of observers’ facial mimicry responses decreased with more averaged faces, despite being rated perceptually as more attractive. These findings suggest that facial attractiveness moderates the degree of facial mimicry muscle movements elicited in observers. The relationship between averageness, attractiveness and mimicry is discussed in light of this counterintuitive finding.


2021 ◽  
pp. 1-11
Author(s):  
Ivane Nuel ◽  
Marie-Pierre Fayant ◽  
Theodore Alexopoulos

Abstract. The approach-aversion effect refers to a devaluation of approaching (vs. static) stimuli and is attributable to the fact that being approached is threatening. However, the explanation and the generalizability of this effect still remain unclear. To fill this gap, we provide a powerful test of the approach-aversion effect using Virtual Reality. Participants evaluated approaching and static virtual individuals for which we manipulated the threatening nature via their emotional facial expressions (Experiment 1), their group membership (Experiment 2), and the agency of their movements (Experiment 3). The results suggest a general approach-aversion effect which is attenuated when the self (vs. the target) initiates the movement. We thus bring convergent evidence that being approached is threatening.

