Classifying Facial Expressions Based on Topo-Feature Representation

10.5772/6185 ◽  
2008 ◽  
Author(s):  
Xiaozhou Wei ◽  
Johnny Loi ◽  
Lijun Yin

Author(s):  
Guanbin Li ◽  
Xin Zhu ◽  
Yirui Zeng ◽  
Qing Wang ◽  
Liang Lin

Facial action unit (AU) recognition is a crucial task for facial expression analysis and has attracted extensive attention in the fields of artificial intelligence and computer vision. Existing works have either focused on designing or learning complex regional feature representations, or delved into various types of AU relationship modeling. Despite varying degrees of progress, existing methods still struggle with complex situations. In this paper, we investigate how to integrate semantic relationship propagation between AUs into a deep neural network framework to enhance the feature representation of facial regions, and propose an AU semantic relationship embedded representation learning (SRERL) framework. Specifically, by analyzing the symbiosis and mutual exclusion of AUs across facial expressions, we organize the facial AUs as a structured knowledge graph and integrate a Gated Graph Neural Network (GGNN) into a multi-scale CNN framework to propagate node information through the graph and generate enhanced AU representations. As the learned features involve both appearance characteristics and AU relationship reasoning, the proposed model is more robust and can cope with more challenging cases, e.g., illumination change and partial occlusion. Extensive experiments on two public benchmarks demonstrate that our method outperforms prior work and achieves state-of-the-art performance.
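The relationship-propagation idea in the abstract above can be illustrated with a minimal sketch. Everything here is hypothetical (the AU graph, the feature values, and the update rule); a real GGNN uses learned gating and weight matrices, whereas this toy version simply blends each node's feature vector with the mean of its neighbours' vectors at every step:

```python
# Toy sketch of relationship propagation over an AU graph
# (hypothetical adjacency and features; not the paper's actual GGNN).

def propagate(features, adjacency, steps=2, mix=0.5):
    """Each step blends a node's feature vector with the mean of its
    neighbours' vectors; nodes without neighbours are left unchanged."""
    for _ in range(steps):
        updated = {}
        for node, vec in features.items():
            neigh = adjacency.get(node, [])
            if not neigh:
                updated[node] = vec[:]
                continue
            # mean of neighbour features, dimension by dimension
            mean = [sum(features[n][i] for n in neigh) / len(neigh)
                    for i in range(len(vec))]
            updated[node] = [(1 - mix) * v + mix * m
                             for v, m in zip(vec, mean)]
        features = updated
    return features

# Toy graph: AU6 (cheek raiser) and AU12 (lip corner puller) co-occur in
# smiles, so they are linked; AU4 (brow lowerer) is isolated here.
adj = {"AU6": ["AU12"], "AU12": ["AU6"], "AU4": []}
feats = {"AU6": [1.0, 0.0], "AU12": [0.0, 1.0], "AU4": [0.5, 0.5]}
out = propagate(feats, adj, steps=1)
print(out["AU6"])  # -> [0.5, 0.5]: AU6 now reflects AU12's features
```

After one step, the two linked AUs share information while the isolated node is untouched, which is the intuition behind using graph propagation to make each AU's representation relationship-aware.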


2018 ◽  
Vol 7 (2) ◽  
pp. 568
Author(s):  
Gunavathi H S ◽  
Siddappa M

Over the last few years, facial expression recognition has been an active research field with an extensive range of applications in social interaction, social intelligence, autism detection, and human-computer interaction. In this paper, a robust hybrid framework is presented to recognize facial expressions, which enhances the efficiency and speed of the recognition system by extracting significant features of a face. In the proposed framework, feature representation and extraction are done using Local Binary Patterns (LBP) and Histograms of Oriented Gradients (HOG). The dimensionality of the obtained features is then reduced using a Compressive Sensing (CS) algorithm, and classification is performed with a multiclass SVM classifier. We investigated the performance of the proposed hybrid framework on two public databases, the CK+ and JAFFE data sets. The experimental results show that the proposed hybrid framework is promising for recognizing and identifying facial expressions under varying illumination and pose in real time.
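The LBP step of a hybrid pipeline like the one above can be sketched in a few lines. This is a toy 8-neighbour LBP on a single 3x3 patch (the patch values are invented); a real system would use a library implementation over the whole image, build histograms of the codes, and concatenate them with HOG features before the SVM:

```python
# Toy sketch of the LBP descriptor used in LBP+HOG pipelines
# (hypothetical 3x3 grayscale patch; real systems use library LBP).

def lbp_code(patch):
    """8-bit LBP code for the centre pixel of a 3x3 patch: each
    neighbour contributes a 1-bit if it is >= the centre value."""
    c = patch[1][1]
    # neighbours taken clockwise starting at the top-left corner
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum((1 << i) for i, n in enumerate(neighbours) if n >= c)

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))  # -> 241
```

Because each code depends only on whether neighbours are brighter or darker than the centre, LBP is robust to monotonic illumination changes, which is one reason it suits the varying-illumination setting the abstract mentions.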


2003 ◽  
Vol 17 (3) ◽  
pp. 113-123 ◽  
Author(s):  
Jukka M. Leppänen ◽  
Mirja Tenhunen ◽  
Jari K. Hietanen

Abstract Several studies have shown faster choice-reaction times to positive than to negative facial expressions. The present study examined whether this effect is exclusively due to faster cognitive processing of positive stimuli (i.e., processes leading up to, and including, response selection), or whether it also involves faster motor execution of the selected response. In two experiments, response selection (onset of the lateralized readiness potential, LRP) and response execution (LRP onset-response onset) times for positive (happy) and negative (disgusted/angry) faces were examined. Shorter response selection times for positive than for negative faces were found in both experiments but there was no difference in response execution times. Together, these results suggest that the happy-face advantage occurs primarily at premotoric processing stages. Implications that the happy-face advantage may reflect an interaction between emotional and cognitive factors are discussed.
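The decomposition described in this abstract reduces to interval arithmetic on three time points per trial: response selection time is the interval from stimulus onset to LRP onset, and response execution time is the interval from LRP onset to response onset. A minimal sketch with invented latencies (the millisecond values below are illustrative, not the study's data):

```python
# Sketch of the selection/execution decomposition from the abstract.
# All latencies (ms) are made up for illustration.

def decompose(stimulus_onset, lrp_onset, response_onset):
    selection = lrp_onset - stimulus_onset   # premotor (cognitive) stage
    execution = response_onset - lrp_onset   # motor execution stage
    return selection, execution

happy = decompose(0, 250, 420)   # hypothetical happy-face trial
angry = decompose(0, 300, 470)   # hypothetical angry-face trial
print(happy, angry)  # -> (250, 170) (300, 170)
```

In this toy example the two conditions differ only in the selection interval while execution is identical, which mirrors the pattern the study reports: the happy-face advantage arises before the motor stage.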


2010 ◽  
Vol 24 (3) ◽  
pp. 186-197 ◽  
Author(s):  
Sandra J. E. Langeslag ◽  
Jan W. Van Strien

It has been suggested that emotion regulation improves with aging. Here, we investigated age differences in emotion regulation by studying modulation of the late positive potential (LPP) by emotion regulation instructions. The electroencephalogram of younger (18–26 years) and older (60–77 years) adults was recorded while they viewed neutral, unpleasant, and pleasant pictures and while they were instructed to increase or decrease the feelings that the emotional pictures elicited. The LPP was enhanced when participants were instructed to increase their emotions. No age differences were observed in this emotion regulation effect, suggesting that emotion regulation abilities are unaffected by aging. This contradicts studies that measured emotion regulation by self-report, yet accords with studies that measured emotion regulation by means of facial expressions or psychophysiological responses. More research is needed to resolve the apparent discrepancy between subjective self-report and objective psychophysiological measures.
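Quantifying the LPP modulation described above typically amounts to averaging EEG amplitude within a late post-stimulus window and comparing conditions. A minimal sketch with fabricated samples (the window bounds and microvolt values are assumptions for illustration, not the study's parameters):

```python
# Sketch of LPP quantification: mean amplitude in a late time window
# (hypothetical samples; a real analysis averages across trials/electrodes).

def mean_amplitude(samples, window):
    """Average the values of (time_ms, microvolts) pairs inside window."""
    lo, hi = window
    vals = [v for t, v in samples if lo <= t <= hi]
    return sum(vals) / len(vals)

# (time_ms, microvolts) pairs for two hypothetical instruction conditions
view =     [(300, 1.0), (500, 2.0), (700, 2.0), (900, 1.0)]
increase = [(300, 1.0), (500, 4.0), (700, 5.0), (900, 3.0)]

lpp_view = mean_amplitude(view, (400, 1000))
lpp_increase = mean_amplitude(increase, (400, 1000))
print(lpp_view, lpp_increase)  # the "increase" condition yields a larger LPP
```

The enhanced mean amplitude in the "increase" condition is the kind of effect the abstract refers to when it says the LPP was enhanced under increase instructions.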


Crisis ◽  
2020 ◽  
pp. 1-8
Author(s):  
Chao S. Hu ◽  
Jiajia Ji ◽  
Jinhao Huang ◽  
Zhe Feng ◽  
Dong Xie ◽  
...  

Abstract. Background: High school and university teachers need to advise students against attempting suicide, the second leading cause of death among 15–29-year-olds. Aims: To investigate the role of reasoning and emotion in advising against suicide. Method: We conducted a study with 130 students at a university that specializes in teacher education. Participants sat in front of a camera and were videotaped while giving advice against suicide. Three raters scored their transcribed advice on "wise reasoning" (i.e., expert forms of reasoning: considering a variety of conditions, awareness of the limits of one's knowledge, taking others' perspectives). Four registered psychologists experienced in suicide prevention techniques rated the transcripts on their potential for suicide prevention. Finally, using the software FaceReader 7.1, we analyzed participants' micro-facial expressions during advice-giving. Results: Wiser reasoning and less disgust predicted higher potential for suicide prevention. Moreover, higher potential for suicide prevention was associated with more surprise. Limitations: The actual efficacy of suicide prevention was not assessed. Conclusion: Wise reasoning and counter-stereotypic ideas that trigger surprise probably contribute to the potential for suicide prevention. This advising paradigm may help train teachers in advising students against suicide, measuring wise reasoning, and monitoring a harmful emotional reaction, namely disgust.


2016 ◽  
Vol 37 (1) ◽  
pp. 16-23 ◽  
Author(s):  
Chit Yuen Yi ◽  
Matthew W. E. Murry ◽  
Amy L. Gentzler

Abstract. Past research suggests that transient mood influences the perception of facial expressions of emotion, but relatively little is known about how trait-level emotionality (i.e., temperament) may influence emotion perception or interact with mood in this process. Consequently, we extended earlier work by examining how temperamental dimensions of negative emotionality and extraversion were associated with the perception accuracy and perceived intensity of three basic emotions and how the trait-level temperamental effect interacted with state-level self-reported mood in a sample of 88 adults (27 men, 18–51 years of age). The results indicated that higher levels of negative mood were associated with higher perception accuracy of angry and sad facial expressions, and higher levels of perceived intensity of anger. For perceived intensity of sadness, negative mood was associated with lower levels of perceived intensity, whereas negative emotionality was associated with higher levels of perceived intensity of sadness. Overall, our findings added to the limited literature on adult temperament and emotion perception.


1998 ◽  
Vol 74 (1) ◽  
pp. 272-279 ◽  
Author(s):  
Chris L. Kleinke ◽  
Thomas R. Peterson ◽  
Thomas R. Rutledge

1996 ◽  
Vol 41 (11) ◽  
pp. 1099-1100
Author(s):  
Craig A. Smith
