Virtual Studio System and Facial Emotion-Expression

Author(s):  
Hideki Aoyama ◽  
Ryo Haginoya ◽  
Umezawa

Directors of TV programs, commercials, and similar productions usually convey their intentions to actors and production staff using storyboards. However, it is difficult to convey a director's intentions completely and precisely, since storyboards show only momentary images of scenes, and directors therefore spend considerable time communicating them. To solve this problem, a system that automatically generates animated storyboards (moving images) has been developed in this study. The system, called the "Virtual Studio System", analyzes a scenario written by the director in natural language and automatically creates the corresponding moving images. The resulting moving images can easily be changed simply by editing the natural-language scenario. In addition, a method for generating the facial expressions of characters in the virtual system has been developed. With this system, anyone can easily create and edit animated storyboards that represent a scenario.

2021 ◽  
pp. 1-10
Author(s):  
Daniel T. Burley ◽  
Christopher W. Hobson ◽  
Dolapo Adegboye ◽  
Katherine H. Shelton ◽  
Stephanie H.M. van Goozen

Abstract
Impaired facial emotion recognition is a transdiagnostic risk factor for a range of psychiatric disorders. Childhood behavioral difficulties and parental emotional environment have been independently associated with impaired emotion recognition; however, no study has examined the contribution of these factors in conjunction. We measured recognition of negative (sad, fear, anger), neutral, and happy facial expressions in 135 children aged 5–7 years referred by their teachers for behavioral problems. Parental emotional environment was assessed for parental expressed emotion (EE) – characterized by negative comments, reduced positive comments, low warmth, and negativity towards their child – using the 5-minute speech sample. Child behavioral problems were measured using the teacher-informant Strengths and Difficulties Questionnaire (SDQ). Child behavioral problems and parental EE were independently associated with impaired recognition of negative facial expressions specifically. An interactive effect revealed that the combination of both factors was associated with the greatest risk for impaired recognition of negative faces, and in particular sad facial expressions. No relationships emerged for the identification of happy facial expressions. This study furthers our understanding of multidimensional processes associated with the development of facial emotion recognition and supports the importance of early interventions that target this domain.


i-Perception ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 204166952110095
Author(s):  
Elmeri Syrjänen ◽  
Håkan Fischer ◽  
Marco Tullio Liuzza ◽  
Torun Lindholm ◽  
Jonas K. Olofsson

How do valenced odors affect the perception and evaluation of facial expressions? We reviewed 25 studies published from 1989 to 2020 on cross-modal behavioral effects of odors on the perception of faces. The results indicate that odors may influence facial evaluations and classifications in several ways. Faces are rated as more arousing during simultaneous odor exposure, and the rated valence of faces is affected in the direction of the odor valence. For facial classification tasks, in general, valenced odors, whether pleasant or unpleasant, decrease facial emotion classification speed. The evidence for valence congruency effects was inconsistent. Some studies found that exposure to a valenced odor facilitates the processing of a similarly valenced facial expression. The results for facial evaluation were mirrored in classical conditioning studies, as faces conditioned with valenced odors were rated in the direction of the odor valence. However, the evidence of odor effects was inconsistent when the task was to classify faces. Furthermore, using a z-curve analysis, we found clear evidence for publication bias. Our recommendations for future research include greater consideration of individual differences in sensation and cognition (e.g., differences in odor sensitivity related to age, gender, or culture), establishing standardized experimental assessments and stimuli, larger study samples, and embracing open research practices.


2017 ◽  
Vol 29 (5) ◽  
pp. 1749-1761 ◽  
Author(s):  
Johanna Bick ◽  
Rhiannon Luyster ◽  
Nathan A. Fox ◽  
Charles H. Zeanah ◽  
Charles A. Nelson

Abstract
We examined facial emotion recognition in 12-year-olds in a longitudinally followed sample of children with and without exposure to early life psychosocial deprivation (institutional care). Half of the institutionally reared children were randomized into foster care homes during the first years of life. Facial emotion recognition was examined in a behavioral task using morphed images. This same task had been administered when children were 8 years old. Neutral facial expressions were morphed with happy, sad, angry, and fearful emotional facial expressions, and children were asked to identify the emotion of each face, which varied in intensity. Consistent with our previous report, we show that some areas of emotion processing, involving the recognition of happy and fearful faces, are affected by early deprivation, whereas other areas, involving the recognition of sad and angry faces, appear to be unaffected. We also show that early intervention can have a lasting positive impact, normalizing developmental trajectories of processing negative emotions (fear) into the late childhood/preadolescent period.


2021 ◽  
Vol 35 (Supplement A) ◽  
pp. 132-148
Author(s):  
Tahira Gulamani ◽  
Achala H. Rodrigo ◽  
Amanda A. Uliaszek ◽  
Anthony C. Ruocco

Emotion perception biases may precipitate problematic interpersonal interactions in families affected with borderline personality disorder (BPD) and lead to conflictual relationships. In the present study, the authors investigated the familial aggregation of facial emotion recognition biases for neutral, happy, sad, fearful, and angry expressions in probands with BPD (n = 89), first-degree biological relatives (n = 67), and healthy controls (n = 87). Relatives showed comparable accuracy and response times to controls in recognizing negative emotions in aggregate and most discrete emotions. For sad expressions, both probands and relatives displayed slower response latencies, and they were more likely than controls to perceive sad expressions as fearful. Nonpsychiatrically affected relatives were slower than controls in responding to negative emotional expressions in aggregate, and fearful and sad facial expressions more specifically. These findings uncover potential biases in perceiving sad and fearful facial expressions that may be transmitted in families affected with BPD.


Facial emotion analysis is the task of training a system to understand the different facial expressions of human beings. The facial expressions are recorded by a camera attached to the user's device. This project can additionally support online marketing of products, as it detects a person's facial expressions and sentiment. Sentiment analysis is the study of people's sentiments, opinions, and emotions; here, that information is extracted from people's facial expressions across different situations. The main aim is to read human facial expressions using a high-resolution camera so that the machine can identify human sentiment. The existing system uses a convolutional neural network trained in an unsupervised manner; this work replaces it with a supervised mechanism, i.e., a supervised neural network. The approach can be applied in the gaming sector, smartphone unlocking, automated facial language translation, and similar areas.
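The abstract does not specify the network architecture, so the following is only a minimal sketch of the kind of forward pass a supervised convolutional classifier performs on a face crop: one convolution, a ReLU, max pooling, and a softmax over emotion classes. The 48x48 input size, the seven emotion labels, and all weights (random here, learned in practice) are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Seven commonly used emotion classes (an assumption, not from the abstract).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't fit the window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(image, kernel, weights, bias):
    """Forward pass: conv -> ReLU -> pool -> flatten -> dense -> softmax."""
    features = max_pool(relu(conv2d(image, kernel))).ravel()
    return softmax(weights @ features + bias)

rng = np.random.default_rng(0)
face = rng.random((48, 48))             # stand-in for a 48x48 grayscale face crop
kernel = rng.standard_normal((3, 3))    # one 3x3 filter (learned in a real system)
feat_dim = ((48 - 3 + 1) // 2) ** 2     # pooled feature length: 23 * 23 = 529
weights = rng.standard_normal((len(EMOTIONS), feat_dim)) * 0.01
bias = np.zeros(len(EMOTIONS))

probs = classify(face, kernel, weights, bias)
print(EMOTIONS[int(np.argmax(probs))])  # predicted emotion for this random input
```

In supervised training, the weights would be fitted to labeled face images by minimizing cross-entropy between `probs` and the true emotion label, which is the distinction the abstract draws against the unsupervised baseline.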


2021 ◽  
Vol 12 ◽  
Author(s):  
Paula J. Webster ◽  
Shuo Wang ◽  
Xin Li

Different styles of social interaction are one of the core characteristics of autism spectrum disorder (ASD). Social differences among individuals with ASD often include difficulty in discerning the emotions of neurotypical people based on their facial expressions. This review first covers the rich body of literature studying differences in facial emotion recognition (FER) in those with ASD, including behavioral studies and neurological findings. In particular, we highlight subtle emotion recognition and various factors related to inconsistent findings in behavioral studies of FER in ASD. Then, we discuss the dual problem of FER – namely facial emotion expression (FEE) or the production of facial expressions of emotion. Despite being less studied, social interaction involves both the ability to recognize emotions and to produce appropriate facial expressions. How others perceive facial expressions of emotion in those with ASD has remained an under-researched area. Finally, we propose a method for teaching FER [FER teaching hierarchy (FERTH)] based on recent research investigating FER in ASD, considering the use of posed vs. genuine emotions and static vs. dynamic stimuli. We also propose two possible teaching approaches: (1) a standard method of teaching progressively from simple drawings and cartoon characters to more complex audio-visual video clips of genuine human expressions of emotion with context clues or (2) teaching in a field of images that includes posed and genuine emotions to improve generalizability before progressing to more complex audio-visual stimuli. Lastly, we advocate for autism interventionists to use FER stimuli developed primarily for research purposes to facilitate the incorporation of well-controlled stimuli to teach FER and bridge the gap between intervention and research in this area.


Informatics ◽  
2020 ◽  
Vol 7 (1) ◽  
pp. 6 ◽  
Author(s):  
Abdulrahman Alreshidi ◽  
Mohib Ullah

Facial emotion recognition is a crucial task for human-computer interaction, autonomous vehicles, and a multitude of multimedia applications. In this paper, we propose a modular framework for human facial emotion recognition. The framework consists of two machine learning algorithms (for detection and classification) that can be trained offline for real-time applications. Initially, we detect faces in the images using AdaBoost cascade classifiers. We then extract neighborhood difference features (NDF), which represent the features of a face based on localized appearance information. NDF models different patterns based on the relationships between neighboring regions themselves instead of considering only intensity information. The study focuses on the seven most important facial expressions, those used most extensively in day-to-day life. However, due to the modular design of the framework, it can be extended to classify an arbitrary number of facial expressions. For facial expression classification, we train a random forest classifier with a latent emotional state that accounts for missed and false detections. Additionally, the proposed method is independent of gender and facial skin color for emotion recognition. Moreover, due to the intrinsic design of NDF, the proposed method is illumination and orientation invariant. We evaluate our method on different benchmark datasets and compare it with five reference methods. In terms of accuracy, the proposed method gives 13% and 24% better results than the reference methods on the static facial expressions in the wild (SFEW) and real-world affective faces (RAF) datasets, respectively.
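The abstract describes NDF as encoding relationships between neighboring regions rather than raw intensities; the exact formulation is in the paper. The sketch below is one plausible interpretation under stated assumptions: split the face crop into a grid, take each region's mean intensity, and keep only the sign of the differences between adjacent regions. The 4x4 grid and the sign encoding are illustrative choices, but they show why such features are insensitive to global brightness changes, consistent with the claimed illumination invariance.

```python
import numpy as np

def neighborhood_difference_features(face, grid=(4, 4)):
    """Sketch of neighborhood-difference-style features: per-region mean
    intensities, encoded as the sign of differences between horizontally
    and vertically adjacent regions (relations, not raw intensities)."""
    gh, gw = grid
    h, w = face.shape
    # Mean intensity of each grid cell.
    cells = np.array([
        face[i * h // gh:(i + 1) * h // gh,
             j * w // gw:(j + 1) * w // gw].mean()
        for i in range(gh) for j in range(gw)
    ]).reshape(gh, gw)
    horiz = np.sign(cells[:, 1:] - cells[:, :-1])  # right-neighbor relations
    vert = np.sign(cells[1:, :] - cells[:-1, :])   # bottom-neighbor relations
    return np.concatenate([horiz.ravel(), vert.ravel()])

rng = np.random.default_rng(1)
face = rng.random((48, 48))                        # stand-in for a detected face crop
f = neighborhood_difference_features(face)
# Sign-of-difference features survive any positive affine brightness change:
f_bright = neighborhood_difference_features(face * 0.5 + 0.2)
```

In the paper's pipeline, a feature vector like `f` (here 24 values for a 4x4 grid) would be the input to the random forest classifier; faces would first be localized by the AdaBoost cascade detector.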


2019 ◽  
Vol 29 (10) ◽  
pp. 1441-1451 ◽  
Author(s):  
Melina Nicole Kyranides ◽  
Kostas A. Fanti ◽  
Maria Petridou ◽  
Eva R. Kimonis

Abstract
Individuals with callous-unemotional (CU) traits show deficits in facial emotion recognition. According to preliminary research, this impairment may be due to attentional neglect of people's eyes when evaluating emotionally expressive faces. However, it is unknown whether this atypical processing pattern is unique to established variants of CU traits or modifiable with intervention. This study examined facial affect recognition and gaze patterns among individuals (N = 80; M age = 19.95, SD = 1.01 years; 50% female) with primary vs. secondary CU variants. These groups were identified based on repeated measurements of conduct problems, CU traits, and anxiety assessed in adolescence and adulthood. Accuracy and number of fixations on areas of interest (forehead, eyes, and mouth) while viewing six dynamic emotions were assessed. A visual probe was used to direct attention to various parts of the face. Individuals with primary and secondary CU traits were less accurate than controls in recognizing facial expressions across all emotions. Those identified in the low-anxious primary-CU group showed reduced overall fixations to fearful and painful facial expressions compared to those in the high-anxious secondary-CU group. This difference was not specific to a region of the face (i.e. eyes or mouth). Findings point to the importance of investigating both accuracy and eye gaze fixations, since individuals in the primary and secondary groups were differentiated only in the way they attended to specific facial expressions. These findings have implications for differentiated interventions focused on improving facial emotion recognition with regard to attending to and correctly identifying emotions.


2014 ◽  
Vol 26 (4) ◽  
pp. 253-259 ◽  
Author(s):  
Linette Lawlor-Savage ◽  
Scott R. Sponheim ◽  
Vina M. Goghari

Background
The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated.

Methods
Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control.

Results
Bipolar patients’ overall facial recognition ability was unimpaired. However, patients’ specific ability to judge happy expressions under time constraints was impaired.

Conclusions
Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.

