Emotion recognition using Kinect motion capture data of human gaits

PeerJ ◽  
2016 ◽  
Vol 4 ◽  
pp. e2364 ◽  
Author(s):  
Shun Li ◽  
Liqing Cui ◽  
Changye Zhu ◽  
Baobin Li ◽  
Nan Zhao ◽  
...  

Automatic emotion recognition is of great value in many applications; however, to fully realize this value, more portable, non-intrusive, and inexpensive technologies need to be developed. Human gaits can reflect the walker’s emotional state and can serve as an information source for emotion recognition. This paper proposed a novel method to recognize emotional states from human gaits using Microsoft Kinect, a low-cost, portable, camera-based sensor. Fifty-nine participants’ gaits under a neutral state, induced anger, and induced happiness were recorded by two Kinect cameras, and the original data were processed through joint selection, coordinate-system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation. Features of gait patterns were extracted from the 3-dimensional coordinates of 14 main body joints by Fourier transformation and Principal Component Analysis (PCA). The classifiers NaiveBayes, RandomForests, LibSVM and SMO (Sequential Minimal Optimization) were trained and evaluated, and the accuracies of recognizing anger and happiness against the neutral state reached 80.5% and 75.4%, respectively. Although the results of distinguishing angry from happy states were not ideal in the current study, the work showed the feasibility of automatically recognizing emotional states from gaits, with characteristics that meet the application requirements.
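The preprocessing and feature-extraction chain described above (sliding-window Gaussian filtering, differencing, Fourier transformation, PCA) can be sketched roughly as follows; the kernel width, the number of retained frequency bins, and the component count are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def gait_features(joints_xyz, n_freq=5):
    """Per-segment gait features in the spirit of the paper's pipeline.

    joints_xyz: (n_frames, 14, 3) array -- 3-D coordinates of the 14
    selected body joints over one gait segment.
    """
    t, j, d = joints_xyz.shape
    flat = joints_xyz.reshape(t, j * d)          # one channel per joint axis

    # Sliding-window Gaussian smoothing along the time axis (5-tap kernel).
    win = np.exp(-0.5 * (np.arange(-2, 3) / 1.0) ** 2)
    win /= win.sum()
    smooth = np.apply_along_axis(
        lambda c: np.convolve(c, win, mode="same"), 0, flat)

    # Differential operation: frame-to-frame differences (joint velocities).
    vel = np.diff(smooth, axis=0)

    # Fourier transform per channel; keep the magnitudes of the first
    # n_freq non-DC frequency bins as features.
    spec = np.abs(np.fft.rfft(vel, axis=0))[1:1 + n_freq]
    return spec.T.reshape(-1)                    # (j * d * n_freq,)

def pca_reduce(X, n_components):
    """Project a (segments, features) matrix onto its top principal
    components via SVD of the mean-centred data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

The reduced feature matrix would then be handed to any of the classifiers named in the abstract (e.g. an SVM) for training and evaluation.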

2015 ◽  
Author(s):  
Shun Li ◽  
Liqing Cui ◽  
Changye Zhu ◽  
Nan Zhao ◽  
Baobin Li ◽  
...  

Emotion recognition can improve the quality of patient care, product development, and human-machine interaction. Psychological studies indicate that emotional state can be expressed in the way people walk, and that human gait can be used to reveal a person's emotional state. This paper proposes a novel method for emotion recognition that uses Microsoft Kinect to record gait patterns and trains machine learning algorithms on them. 59 subjects are recruited, and their gait patterns are recorded by two Kinect cameras. Joint selection, coordinate-system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation are used for data preprocessing. We run a Fourier transformation to extract features from the gait patterns and utilize Principal Component Analysis (PCA) for feature selection. Using NaiveBayes, RandomForests, LibSVM and SMO classifiers, the accuracy of recognition between neutral and angry emotions reaches 80%, and the accuracy of recognition between neutral and happy emotions reaches above 70%. The results indicate that Kinect can be used for emotion recognition with fairly good performance.


Entropy ◽  
2019 ◽  
Vol 21 (7) ◽  
pp. 646 ◽  
Author(s):  
Tomasz Sapiński ◽  
Dorota Kamińska ◽  
Adam Pelikant ◽  
Gholamreza Anbarjafari

Automatic emotion recognition has become an important trend in many artificial intelligence (AI) based applications and has been widely explored in recent years. Most research in the area of automated emotion recognition is based on facial expressions or speech signals. Although the influence of the emotional state on body movements is undeniable, this source of expression is still underestimated in automatic analysis. In this paper, we propose a novel method to recognise seven basic emotional states (happy, sad, surprise, fear, anger, disgust and neutral) utilising body movement. We analyse motion capture data recorded under the seven basic emotional states by professional actors/actresses using a Microsoft Kinect v2 sensor. We propose a new representation of affective movements based on sequences of body joints. The proposed algorithm creates a sequential model of affective movement based on low-level features inferred from the spatial location and the orientation of joints within the tracked skeleton. In the experiments, different deep neural networks were employed and compared to recognise the emotional state of the acquired motion sequences. The results show the feasibility of automatic emotion recognition from sequences of body gestures, which can serve as an additional source of information in multimodal emotion recognition.
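The joint-sequence representation can be illustrated with a minimal sketch: per-frame descriptors built from joint positions (made translation-invariant relative to the spine base) and per-joint orientation quaternions, stacked into the (time, features) matrix a recurrent network would consume. The joint count, root index, and normalisation below are assumptions for illustration, not the paper's exact feature definition:

```python
import numpy as np

def frame_features(joints, orientations):
    """One per-frame descriptor from a tracked Kinect v2 skeleton.

    joints:       (25, 3) joint positions for one frame
    orientations: (25, 4) unit quaternions, one per joint
    """
    root = joints[0]                      # SpineBase is joint 0 in Kinect v2
    rel = joints - root                   # translation-invariant positions
    scale = max(np.linalg.norm(rel, axis=1).max(), 1e-8)
    rel = rel / scale                     # crude body-size normalisation
    return np.concatenate([rel.reshape(-1), orientations.reshape(-1)])

def sequence_model_input(frames):
    """Stack per-frame descriptors into the (time, features) sequence
    that a sequential model (e.g. an RNN) would consume."""
    return np.stack([frame_features(j, q) for j, q in frames])
```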


2021 ◽  
Author(s):  
Talieh Seyed Tabtabae

Automatic Emotion Recognition (AER) is an emerging research area in the Human-Computer Interaction (HCI) field. As computers become more popular every day, the study of interaction between humans (users) and computers is attracting more attention. In order to have a more natural and friendly interface between humans and computers, it would be beneficial to give computers the ability to recognize situations the same way a human does. Equipped with an emotion recognition system, computers would be able to recognize their users' emotional states and react appropriately. In today's HCI systems, machines can recognize the speaker and the content of the speech using speech recognition and speaker identification techniques. If machines are also equipped with emotion recognition techniques, they can know "how it is said" and react more appropriately, making the interaction more natural. One of the most important human communication channels is the auditory channel, which carries speech and vocal intonation. In fact, people can perceive each other's emotional state by the way they talk. Therefore, in this work, speech signals are analyzed in order to build an automatic system that recognizes the human emotional state. Six discrete emotional states are considered and categorized in this research: anger, happiness, fear, surprise, sadness, and disgust. A set of novel spectral features is proposed in this contribution. Two approaches are applied and the results are compared. In the first approach, all the acoustic features are extracted from consecutive frames along the speech signals, and the statistical values of those features constitute the feature vectors. A Support Vector Machine (SVM), a relatively new approach in the field of machine learning, is used to classify the emotional states. In the second approach, spectral features are extracted from non-overlapping, logarithmically spaced frequency sub-bands. To make use of all the extracted information, sequence-discriminant SVMs are adopted. The empirical results show that the employed techniques are very promising.
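The two feature strategies can be sketched as follows: frame-wise energies from logarithmically spaced sub-bands (second approach), and per-utterance statistics of frame features forming a fixed-length vector for an SVM (first approach). Frame length, hop size, band edges, and the sample rate are illustrative assumptions, not the thesis's settings:

```python
import numpy as np

def subband_energies(signal, sr=16000, frame_len=400, hop=160, n_bands=6):
    """Log energies from non-overlapping, logarithmically spaced frequency
    sub-bands, computed per frame. Returns (n_frames, n_bands)."""
    # Logarithmically spaced band edges from ~60 Hz up to Nyquist.
    edges = np.geomspace(60, sr / 2, n_bands + 1)
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len] * np.hanning(frame_len)
        spec = np.abs(np.fft.rfft(frame)) ** 2
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
        bands = [spec[(freqs >= lo) & (freqs < hi)].sum()
                 for lo, hi in zip(edges[:-1], edges[1:])]
        feats.append(np.log(np.asarray(bands) + 1e-10))
    return np.asarray(feats)

def utterance_vector(feats):
    """First-approach statistics: mean and standard deviation of each
    frame-level feature over the utterance, concatenated into one
    fixed-length vector suitable for an SVM classifier."""
    return np.concatenate([feats.mean(axis=0), feats.std(axis=0)])
```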


Animals ◽  
2019 ◽  
Vol 9 (10) ◽  
pp. 757 ◽  
Author(s):  
Marta Brscic ◽  
Nina Dam Otten ◽  
Barbara Contiero ◽  
Marlene Katharina Kirchner

Assessing the emotional states of dairy calves is an essential part of welfare assessment, but standardized protocols are absent. The present study aims at assessing the emotional states of dairy calves and establishing a reliable standard procedure based on Qualitative Behavioral Assessment (QBA) with 20 defined terms. Video material was used to compare results from multiple observers. Further, live observations were performed on 49 dairy herds in Denmark and Italy. Principal Component Analysis (PCA) identified observer agreement and QBA dimensions (PCs). For the overall welfare judgment, PC1 scores were converted into the Welfare Quality (WQ) criterion ‘Positive Emotional State’. Finally, the influence of farm factors on the WQ criterion was evaluated by mixed linear models. PCA summarized the QBA descriptors as PC1 ‘Valence’ and PC2 ‘Arousal’ (explained variation 40.3% and 13.3%). The highest positive descriptor loading was Happy (0.92) on PC1 and Nervous (0.72) on PC2. The WQ criterion score (WQ-C12) was on average 51.1 ± 9.0 points (0: worst to 100: excellent state), and ‘Number of calves’, ‘Farming style’, and ‘Breed’ explained 18% of its variability. We conclude that the 20 terms achieved a high portion of explained variation, providing a differentiated view of the emotional state of calves. However, the defined term list proved to require thorough observer training to achieve agreement.


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Mehmet Akif Ozdemir ◽  
Murside Degirmenci ◽  
Elif Izci ◽  
Aydin Akan

The emotional state of people plays a key role in physiological and behavioral human interaction. Emotional state analysis spans many fields, such as neuroscience, cognitive science, and biomedical engineering, because the parameters of interest involve the complex neuronal activities of the brain. Electroencephalogram (EEG) signals are processed to communicate brain signals to external systems and to make predictions about emotional states. This paper proposes a novel method for emotion recognition based on deep convolutional neural networks (CNNs) that are used to classify Valence, Arousal, Dominance, and Liking emotional states, applied to time series of multi-channel EEG signals from the Database for Emotion Analysis using Physiological Signals (DEAP). We propose a new approach to emotional state estimation utilizing CNN-based classification of multi-spectral topology images obtained from EEG signals. In contrast to most EEG-based approaches, which discard the spatial information of EEG signals, converting the EEG signals into a sequence of multi-spectral topology images preserves their temporal, spectral, and spatial information. The deep recurrent convolutional network is trained to learn important representations from a sequence of three-channel topographical images. We achieved test accuracies of 90.62% for negative versus positive Valence, 86.13% for high versus low Arousal, 88.48% for high versus low Dominance, and 86.23% for like versus unlike. The evaluations of this method on the emotion recognition problem revealed significant improvements in classification accuracy compared with other studies using deep neural networks (DNNs) and one-dimensional CNNs.
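The multi-spectral topology idea can be sketched in miniature: estimate per-electrode power in three frequency bands and paint each band into one channel of a 2-D scalp image. The electrode set, coordinates, image size, band edges, and nearest-neighbour fill below are simplifying assumptions (the paper uses the DEAP montage and smoother interpolation):

```python
import numpy as np

# Illustrative 2-D scalp positions for a handful of electrodes,
# normalised to [0, 1]; NOT the DEAP montage's exact coordinates.
ELECTRODE_POS = {
    "Fp1": (0.35, 0.95), "Fp2": (0.65, 0.95),
    "C3":  (0.25, 0.50), "C4":  (0.75, 0.50),
    "O1":  (0.35, 0.05), "O2":  (0.65, 0.05),
}

def band_power(x, sr, lo, hi):
    """Mean spectral power of one channel within [lo, hi) Hz."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sr)
    return spec[(freqs >= lo) & (freqs < hi)].mean()

def topo_image(window, sr=128, size=16):
    """Build a (size, size, 3) image: one channel per band (theta,
    alpha, beta), each pixel taking the power of the nearest electrode.
    window: dict mapping electrode name -> 1-D signal array."""
    bands = [(4, 8), (8, 13), (13, 30)]
    img = np.zeros((size, size, 3))
    names = list(window)
    pos = np.array([ELECTRODE_POS[n] for n in names])
    for b, (lo, hi) in enumerate(bands):
        powers = np.array([band_power(window[n], sr, lo, hi) for n in names])
        for i in range(size):
            for j in range(size):
                # Pixel centre in the same normalised coordinates.
                xy = np.array([(j + 0.5) / size, 1 - (i + 0.5) / size])
                img[i, j, b] = powers[np.argmin(((pos - xy) ** 2).sum(axis=1))]
    return img
```

A sequence of such images over consecutive windows is what the recurrent convolutional network would consume.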


2020 ◽  
Vol 13 (4) ◽  
pp. 4-24 ◽  
Author(s):  
V.A. Barabanschikov ◽  
E.V. Suvorova

The article presents the results of validating the Geneva Emotion Recognition Test (GERT), a Swiss method for assessing dynamic emotional states, on a Russian sample. Identification accuracy and the structure of the categorical fields of emotional expressions of a “living” face are analysed. Similarities and differences in the perception of affective groups of dynamic emotions in the Russian and Swiss samples are considered. A number of patterns of recognition of multi-modal expressions with changes in the valence and arousal of emotions are described. Differences in the perception of dynamic and static emotional expressions are revealed. The GERT method confirmed its high potential for solving a wide range of academic and applied problems.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Vera Bühlmann ◽  
Susanne Schlüter-Müller ◽  
Lukas Fürer ◽  
Martin Steppan ◽  
Marc Birkhölzer ◽  
...  

Abstract Introduction Patient suicidality is a topic psychotherapists frequently encounter. Adolescents with borderline personality pathology (BPP) in particular often exhibit suicidal tendencies. Previous research examining therapists’ countertransference towards suicidal patients suggested that therapists are negatively affected and distressed by them. We hypothesize that this emotional response of the therapists is related to specific sessions in which suicidality came up as a topic. Accordingly, the objective of this study is to examine therapists’ emotional state at the session level of analysis. Methods The sample consisted of N = 21 adolescents (age 13–19 years) with BPD or subthreshold BPD. Therapists’ emotional states were measured in n = 418 sessions using the Session Evaluation Questionnaire. Principal component analysis was used to reduce the dimensionality of the therapist response. The emotional states were compared depending on whether suicidality had been addressed in the session (SS) or not (NSS). Results Two components could be identified. Firstly, therapists were more aroused, excited, afraid, angry and uncertain after SS than after NSS. Secondly, therapists were more aroused, excited, definite and pleased after SS than after NSS. Discussion Suicidality does not always have to be a burden for therapists: both a “distress” and a “eustress” component occur in this context, of which the latter is supposed to help clinicians master a difficult situation. Since countertransference feelings are often not fully conscious, it is necessary to study therapists’ emotional states after sessions in which suicidality is addressed. This is crucial both to prevent the therapeutic process from being endangered and to preserve clinicians’ mental health. Clinical implications and limitations are discussed.


Metabolites ◽  
2020 ◽  
Vol 10 (3) ◽  
pp. 84 ◽  
Author(s):  
Monique A.M. Smeets ◽  
Egge A.E. Rosing ◽  
Doris M. Jacobs ◽  
Ewoud van Velzen ◽  
Jean H. Koek ◽  
...  

Chemical communication is common among animals. In humans, the chemical basis of social communication has remained a black box, despite psychological and neural research showing distinctive physiological, behavioral, and neural consequences of body odors emitted during emotional states like fear and happiness. We used a multidisciplinary approach to examine whether molecular cues could be associated with an emotional state in the emitter. Our research revealed that the volatile molecules transmitting different emotions to perceivers also have objectively different chemical properties. Chemical analysis of underarm sweat collected from the same donors in fearful, happy, and emotionally neutral states was conducted using untargeted two-dimensional gas chromatography (GC×GC) coupled with time-of-flight (ToF) MS-based profiling. Based on the multivariate statistical analyses, we find that the pattern of chemical volatiles (N = 1655 peaks) associated with the fearful state is clearly different from that associated with the (pleasant) neutral state. Happy sweat is also chemically significantly different from the other states, but shows a bipolar pattern of overlap with the fearful as well as the neutral state. Candidate chemical classes associated with emotional and neutral sweat have been identified, specifically linear aldehydes, ketones, esters, and cyclic molecules (5 rings). This research constitutes a first step toward identifying the chemical fingerprints of emotion.

