Emotion Recognition Using Smart Watch Sensor Data: A Mixed-Design Study (Preprint)
BACKGROUND Research in psychology has shown that the way a person walks reflects that person's current mood (or emotional state). Recent studies have used smartphones to detect emotional states from movement data.

OBJECTIVE This study investigates the use of movement sensor data from a smart watch to infer an individual's emotional state. We present our findings from a user study with 50 participants.

METHODS The experiment followed a mixed design: within-subjects (emotion: happy, sad, neutral) and between-subjects (stimulus type: audiovisual "movie clips" vs audio "music clips"). Each participant experienced both emotions within a single stimulus type. All participants walked 250 m while wearing a smart watch on one wrist and a heart rate monitor strap on their chest. They also answered a short questionnaire (PANAS; 20 items) before and after experiencing each emotion. The heart rate monitor provided supplementary information alongside the movement data. We performed time-series analysis on the smart watch data and t tests on the questionnaire items to measure changes in emotional state; the heart rate data were analyzed using one-way ANOVA. We extracted features from the time series using sliding windows and used these features to train and validate classifiers that determine an individual's emotion.

RESULTS Fifty young adults participated in our study; 49 were included in the PANAS affect analysis and all 50 in the feature extraction. Participants reported feeling less negative affect after watching sad videos or listening to sad music (P<.006). For emotion recognition with classifiers, personal models outperformed personal baselines and achieved median accuracies higher than 78% across all conditions of the study design for the binary classification of happiness vs sadness.
CONCLUSIONS Our findings show that changes in emotional state can be detected both from smart watch sensor data and from behavioral responses. Together with the high accuracies achieved across all users for classifying happy vs sad emotional states, this is further evidence for the hypothesis that movement sensor data can be used for emotion recognition.
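The sliding-window feature extraction described in the Methods can be sketched as follows. This is a minimal illustration only: the window length, step size, and feature set below are assumptions for demonstration, not the study's actual configuration.

```python
import numpy as np

def sliding_window_features(signal, window_size=128, step=64):
    """Extract simple statistical features from a 1-D movement signal
    using overlapping sliding windows.

    window_size and step are illustrative choices, not the values
    used in the study.
    """
    features = []
    for start in range(0, len(signal) - window_size + 1, step):
        w = signal[start:start + window_size]
        # Common time-domain features for movement-based recognition
        features.append([
            w.mean(),                   # average level
            w.std(),                    # variability
            w.min(),                    # window minimum
            w.max(),                    # window maximum
            np.sqrt(np.mean(w ** 2)),   # root mean square (signal energy)
        ])
    return np.array(features)

# Example: 1000 samples of synthetic accelerometer-like data
rng = np.random.default_rng(0)
X = sliding_window_features(rng.standard_normal(1000))
print(X.shape)  # (14, 5): one row of 5 features per window
```

Each row of the resulting matrix would serve as one training example for a classifier distinguishing emotional states; the 50% window overlap used here is a common default in activity-recognition pipelines.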