Exploring Emotion Recognition in Adults and Adolescents with Anorexia Nervosa Using a Body Motion Paradigm

2015 · Vol 23 (4) · pp. 262–268 · Author(s): Katie Lang, Marcela Marin Dapelo, Mizanur Khondoker, Robin Morris, Simon Surguladze, ...

Author(s): Kevser Nalbant, Bilge Merve Kalaycı, Devrim Akdemir, Sinem Akgül, Nuray Kanbur

2009 · Vol 16 (4) · pp. 348–356 · Author(s): Amy Harrison, Sarah Sullivan, Kate Tchanturia, Janet Treasure

2015 · Vol 24 (1) · pp. 34–42 · Author(s): Marcela Marin Dapelo, Simon Surguladze, Robin Morris, Kate Tchanturia

2018 · Author(s): Olga Perepelkina, Eva Kazimirova, Maria Konstantinova

Emotion expression encompasses various types of information, including face and eye movement, voice, and body motion. Most studies in automated affect recognition use faces as stimuli; speech is included less often, and gestures more rarely still. Emotions collected from real conversations are difficult to classify using a single channel, which is why multimodal techniques have recently become more popular in automatic emotion recognition. Multimodal databases that include audio, video, 3D motion capture, and physiological data are quite rare. We collected the Russian Acted Multimodal Affective Set (RAMAS), the first multimodal corpus in the Russian language. Our database contains approximately 7 hours of high-quality close-up video recordings of subjects' faces, speech, motion-capture data, and physiological signals such as electrodermal activity and photoplethysmogram. The subjects were 10 actors who played out interactive dyadic scenarios. Each scenario involved one of the basic emotions (Anger, Sadness, Disgust, Happiness, Fear, or Surprise) and characteristics of social interaction such as Domination and Submission. To record the emotions subjects actually felt during the process, we asked them to fill in short self-report questionnaires after each played scenario. The recordings were labeled by 21 annotators, with at least five annotators marking each scenario. We present our multimodal data collection, the annotation process, an inter-rater agreement analysis, and a comparison between self-reports and the received annotations. RAMAS is an open database that provides the research community with multimodal data on the interrelation of faces, speech, gestures, and physiology. Such material is useful for a variety of investigations and for the development of automatic affective systems.
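The abstract mentions an inter-rater agreement analysis across 21 annotators, with at least five marking each scenario. A common statistic for categorical labels from multiple raters is Fleiss' kappa; the sketch below is only an illustration of how such a statistic is computed (it is not the authors' code, and RAMAS's actual analysis may use a different measure).

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for inter-rater agreement on categorical labels.

    counts[i][j] = number of raters who assigned item i to category j.
    Every row must sum to the same number of raters n (n >= 2).
    """
    N = len(counts)        # number of items (e.g. scenarios)
    n = sum(counts[0])     # raters per item
    k = len(counts[0])     # number of categories (e.g. emotions)

    # Observed per-item agreement P_i, then its mean P_bar
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P) / N

    # Marginal category proportions p_j and chance agreement P_e
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)

    return (P_bar - P_e) / (1 - P_e)
```

For example, two items each labeled unanimously by five raters but into different categories give perfect agreement (kappa = 1), while a near-even split across categories gives agreement no better than chance (kappa near or below 0).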


Author(s): Kyriaki Kaza, Athanasios Psaltis, Kiriakos Stefanidis, Konstantinos C. Apostolakis, Spyridon Thermos, ...

2017 · Vol 25 (6) · pp. 595–600 · Author(s): Marcela Marin Dapelo, Simon Surguladze, Robin Morris, Kate Tchanturia

2019 · Vol 52 (6) · pp. 691–700 · Author(s): Lisa Dinkler, Sandra Rydberg Dobrescu, Maria Råstam, I. Carina Gillberg, Christopher Gillberg, ...

2020 · Vol 53 (6) · pp. 945–953 · Author(s): Mira A. Preis, Katja Schlegel, Linda Stoll, Maximilian Blomberg, Hagen Schmidt, ...

2020 · Vol 9 (4) · pp. 1057 · Author(s): Jess Kerr-Gaffney, Luke Mason, Emily Jones, Hannah Hayward, Jumana Ahmad, ...

Difficulties in socio-emotional functioning are proposed to contribute to the development and maintenance of anorexia nervosa (AN). This study aimed to examine emotion recognition abilities in individuals in the acute and recovered stages of AN compared to healthy controls (HCs). A second aim was to examine whether attention to faces and comorbid psychopathology predicted emotion recognition abilities. The films expressions task was administered to 148 participants (46 AN, 51 recovered AN, 51 HC) to assess emotion recognition, during which attention to faces was recorded using eye-tracking. Comorbid psychopathology was assessed using self-report questionnaires and the Autism Diagnostic Observation Schedule, 2nd edition (ADOS-2). No significant differences in emotion recognition abilities or attention to faces were found between groups. However, individuals with a lifetime history of AN who scored above the clinical cut-off on the ADOS-2 displayed poorer emotion recognition performance than those scoring below the cut-off and than HCs. ADOS-2 scores significantly predicted emotion recognition abilities while controlling for group membership and intelligence. Difficulties in emotion recognition therefore appear to be associated with high autism spectrum disorder (ASD) traits rather than being a feature of AN per se. Whether individuals with AN and high ASD traits may require different treatment strategies or adaptations is a question for future research.
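The reported analysis, ADOS-2 scores predicting emotion recognition while controlling for group membership and intelligence, corresponds to a linear regression with dummy-coded group covariates. The sketch below runs that model structure on simulated data; every variable, coefficient, and value here is hypothetical and serves only to show the design matrix, not to reproduce the study's results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 148                              # sample size matching the abstract
ados = rng.normal(5, 2, n)           # hypothetical ADOS-2 scores
iq = rng.normal(100, 15, n)          # hypothetical intelligence scores
group = rng.integers(0, 3, n)        # 0 = AN, 1 = recovered AN, 2 = HC

# Design matrix: intercept, ADOS-2, IQ, and two group dummy variables
X = np.column_stack([
    np.ones(n),
    ados,
    iq,
    (group == 1).astype(float),
    (group == 2).astype(float),
])

# Simulated outcome: emotion recognition accuracy with a negative
# ADOS-2 effect built in (an assumption for illustration only)
y = 0.8 - 0.03 * ados + 0.001 * iq + rng.normal(0, 0.05, n)

# Ordinary least squares fit; beta[1] is the ADOS-2 coefficient
# adjusted for group membership and IQ
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Because group is entered as dummy variables alongside ADOS-2 and IQ, the ADOS-2 coefficient estimates its association with the outcome over and above those covariates, which is the logic of the analysis the abstract describes.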


Author(s): Maximilian Blomberg, Katja Schlegel, Linda Stoll, Hagen Febry, Wally Wünsch‐Leiteritz, ...
