Identification of Body Behaviors and Facial Expressions Associated with Induced Orthopedic Pain in Four Equine Pain Scales

Animals ◽  
2020 ◽  
Vol 10 (11) ◽  
pp. 2155
Author(s):  
Katrina Ask ◽  
Marie Rhodin ◽  
Lena-Mari Tamminen ◽  
Elin Hernlund ◽  
Pia Haubro Andersen

Equine orthopedic pain scales are targeted towards horses with moderate to severe orthopedic pain. Improved assessment of pain behavior and pain-related facial expressions at rest may refine orthopedic pain detection for mild lameness grades. Therefore, this study explored pain-related behaviors and facial expressions and sought to identify frequently occurring combinations. Orthopedic pain was induced by intra-articular LPS in eight horses, and objective movement asymmetry analyses were performed before and after induction, together with pain assessments at rest. Three observers independently assessed the horses in their box stalls, using four equine pain scales simultaneously. An increase in movement asymmetry after induction was used as a proxy for pain. Behaviors and facial expressions commonly co-occurred and were strongly associated with movement asymmetry. Posture-related scale items were the strongest predictors of movement asymmetry. Display of facial expressions at rest varied between horses but, when present, was strongly associated with movement asymmetry. Reliability of facial expression items was lower than that of behavioral items. These findings suggest that five body behaviors (posture, head position, location in the box stall, focus, and interactive behavior) should be included in a scale for live assessment of mild orthopedic pain. We also recommend inclusion of facial expressions in pain assessment.

Author(s):  
Sanjay Kumar Singh ◽  
V. Rastogi ◽  
S. K. Singh

Pain, often regarded as the fifth vital sign, is an important symptom that needs to be adequately assessed in health care. The visual changes reflected on the face of a person in pain may be apparent for only a few seconds and occur instinctively. Tracking these changes is a difficult and time-consuming process in a clinical setting, which motivates researchers and experts from the medical, psychology, and computer science fields to conduct interdisciplinary research on capturing facial expressions. This chapter contains a comprehensive review of technologies for the study of facial expression, along with their application in pain assessment. The facial expressions of pain in young children (0-2 years) and in non-communicative patients need to be recognized, as they are of utmost importance for proper diagnosis. Well-designed computerized methodologies would streamline the process of patient assessment, increasing its accessibility to physicians and improving quality of care.


2020 ◽  
Author(s):  
P. A. S. O. Silva

Pain analysis in newborns has become a relevant study subject over the last few decades, given the inability to objectively identify the source and intensity of pain in newborn babies. In recent years, several methods for pain detection and evaluation have been able to classify pain levels using facial expressions of newborn babies, through statistical models, machine learning, and deep learning. In this context, health professionals are increasingly interested in having computerized tools at their disposal that could not only accurately rank a newborn's potential pain level but also identify the facial regions of greatest relevance for a particular pain phenomenon. This dissertation's main objective is to develop a computational framework capable of recognizing and interpreting patterns in facial expressions for automated evaluation of pain levels in term babies. Specifically, this dissertation focuses on the investigation, implementation, and integration of a series of techniques, including image detection and segmentation, spatial normalization and, ultimately, the classification of facial expressions based on information obtained through statistical data mining. Finally, the framework developed here, evaluated with an accuracy (upper limit) of approximately 96% on the COPE database and 77% on the UNIFESP database, shows that it is possible not only to rank pain levels statistically from images of facial expressions, but also to identify key facial regions for certain pain phenomena, thereby assisting in the creation of more general and accurate pediatric pain scales.


Animals ◽  
2020 ◽  
Vol 10 (9) ◽  
pp. 1610
Author(s):  
Johannes van Loon ◽  
Nicole Verhaar ◽  
Els van den Berg ◽  
Sarah Ross ◽  
Janny de Grauw

Pain assessment is very important for monitoring welfare and quality of life in horses. To date, no studies have described pain scales for objective assessment of pain in foals. Studies in other species have shown that facial expression can be used in neonatal animals for objective assessment of acute pain. The aim of the current study was to adapt a facial expression-based pain scale for assessment of acute pain in mature horses for valid pain assessment in foals. The scale was applied to fifty-nine foals (20 patients and 39 healthy controls); animals were assessed from video recordings (30–60 s) by three observers, who were blinded to the condition of the animals. Patients were diagnosed with acute health problems by means of clinical examination and additional diagnostic procedures. EQUUS-FAP FOAL (Equine Utrecht University Scale for Facial Assessment of Pain in Foals) showed good inter- and intra-observer reliability (Cronbach’s alpha = 0.95 and 0.98, p < 0.001). Patients had significantly higher pain scores compared to controls (p < 0.001), and pain scores decreased after treatment with NSAIDs (meloxicam or flunixin meglumine IV) (p < 0.05). Our results indicate that a facial expression-based pain scale could be useful for the assessment of acute pain in foals. Further studies are needed to validate this pain scale.
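The reliability figures above are Cronbach's alpha values. As a minimal illustration of how alpha is computed from a subjects-by-raters score matrix (the scores below are made-up, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects x n_raters) score matrix.

    alpha = k/(k-1) * (1 - sum of per-rater variances / variance of totals)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of raters (items)
    item_vars = scores.var(axis=0, ddof=1)       # sample variance per rater
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: three observers scoring five foals
obs = np.array([
    [2, 2, 3],
    [5, 6, 5],
    [1, 1, 1],
    [4, 4, 5],
    [0, 1, 0],
])
print(round(cronbach_alpha(obs), 2))  # → 0.98 (high agreement)
```

When the raters rank the subjects consistently, as in this toy matrix, alpha approaches 1, matching the good inter-observer reliability reported for EQUUS-FAP FOAL.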


2020 ◽  
Author(s):  
Jonathan Yi ◽  
Philip Pärnamets ◽  
Andreas Olsson

Responding appropriately to others’ facial expressions is key to successful social functioning. Despite the large body of work on face perception and spontaneous responses to static faces, little is known about responses to faces in dynamic, naturalistic situations, and no study has investigated how goal-directed responses to faces are influenced by learning during dyadic interactions. To experimentally model such situations, we developed a novel method based on online integration of electromyography (EMG) signals from the participants’ face (corrugator supercilii and zygomaticus major) during facial expression exchange with dynamic faces displaying happy and angry facial expressions. Fifty-eight participants learned by trial and error to avoid receiving aversive stimulation by either reciprocating (congruently) or responding oppositely (incongruently) to the expression of the target face. Our results validated our method, showing that participants learned to optimize their facial behavior, and replicated earlier findings of faster and more accurate responses in congruent vs. incongruent conditions. Moreover, participants performed better on trials when confronted with smiling, as compared to frowning, faces, suggesting it might be easier to adapt facial responses to positively associated expressions. Finally, we applied drift diffusion and reinforcement learning models to provide a mechanistic explanation for our findings, which helped clarify the underlying decision-making processes of our experimental manipulation. Our results introduce a new method to study learning and decision-making in facial expression exchange, in which there is a need to gradually adapt facial expression selection to both social and non-social reinforcements.


2020 ◽  
Author(s):  
Joshua W Maxwell ◽  
Eric Ruthruff ◽  
Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic – the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (which were used by Tomasik) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free – identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when sufficiently unambiguous.


Healthcare ◽  
2021 ◽  
Vol 9 (2) ◽  
pp. 171
Author(s):  
Vera Olisarova ◽  
Valerie Tothova ◽  
Martin Cerveny ◽  
Vendula Dvorakova ◽  
Petr Sadilek

Pain is a medical and nursing problem that is common in surgical departments. Inadequate pain management can lead to patient distress, as well as extending the period in which the patient’s quality of life is reduced. The standardized SF-MPQ-2 questionnaire provides nurses with the opportunity to assess pain within a broader context. The aim of this descriptive and exploratory study was to describe the state of pain assessment in surgical patients in the South Bohemian Region and to highlight the benefits of using a standardized tool for proper pain assessment. The research was carried out using a quantitative survey within the South Bohemian Region (Czech Republic). The participants in the study were nurses working in surgical departments in hospitals in the region as well as hospitalized patients. The results show that nurses pay slightly more attention to pain assessments than doctors. We know that, generally, pain decreases with time after surgery. Nonetheless, returning pain, as well as continuous pain, can occur, both of which have an emotional component. The results of this study are directed at nurses and include a call for more effective pain management through improved assessment.


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 3956
Author(s):  
Youngsun Kong ◽  
Hugo F. Posada-Quintero ◽  
Ki H. Chon

The subjectiveness of pain can lead to inaccurate prescribing of pain medication, which can exacerbate drug addiction and overdose. Given that pain is often experienced in patients’ homes, there is an urgent need for ambulatory devices that can quantify pain in real time. We implemented three time- and frequency-domain electrodermal activity (EDA) indices in our smartphone application, which collects EDA signals using a wrist-worn device. We then evaluated our computational algorithms using thermal grill data from ten subjects. The thermal grill delivered a level of pain that was calibrated for each subject to be 8 out of 10 on a visual analog scale (VAS). Furthermore, we simulated the real-time processing of the smartphone application using a dataset pre-collected from another group of fifteen subjects who underwent pain stimulation using electrical pulses, which elicited a VAS pain score of 7 out of 10. All EDA features showed significant differences between painless and pain segments, defined as the 5-s segments before and after each pain stimulus, respectively. Random forest showed the highest accuracy in detecting pain, 81.5%, with 78.9% sensitivity and 84.2% specificity, using a leave-one-subject-out cross-validation approach. Our results show the potential of a smartphone application to provide near real-time objective pain detection.
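As a rough sketch of what such time- and frequency-domain EDA indices might look like, a 5-s segment can be reduced to a small feature vector before classification. The three indices and the frequency band below are illustrative stand-ins, not the paper's exact definitions:

```python
import numpy as np

def eda_features(segment, fs: float = 4.0) -> dict:
    """Reduce one EDA segment to illustrative time/frequency-domain indices."""
    seg = np.asarray(segment, dtype=float)
    mean_level = seg.mean()                 # tonic skin-conductance level
    diff_std = np.diff(seg).std()           # sample-to-sample variability
    # Power in a low-frequency band (band limits assumed for illustration)
    freqs = np.fft.rfftfreq(seg.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(seg - mean_level)) ** 2
    band_power = power[(freqs >= 0.045) & (freqs <= 0.25)].sum()
    return {"mean": mean_level, "diff_std": diff_std, "band_power": band_power}

# Synthetic 5-s segments sampled at 4 Hz: pain raises level and fluctuation
t = np.arange(0, 5, 1 / 4.0)
painless = 0.5 + 0.01 * np.sin(2 * np.pi * 0.1 * t)
pain = 0.8 + 0.10 * np.sin(2 * np.pi * 0.1 * t)
f_painless, f_pain = eda_features(painless), eda_features(pain)
```

Feature vectors like these, computed for the 5-s windows before and after each stimulus, are what a classifier such as the paper's random forest would consume.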


2021 ◽  
pp. 174702182199299
Author(s):  
Mohamad El Haj ◽  
Emin Altintas ◽  
Ahmed A Moustafa ◽  
Abdel Halim Boudoukha

Future thinking, the ability to project oneself forward in time to pre-experience an event, is intimately associated with emotions. We investigated whether emotional future thinking can activate emotional facial expressions. We invited 43 participants to imagine future scenarios, cued by the words “happy,” “sad,” and “city.” Future thinking was video recorded and analysed with facial analysis software to classify whether the facial expressions of participants (i.e., happy, sad, angry, surprised, scared, disgusted, and neutral) were neutral or emotional. Analysis demonstrated higher levels of happy facial expressions during future thinking cued by the word “happy” than by “sad” or “city.” In contrast, higher levels of sad facial expressions were observed during future thinking cued by the word “sad” than by “happy” or “city.” Higher levels of neutral facial expressions were observed during future thinking cued by the word “city” than by “happy” or “sad.” In all three conditions, levels of neutral facial expressions were high compared with happy and sad facial expressions. Taken together, emotional future thinking, at least for future scenarios cued by “happy” and “sad,” seems to trigger the corresponding facial expression. Our study provides an original physiological window into the subjective emotional experience during future thinking.


2021 ◽  
Vol 11 (4) ◽  
pp. 1428
Author(s):  
Haopeng Wu ◽  
Zhiying Lu ◽  
Jianfeng Zhang ◽  
Xin Li ◽  
Mingyue Zhao ◽  
...  

This paper addresses the problem of Facial Expression Recognition (FER), focusing on subtle facial movements. Traditional methods often suffer from overfitting or incomplete information due to insufficient data and manual feature selection. Instead, our proposed network, called the Multi-features Cooperative Deep Convolutional Network (MC-DCN), maintains focus on both the overall features of the face and the trends of its key parts. Video data are processed in a first stage: the ensemble of regression trees (ERT) method is used to obtain the overall contour of the face, and an attention model then picks out the parts of the face that are most susceptible to expression changes. The combined effect of these two methods yields an image that can be called a local feature map. After that, the video data are fed to MC-DCN, which contains parallel sub-networks. While the overall spatiotemporal characteristics of facial expressions are obtained from the image sequence, the selection of key parts better captures the changes in facial expressions brought about by subtle facial movements. By combining local and global features, the proposed method acquires more information, leading to better performance. Experimental results show that MC-DCN achieves recognition rates of 95%, 78.6% and 78.3% on the SAVEE, MMI, and edited GEMEP datasets, respectively.
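A minimal sketch of the attention-and-fusion idea described above, weighting candidate facial regions and concatenating the result with a whole-face descriptor. All shapes, the scoring function, and the fusion operator are hypothetical stand-ins, not the paper's actual architecture:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)

# Hypothetical descriptors: 5 candidate facial regions (64-d each) from the
# local branch, and a 128-d whole-face descriptor from the global branch.
part_feats = rng.standard_normal((5, 64))
global_feat = rng.standard_normal(128)

# Attention scores per region (a learned scorer in the real model; a fixed
# random projection here purely for illustration)
w = rng.standard_normal(64)
weights = softmax(part_feats @ w)     # sums to 1 over the 5 regions
local_feat = weights @ part_feats     # attention-weighted local feature

# Fuse local and global features before the final classifier layer
fused = np.concatenate([global_feat, local_feat])
```

The fused 192-d vector is what a downstream classifier would score; regions with higher attention weights contribute more to the local half of the vector.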

