Goats prefer positive human emotional facial expressions

2018 ◽  
Vol 5 (8) ◽  
pp. 180491 ◽  
Author(s):  
Christian Nawroth ◽  
Natalia Albuquerque ◽  
Carine Savalli ◽  
Marie-Sophie Single ◽  
Alan G. McElligott

Domestication has shaped the physiology and the behaviour of animals to better adapt to human environments. Therefore, human facial expressions may be highly informative for animals domesticated for working closely with people, such as dogs and horses. However, it is not known whether other animals, and particularly those domesticated primarily for production, such as goats, are capable of perceiving human emotional cues. In this study, we investigated whether goats can distinguish human facial expressions when simultaneously shown two images of an unfamiliar human with different emotional valences (positive/happy or negative/angry). Both images were vertically attached to a wall on one side of a test arena, 1.3 m apart, and goats were released from the opposite side of the arena (distance of 4.0 m) and were free to explore and interact with the stimuli during the trials. Each of four test trials lasted 30 s. Overall, we found that goats preferred to interact first with happy faces, meaning that they are sensitive to human facial emotional cues. Goats interacted first, more often and for longer duration with positive faces when they were positioned on the right side. However, no preference was found when the positive faces were placed on the left side. We show that animals domesticated for production can discriminate human facial expressions with different emotional valences and prefer to interact with positive ones. Therefore, the impact of domestication on animal cognitive abilities may be more far-reaching than previously assumed.

2009 ◽  
Vol 21 (7) ◽  
pp. 1321-1331 ◽  
Author(s):  
Martin Lotze ◽  
Matthias Reimold ◽  
Ulrike Heymans ◽  
Arto Laihinen ◽  
Marianne Patt ◽  
...  

Recent findings point to a perceptive impairment of emotional facial expressions in patients diagnosed with Parkinson disease (PD). In these patients, administration of dopamine can modulate emotional facial recognition. We used fMRI to investigate differences in the functional activation in response to emotional and nonemotional gestures between PD patients and age-matched healthy controls (HC). In addition, we used PET to evaluate the striatal dopamine transporter availability (DAT) with [11C]d-threo-methylphenidate in the patient group. Patients showed an average decrease to 26% in DAT when compared to age-corrected healthy references. Reduction in the DAT of the left putamen correlated not only with motor impairment but also with errors in emotional gesture recognition. In comparison to HC, PD patients showed a specific decrease in activation related to emotional gesture observation in the left ventrolateral prefrontal cortex (VLPFC) and the right superior temporal sulcus. Moreover, the less DAT present in the left putamen, the lower the activation in the left VLPFC. We conclude that a loss of dopaminergic neurotransmission in the putamen results in a reduction of ventrolateral prefrontal access involved in the recognition of emotional gestures.


Author(s):  
Izabela Krejtz ◽  
Krzysztof Krejtz ◽  
Katarzyna Wisiecka ◽  
Marta Abramczyk ◽  
Michał Olszanowski ◽  
...  

Abstract The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.
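The ambient–focal dynamics mentioned in this abstract are commonly quantified with coefficient K (a measure associated with K. Krejtz and colleagues): the difference between the z-scored duration of each fixation and the z-scored amplitude of the saccade that follows it. Positive values indicate focal viewing (long fixations, short saccades); negative values indicate ambient scanning. A minimal sketch, assuming fixation durations (ms) and following-saccade amplitudes (degrees) arrive pre-paired; this illustrates the measure, not the authors' analysis code:

```python
import statistics

def coefficient_k(fix_durations, saccade_amplitudes):
    """Per-fixation ambient-focal coefficients K_i: z-scored fixation
    duration minus z-scored amplitude of the following saccade.
    K_i > 0 suggests focal processing, K_i < 0 ambient scanning.
    Inputs are pre-paired: saccade_amplitudes[i] follows fix_durations[i]."""
    mu_d, sd_d = statistics.mean(fix_durations), statistics.stdev(fix_durations)
    mu_a, sd_a = statistics.mean(saccade_amplitudes), statistics.stdev(saccade_amplitudes)
    return [(d - mu_d) / sd_d - (a - mu_a) / sd_a
            for d, a in zip(fix_durations, saccade_amplitudes)]

# Long fixations followed by short saccades (focal viewing) alternating
# with short fixations followed by long saccades (ambient scanning):
ks = coefficient_k([600, 150, 550, 120], [0.5, 8.0, 0.7, 7.5])
```

Note that when the z-scores are computed from a single trial's own statistics, the K_i values sum to zero by construction, so in practice the series is normalized per participant and examined over time windows (for example, early versus late stages of recognition, as in the study above).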


Animals ◽  
2020 ◽  
Vol 10 (1) ◽  
pp. 164 ◽  
Author(s):  
Anne Schrimpf ◽  
Marie-Sophie Single ◽  
Christian Nawroth

Dogs and cats use human emotional information directed towards an unfamiliar situation to guide their behavior, a phenomenon known as social referencing. It is not clear whether other domestic species show similar socio-cognitive abilities when interacting with humans. We investigated whether horses (n = 46) use human emotional information to adjust their behavior towards a novel object, and whether this behavior differed depending on breed type. Horses were randomly assigned to one of two groups: an experimenter positioned in the middle of a test arena directed gaze and voice towards the novel object with either (a) a positive or (b) a negative emotional expression. The time subjects spent in different positions relative to the experimenter and the object in the arena, the frequency of gazing behavior, and physical interactions (with either the object or the experimenter) were analyzed. Horses in the positive condition spent more time between the experimenter and the object than horses in the negative condition, indicating less avoidance behavior towards the object. Horses in the negative condition gazed more often towards the object than horses in the positive condition, indicating increased vigilance. Breed types differed in their behavior: thoroughbreds showed less human-directed behavior than warmbloods and ponies. Our results provide evidence that horses use emotional cues from humans to guide their behavior towards novel objects.


2020 ◽  
Vol 51 (5) ◽  
pp. 685-711
Author(s):  
Alexandra Sierra Rativa ◽  
Marie Postma ◽  
Menno Van Zaanen

Background. Empathic interactions with animated game characters can help improve user experience, increase immersion, and achieve better affective outcomes related to the use of the game. Method. We used a 2×2 between-participants design with an additional control condition to analyze the impact of the visual appearance of a virtual game character on empathy and immersion. The four experimental conditions of game-character appearance were: natural (virtual animal) with expressiveness (emotional facial expressions), natural (virtual animal) without expressiveness (no emotional facial expressions), artificial (virtual robotic animal) with expressiveness (emotional facial expressions), and artificial (virtual robotic animal) without expressiveness (no emotional facial expressions). The control condition contained a baseline amorphous game character. 100 participants aged 18 to 29 years (M = 22.47) were randomly assigned to one of the five experimental groups. Participants originated from several countries: Aruba (1), China (1), Colombia (3), Finland (1), France (1), Germany (1), Greece (2), Iceland (1), India (1), Iran (1), Ireland (1), Italy (3), Jamaica (1), Latvia (1), Morocco (3), Netherlands (70), Poland (1), Romania (2), Spain (1), Thailand (1), Turkey (1), United States (1), and Vietnam (1). Results. We found that congruence between the appearance and facial expressions of virtual animals (artificial + non-expressive and natural + expressive) leads to higher levels of self-reported situational empathy and immersion of players in a simulated environment compared to incongruent appearance and facial expressions. Conclusions. The results of this investigation showed an interaction effect between artificial/natural body appearance and facial expressiveness of a virtual character’s appearance. The evidence from this study suggests that the appearance of the virtual animal has an important influence on user experience.


Author(s):  
Chiara Ferrari ◽  
Lucile Gamond ◽  
Marcello Gallucci ◽  
Tomaso Vecchi ◽  
Zaira Cattaneo

Abstract. Converging neuroimaging and patient data suggest that the dorsolateral prefrontal cortex (DLPFC) is involved in emotional processing. However, it is still not clear whether the DLPFC in the left and right hemisphere is differentially involved in emotion recognition depending on the emotion considered. Here we used transcranial magnetic stimulation (TMS) to shed light on the possible causal role of the left and right DLPFC in encoding valence of positive and negative emotional facial expressions. Participants were required to indicate whether a series of faces displayed a positive or negative expression, while TMS was delivered over the right DLPFC, the left DLPFC, and a control site (vertex). Interfering with activity in both the left and right DLPFC delayed valence categorization (compared to control stimulation) to a similar extent irrespective of emotion type. Overall, we failed to demonstrate any valence-related lateralization in the DLPFC by using TMS. Possible methodological limitations are discussed.


2011 ◽  
Vol 41 (11) ◽  
pp. 2375-2384 ◽  
Author(s):  
F. Ashworth ◽  
A. Pringle ◽  
R. Norbury ◽  
C. J. Harmer ◽  
P. J. Cowen ◽  
...  

Background. Processing emotional facial expressions is of interest in eating disorders (EDs) as impairments in recognizing and understanding social cues might underlie the interpersonal difficulties experienced by these patients. Disgust and anger are of particular theoretical and clinical interest. The current study investigated the neural response to facial expressions of anger and disgust in bulimia nervosa (BN). Method. Participants were 12 medication-free women with BN in an acute episode (mean age 24 years), and 16 age-, gender- and IQ-matched healthy volunteers (HVs). Functional magnetic resonance imaging (fMRI) was used to examine neural responses to angry and disgusted facial expressions. Results. Compared with HVs, patients with BN had a decreased neural response in the precuneus to facial expressions of both anger and disgust and a decreased neural response to angry facial expressions in the right amygdala. Conclusions. The neural response to emotional facial expressions in BN differs from that found in HVs. The precuneus response may be consistent with the application of mentalization theory to EDs, and the amygdala response with relevant ED theory. The findings are preliminary, but novel, and require replication in a larger sample.


2021 ◽  
Vol 11 (9) ◽  
pp. 1203 ◽  
Author(s):  
Sara Borgomaneri ◽  
Francesca Vitale ◽  
Simone Battaglia ◽  
Alessio Avenanti

The ability to rapidly process others’ emotional signals is crucial for adaptive social interactions. However, to date it is still unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights on this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time point (300 ms) or in the left motor cortex (M1). These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.
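Corticospinal excitability in single-pulse TMS paradigms like this one is typically read out as the peak-to-peak amplitude of the motor-evoked potential (MEP) in a short EMG window after the pulse, then expressed relative to a neutral or baseline condition. A minimal sketch of that readout; the window bounds and function names are illustrative assumptions, not the authors' pipeline:

```python
def mep_amplitude(emg, fs, pulse_onset_s, window_s=(0.015, 0.045)):
    """Peak-to-peak motor-evoked-potential amplitude (same units as
    `emg`) in a post-pulse window; `fs` is the sampling rate in Hz."""
    start = int((pulse_onset_s + window_s[0]) * fs)
    stop = int((pulse_onset_s + window_s[1]) * fs)
    segment = emg[start:stop]
    return max(segment) - min(segment)

def excitability_ratio(condition_meps, neutral_meps):
    """Mean MEP amplitude in an emotion condition relative to neutral;
    values > 1 indicate enhanced corticospinal excitability."""
    return (sum(condition_meps) / len(condition_meps)) / \
           (sum(neutral_meps) / len(neutral_meps))
```

For example, a biphasic deflection peaking at +1.2 and dipping to -0.8 within the window yields an amplitude of 2.0; ratios computed per hemisphere and time point would then support the kind of comparison reported above.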


2021 ◽  
Author(s):  
Harisu Abdullahi Shehu ◽  
Will N. Browne ◽  
Hedwig Eisenbarth

Partial face coverings such as sunglasses and facemasks have now become the ‘new norm’, especially since the spread of infectious diseases. Unintentionally, they obscure facial expressions, and both humans and artificial systems have consequently been found to be less accurate in emotion categorization. However, it is unknown how similarly the performance of humans and of artificial systems is affected when both are tested on the exact same stimuli, varying systematically in the type of covering. Such a systematic, direct comparison would allow conclusions about the relevant facial features in a naturalistic context. We therefore investigated the impact of facemasks and sunglasses on the ability to categorize emotional facial expressions in humans and artificial systems. Artificial systems, represented by the VGG19 deep-learning algorithm, and humans assessed images of people with varying emotional facial expressions and with four different types of covering, i.e. unmasked (original images), mask (covering the lower face), partial mask (with a transparent mouth window), and sunglasses. Artificial systems performed significantly better than humans when no covering was present (> 15% difference). However, the accuracy achieved by both humans and artificial systems differed significantly depending on the type of covering and, importantly, the emotion; e.g. sunglasses reduced the accuracy of fear recognition in humans. It was also noted that while humans mainly classified unknown expressions as neutral across all coverings, the misclassifications of the artificial systems varied. These findings show that humans and artificial systems classify and misclassify various emotional expressions differently depending on both the type of face covering and the type of emotion.
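At the analysis stage, the human-versus-VGG19 comparison described above reduces to tallying, per covering type, the categorization accuracy and the dominant misclassification for each system. A minimal sketch of that tallying; the record layout and names are assumptions for illustration, not the authors' code:

```python
from collections import Counter, defaultdict

def accuracy_by_covering(records):
    """records: (covering, true_emotion, predicted_emotion) triples.
    Returns {covering: (accuracy, most_common_misclassification)}."""
    hits, totals = Counter(), Counter()
    errors = defaultdict(Counter)
    for covering, true, pred in records:
        totals[covering] += 1
        if pred == true:
            hits[covering] += 1
        else:
            errors[covering][pred] += 1
    return {c: (hits[c] / totals[c],
                errors[c].most_common(1)[0][0] if errors[c] else None)
            for c in totals}

# Toy responses echoing the reported human pattern of defaulting
# to "neutral" when an expression cannot be read:
records = [
    ("sunglasses", "fear", "neutral"),
    ("sunglasses", "happy", "happy"),
    ("mask", "anger", "anger"),
    ("mask", "fear", "neutral"),
    ("mask", "sad", "neutral"),
]
summary = accuracy_by_covering(records)
```

Run separately on human and model responses, the same tally exposes both the accuracy gap per covering and the qualitative difference in misclassification patterns the abstract describes.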


Author(s):  
Maida Koso-Drljević ◽  
Meri Miličević

The aim of the study was to test two hypotheses about the lateralization of the processing of emotional facial expressions, the right-hemisphere dominance hypothesis and the valence hypothesis, and to examine the influence of the gender of the presented stimulus (chimera) and of depression as an emotional state of the participants. The sample consisted of 83 female students with an average age of 20 years. Participants solved the Task of Recognizing Emotional Facial Expressions on a computer and then completed the Depression subscale of the DASS-21. The results partially confirmed the valence hypothesis for the dependent variable of response accuracy. Participants recognized sadness more accurately than happiness when it was presented on the left side of the face, which is consistent with the valence hypothesis, according to which the right hemisphere is responsible for recognizing negative emotions. However, for the right side of the face, participants recognized sadness and happiness equally accurately, which is not consistent with the valence hypothesis. The main effect of chimera gender on response accuracy was statistically significant: recognition accuracy was higher for male chimeras than for female ones. For the dependent variable of reaction time, a statistically significant negative correlation was obtained between both sides of the face (left and right) and scores on the depression subscale: the higher the score on the depression subscale, the slower (longer) the reaction time to the presented chimera, on both the left and the right side.


2014 ◽  
pp. 1-6 ◽  
Author(s):  
Juan F. Cardona ◽  
Vladimiro Sinay ◽  
Lucia Amoruso ◽  
Eugenia Hesse ◽  
Facundo Manes ◽  
...  
