Prototypicality and Intensity of Emotional Faces using an Anchor-Point Method

2013 ◽  
Vol 16 ◽  
Author(s):  
Álvaro Sánchez ◽  
Carmelo Vázquez

Abstract: Emotional faces are commonly used as stimuli in a wide range of research fields. The present study provides values for 198 pictures from one of the largest available face databases, the Karolinska Directed Emotional Faces (KDEF). We used a new validation strategy that consisted of presenting pairs of faces, each including an emotional face (i.e., angry, happy, sad) and the corresponding neutral face from the same model. This design allowed participants to keep a comparison face (i.e., neutral) as a constant anchor point for evaluating parameters of each emotional expression presented. Raters were asked to judge both the prototypicality of the emotional expressions (i.e., the degree to which they represent their corresponding emotional prototypes) and their emotional intensity. We conclude by discussing the potential advantages of this anchor-point method as a system for eliciting judgments on facial emotional expressions.

2009 ◽  
Vol 40 (6) ◽  
pp. 911-919 ◽  
Author(s):  
E. Pomarol-Clotet ◽  
F. Hynes ◽  
C. Ashwin ◽  
E. T. Bullmore ◽  
P. J. McKenna ◽  
...  

Background: Identification of facial emotions has been found to be impaired in schizophrenia, but there are uncertainties about the neuropsychological specificity of the finding. Method: Twenty-two patients with schizophrenia and 20 healthy controls were given tests requiring identification of facial emotion, judgement of the intensity of emotional expressions without identification, familiar face recognition, and the Benton Facial Recognition Test (BFRT). The schizophrenia patients were selected to be relatively intellectually preserved. Results: The patients with schizophrenia showed no deficit in identifying facial emotion, although they were slower than the controls. They were, however, impaired on judging the intensity of emotional expression without identification. They showed impairment in recognizing familiar faces but not on the BFRT. Conclusions: When steps are taken to reduce the effects of general intellectual impairment, there is no deficit in identifying facial emotions in schizophrenia. There may, however, be a deficit in judging emotional intensity. The impairment found in naming familiar faces is consistent with other evidence of semantic memory impairment in the disorder.


2018 ◽  
Vol 4 (1) ◽  
Author(s):  
Jonathan C. Corbin ◽  
L. Elizabeth Crawford

An emotional expression can be misremembered as more similar to previously seen expressions than it actually was, demonstrating inductive category effects for emotional expressions. Given that memory is influenced over time, we sought to determine whether memory for a single expression would be similarly influenced by other expressions viewed simultaneously. In other words, we tested whether the ability to encode statistical features of an ensemble (i.e., ensemble encoding) is leveraged when attempting to recall a single expression from the ensemble. In three preregistered experiments, participants saw an ensemble of four expressions: one neutral and three either happy or sad. After a delay, participants were asked to reproduce the neutral face by adjusting a response face's expression. In Experiment 1, the ensemble comprised images of the same actor; in Experiments 2 and 3, the images depicted individuals of varying race and gender. In each experiment we demonstrated that, even after only a single exposure, memory for the neutral expression in the happy group was biased happier relative to the same expression in the sad group. Data and syntax can be found at https://osf.io/gcbez/.


2018 ◽  
Author(s):  
Andras N. Zsido ◽  
Virag Ihasz ◽  
Annekathrin Schacht ◽  
Nikolett Arato ◽  
Orsolya Inhof ◽  
...  

Previous studies investigating the advantage of emotional expressions in visual processing in preschool children have used only adult faces. However, children perceive facial expressions of emotion differently when displayed on adults' faces compared with children's faces. In the present study, preschoolers (N=43, mean age=5.65) and adults (N=37, mean age=21.8) had to find a target face displaying an emotional expression among eight neutral faces. The gender of the faces (boy or girl) was also manipulated. Happy faces were found the fastest in both samples. Children detected the angry face faster than the fearful one, while adults showed the opposite pattern. However, an interaction in the adult sample suggests that this holds only for girls' faces; the difference was nonsignificant for boys' faces. In both samples, detection was faster with boys' faces compared with girls' faces for all emotions. We suggest that the happy face may have an advantage in visual processing due to its importance in social situations.


2021 ◽  
Vol 11 ◽  
Author(s):  
Aurélie Bochet ◽  
Martina Franchini ◽  
Nada Kojovic ◽  
Bronwyn Glaser ◽  
Marie Schaer

Diminished orienting to social stimuli, and particularly to faces, is a core feature of autism spectrum disorders (ASDs). Impaired face processing has been linked to atypical attention processes that trigger a cascade of pathological development contributing to impaired social communication. The aim of the present study is to explore the processing of emotional and neutral faces using an eye-tracking paradigm (the emotional faces task) with a group of 24 children with ASD aged 6 and under and a group of 22 age-matched typically developing (TD) children. We also measure habituation to faces in both groups based on the presentation of repeated facial expressions. Specifically, the task consists of 32 pairs of faces, a neutral face and an emotional face of the same identity, shown side by side on the screen. We observe differential exploration of emotional faces in preschoolers with ASD compared with their TD peers. Participants with ASD make fewer fixations on emotional faces than their TD peers, and the duration of their first fixation on emotional faces is equivalent to that of their first fixation on neutral faces. These results suggest that emotional faces may be less interesting for children with ASD. We also observe a habituation process to neutral faces in both groups: children with ASD and TD children looked less at neutral faces during the last quarter of the task than during the first quarter. By contrast, TD children show increased interest in emotional faces throughout the task, looking slightly more at emotional faces during the last quarter of the task than during the first quarter. Children with ASD demonstrate neither habituation to nor increased interest in the changing emotional expressions over the course of the task, looking at the stimuli for equivalent time throughout. This lack of increased interest in emotional faces may suggest a lack of sensitivity to changes in expression in young children with ASD.


2015 ◽  
Vol 22 (12) ◽  
pp. 1123-1130 ◽  
Author(s):  
Orrie Dan ◽  
Sivan Raz

Objective: The present study investigated differences in emotional face processing between adolescents (ages 15-18) with ADHD-Combined type (ADHD-CT) and typically developing controls. Method: Participants completed a visual emotional task in which they were asked to rate the degree of negativity/positivity of four facial expressions (taken from the NimStim face stimulus set). Results: Participants' ratings, rating variability, response times (RTs), and RT variability were analyzed. Results showed a significant interaction between group and the type of presented stimuli. Adolescents with ADHD-CT discriminated less between positive and negative emotional expressions than those without ADHD. In addition, adolescents with ADHD-CT exhibited greater variability in their RTs and in their ratings of facial expressions compared with controls. Conclusion: The present results lend further support to the existence of a specific deficit or alteration in the processing of emotional face stimuli among adolescents with ADHD-CT.


Author(s):  
Eleonora Cannoni ◽  
Giuliana Pinto ◽  
Anna Silvia Bombi

Abstract: This study aimed to verify whether children introduce emotional expressions in their drawings of human faces, whether a preferential expression exists, and whether children's pictorial choices change with increasing age. To this end we examined the human figure drawings made by 160 boys and 160 girls, equally divided into 4 age groups: 6-7, 8-9, 10-11, and 12-13 years; mean ages in months (SD in parentheses) were 83.30 (6.54), 106.14 (7.16), 130.49 (8.26), and 155.40 (6.66). Drawings were collected with the Draw-a-Man test instructions, i.e., without mentioning an emotional characterization. In light of data from previous studies of emotion drawing on request, and the literature on preferred emotional expressions, we expected that an emotion would be portrayed even by the younger participants, and that the preferred emotion would be happiness. We also expected that, with the improving ability to take into account the appearance of both mouth and eyes, other expressions would appear besides the smiling face. Data were submitted to non-parametric tests to compare the frequencies of expressions (absolute and by age) and the frequencies of visual cues (absolute and by age and expression). The results confirmed that only a small number of faces were expressionless, and that the most frequent emotion was happiness. However, with increasing age this representation gave way to a variety of basic emotions (sadness, fear, anger, surprise), whose representation may depend on the ability to modify the shapes of both eyes and mouth and on the changing communicative aims of the child.


2017 ◽  
Vol 11 (1) ◽  
pp. 27-38 ◽  
Author(s):  
Caruana Fausto

A common view in affective neuroscience considers emotions a multifaceted phenomenon constituted by independent affective and motor components. This dualistic connotation, obtained by rephrasing the classic Darwinian and Jamesian theories of emotion, leads to the assumption that emotional expression is controlled by motor centers in the anterior cingulate, frontal operculum, and supplementary motor area, whereas emotional experience depends on interoceptive centers in the insula. Recent stimulation studies provide a different perspective. I will outline two sets of findings. First, affective experiences can also be elicited by stimulating motor centers. Second, emotional expressions can be elicited by stimulating interoceptive regions. Echoing the original pragmatist theories of emotion, I will make a case for the notion that emotional experience emerges from the integration of sensory and motor signals encoded in the same functional network.


2011 ◽  
Vol 1 (3) ◽  
pp. 381-400 ◽  
Author(s):  
Roque V. Mendez ◽  
Reiko Graham ◽  
Heidi Blocker ◽  
Janine Harlow ◽  
Adriana Campos

An empathy scale developed in Mexico (Diaz-Loving, Andrade-Palos, & Nadelsticher-Mitrani, 1986) was translated and validated in a U.S. sample. The Mexican scale and Davis's Interpersonal Reactivity Index shared conceptually similar constructs. However, there were differences. In particular, a unique Mexican factor, Empatía Cognoscitiva, which we called Prescience, had not been identified in previous empathy scales. It appeared to measure empathic accuracy, an individual's purported knowledge of others' feelings and moods. In a second study, we tested individuals' sensitivity in detecting subtle changes in emotional expressions, and found that individuals who scored highly on this factor were not necessarily more accurate at detecting emotions, but took significantly more time to look at fearful and angry faces. The results of a third study suggest that this was not due to enhanced attentional capture by negative emotional faces. In a final study, we found that purported accuracy was based on self-presentational concerns. Validation of this factor provides a clearer understanding of its cognitive and motivational properties and its future uses.


Author(s):  
Michela Balconi

Neuropsychological studies have highlighted distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear specific to emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among other findings, recent studies have explained the contribution of the hemispheres to face comprehension as a function of the type of emotion (mainly the positive vs. negative distinction) and of the specific task (comprehending vs. producing facial expressions). Specifically, an overview of ERP (event-related potential) analyses is proposed in order to understand how an observer may process a face and render it a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces under conscious vs. unconscious processing.
