Affective Computing and Interaction
Latest Publications

Total documents: 16 (five years: 0)
H-index: 4 (five years: 0)
Published by IGI Global
ISBN: 9781616928926, 9781616928940

Author(s):  
Jonathan Sykes

Video-games, like movies, music, and storybooks, are emotional artifacts. We buy media to alter our affective state; through consumption it impacts our physiology and thus alters our affective world. In this chapter the author reviews the ways in which playing games can elicit emotion. The chapter discusses the increased power of video-game technology to elicit affect and shows how the mash-up of traditional and interactive techniques has delivered a richness of emotion that competes with film and television. It concludes by looking forward to a time when video-games become the dominant medium and the preferred choice when seeking an emotional fix.


Author(s):  
Magy Seif El-Nasr ◽  
Jacquelyn Ford Morie ◽  
Anders Drachen

The interactive entertainment industry has become a multi-billion dollar industry, with revenues surpassing those of the movie industry (ESA, 2009). Beyond the demand for high-fidelity graphics or stylized imagery, participants in these environments have come to expect certain aesthetic and artistic qualities that engage them at a very deep emotional level. These qualities pertain to the visual aesthetic, dramatic structure, pacing, and sensory systems embedded within the experience. All of these qualities are carefully crafted by the creator of the interactive experience to evoke affect. In this chapter, the authors discuss the design techniques developed by artists to craft such emotionally engaging experiences. In addition, they take a scientific approach, discussing case studies of the use of these design techniques and experiments that attempt to validate their use in stimulating emotions.


Author(s):  
Karla Conn Welch ◽  
Uttama Lahiri ◽  
Nilanjan Sarkar ◽  
Zachary Warren ◽  
Wendy Stone ◽  
...  

This chapter covers the application of affective computing, using a physiological approach, to children with Autism Spectrum Disorders (ASD) during human-computer interaction (HCI) and human-robot interaction (HRI). Investigation into technology-assisted intervention for children with ASD has gained momentum in recent years. Clinicians involved in interventions must overcome the communication impairments generally exhibited by children with ASD by adeptly inferring the children's affective cues and adjusting the intervention accordingly. Similarly, an intelligent system, such as a computer or robot, must also be able to understand the affective needs of these children (an ability that current technology-assisted ASD intervention systems lack) to achieve effective interaction that addresses the role of affective states in HCI, HRI, and intervention practice.
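A physiological approach of the kind described above typically reduces raw signals to per-window features and maps them to a coarse affective state. The sketch below is a minimal illustration of that idea, not the authors' system; the feature names, baselines, and thresholds are hypothetical assumptions.

```python
# Illustrative sketch of physiology-based affect inference.
# Feature choices and thresholds are hypothetical, not from the chapter.

def extract_features(heart_rate, skin_conductance):
    """Summarize raw signal samples into simple per-window features."""
    mean_hr = sum(heart_rate) / len(heart_rate)
    mean_sc = sum(skin_conductance) / len(skin_conductance)
    return mean_hr, mean_sc

def classify_affect(mean_hr, mean_sc, hr_baseline=70.0, sc_baseline=2.0):
    """Map features to a coarse arousal label relative to a resting baseline."""
    if mean_hr > hr_baseline + 15 and mean_sc > sc_baseline + 1.0:
        return "high arousal"   # possible anxiety: adapt the interaction
    if mean_hr < hr_baseline - 5:
        return "low arousal"    # possible disengagement
    return "neutral"

hr = [95, 98, 102, 99]      # beats per minute over one window
sc = [3.4, 3.6, 3.5, 3.7]   # skin conductance in microsiemens
state = classify_affect(*extract_features(hr, sc))
```

A real system would learn such mappings from labeled data rather than use fixed thresholds, but the pipeline shape (signal, features, affect label, intervention adjustment) is the same.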


Author(s):  
Yuuki Kato ◽  
Douglass J. Scott ◽  
Shogo Kato

This chapter focuses on the roles of interpersonal closeness and gender in the interpretation and sending of emotions in mobile phone email messages. Ninety-one Japanese college students were shown scenarios involving either a friend or an acquaintance describing situations intended to evoke one of four emotions: happiness, sadness, anger, or guilt. The participants rated their emotions and composed replies for each scenario. Analysis revealed that in the happy and guilt scenarios, the emotions experienced by the participants were conveyed to their partners almost without change. However, in the sad and angry scenarios, the emotions sent to the partners were weaker than the emotions actually experienced. Gender analysis showed that men were more likely to experience and express anger in the anger scenario, while women were more likely to experience and express sadness in the anger scenario. In addition, more of the women's replies contained emotional expressions than the men's messages did.


Author(s):  
N Korsten ◽  
JG Taylor

In order to achieve 'affective computing' it is necessary to know what is being computed. That is, in order to compute with what would pass for human emotions, it is necessary to have a computational basis for the emotions themselves. What does it mean, quantitatively, if a human is sad or angry? How is this affective state computed in their brain? This question, concerning the very core of the computational nature of human emotions, is addressed in this chapter. A proposal for this computational basis is made, building on the well-established approach to emotions as arising from a specific human being's appraisal of a given situation or event.
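The appraisal approach mentioned above treats an emotion as the output of evaluating a situation along a few dimensions. The sketch below is a simplified illustration of that idea, not the authors' model; the dimensions and the mapping rules are hypothetical placeholders.

```python
# Illustrative appraisal-based mapping from evaluated dimensions of a
# situation to a basic emotion label. Rules here are simplified assumptions.

def appraise(goal_congruent, caused_by_other, certain_outcome):
    """Map three binary appraisal dimensions to a coarse emotion label."""
    if goal_congruent:
        return "happiness"
    if caused_by_other:
        return "anger"      # goal-incongruent event blamed on another agent
    if certain_outcome:
        return "sadness"    # goal-incongruent, irrevocable outcome
    return "fear"           # goal-incongruent, uncertain outcome

emotion = appraise(goal_congruent=False, caused_by_other=True,
                   certain_outcome=True)
```

Full appraisal theories use graded, multi-dimensional evaluations rather than binary flags, but the structure (situation, appraisal, emotion) is what gives emotions a computational basis at all.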


Author(s):  
Matthias Scheutz

This chapter examines the use of affective control to support the survival of agents in competitive multi-agent environments. The author introduces simple affective control mechanisms for simple agents, which yield high performance both in ordinary foraging tasks (e.g., searching for food) and in social encounters (e.g., competition for mates). In the proposed case, affective control via the transmission of simple signals can lead to social coordination, obviating the need for more complex forms of communication, such as symbolic communication based on systematic representational schemata.
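A hypothetical illustration of how simple affective signals can coordinate agents without symbols, in the spirit of the mechanism described above (this is not the chapter's simulation; the state variables and thresholds are invented for the example):

```python
# One agent whose internal "fear" crosses a threshold broadcasts a one-bit
# alarm; receivers raise their own arousal and withdraw. Coordination emerges
# from signal transmission alone, with no symbolic content.

class Agent:
    def __init__(self, fear=0.0):
        self.fear = fear
        self.fleeing = False

    def maybe_signal(self, threshold=0.7):
        """Broadcast an alarm if internal fear exceeds the threshold."""
        return self.fear > threshold

    def receive_alarm(self):
        """Raise own fear on hearing an alarm and switch to fleeing."""
        self.fear = min(1.0, self.fear + 0.5)
        self.fleeing = True

agents = [Agent(fear=0.9), Agent(fear=0.1), Agent(fear=0.2)]
if any(a.maybe_signal() for a in agents):
    for a in agents[1:]:    # every other agent hears the alarm
        a.receive_alarm()
```

The point of the example is that a single broadcast bit, driven by one agent's affective state, changes the behavior of the whole group.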


Author(s):  
Alessandro Vinciarelli ◽  
Gelareh Mohammadi

Nonverbal communication is the main channel through which we experience the inner life of others, including their emotions, feelings, moods, and social attitudes. It attracts the interest of the computing community because it is based on cues (facial expressions, vocalizations, gestures, postures, and so on) that we perceive with our senses and that can be (and often are) detected, analyzed, and synthesized with automatic approaches. In other words, nonverbal communication can serve as a viable interface between computers and some of the most important aspects of human psychology, such as emotions and social attitudes. As a result, a new computing domain seems to be emerging that we can call the "technology of nonverbal communication". This chapter outlines some of the most salient aspects of this potentially new domain and discusses some of its most important perspectives for the future.


Author(s):  
Annette Hohenberger

In this chapter, language development is discussed within a social-emotional framework. Children's language processing is gated by social and emotional aspects of the interaction, such as affective prosody and facial expression, contingent reactions, and joint attention. Infants and children attend to both cognitive and affective aspects in language perception ("language" vs. "paralanguage") and in language production ("effort" vs. "engagement"). Deaf children acquiring a sign language go through the same developmental milestones in this respect. Independently of modality, a tripartite developmental sequence emerges: (i) an undifferentiated, affect-dominated system governs the child's behavior; (ii) a cognitive, language-dominated system emerges that attenuates the affective system; (iii) emotional expression is re-integrated into cognition and language. This tightly integrated cognitive-affective language system is characteristic of adults. Evolutionary scenarios that might underlie its ontogeny are discussed. The emotional context of learning might also influence the course and outcome of L2 learning.


Author(s):  
Aysen Erdem ◽  
Serkan Karaismailoglu

Emotions embody goal-directed behavior for survival and adaptation through the perception of variations in the environment. At a physiological level, emotions consist of three complementary components: physical sensation, emotional expression, and subjective experience. At the level of anatomical structures, however, segregating distinct components is impossible. Our emotions are the product of compatible and coordinated cortical and sub-cortical neural mechanisms originating in several anatomical structures. In this chapter, an overview of the three physiological components and the underlying anatomical constructs is presented.


Author(s):  
Ioan Buciu ◽  
Ioan Nafornita ◽  
Cornelia Gordan

Living in a computer era, synergy between human and machine is a must, as computers are integrated into our everyday life. Computers surround us, but their interfaces are far from friendly. One possible approach to creating a friendlier human-computer interface is to build an emotion-sensitive machine that can recognize a human facial expression with a satisfactory classification rate and, eventually, synthesize an artificial facial expression onto embodied conversational agents (ECAs), defined as friendly and intelligent user interfaces built to mimic human gestures, speech, or facial expressions. Computer scientists working in human-computer interaction (HCI) have put impressive effort into creating a fully automatic system capable of identifying and generating photo-realistic human facial expressions through animation. This chapter presents the current state-of-the-art techniques and approaches developed over time to deal with facial expression synthesis and animation. The topic's importance is further highlighted through modern applications, including multimedia applications. The chapter ends with a discussion of open problems.
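Recognition systems of the kind this chapter surveys typically reduce a face image to a feature vector and assign it to the nearest expression class. The sketch below illustrates that final classification step only; the three feature dimensions and the centroid values are hypothetical, and real systems derive features from facial landmarks, appearance descriptors, or learned embeddings.

```python
# Illustrative nearest-centroid classification of a facial expression from a
# precomputed feature vector. Features and centroids are invented assumptions.

import math

# Hypothetical per-expression centroids in a 3-D feature space
# (e.g., mouth-corner lift, brow raise, eye openness).
CENTROIDS = {
    "happy":    (0.9, 0.2, 0.6),
    "sad":      (0.1, 0.1, 0.3),
    "surprise": (0.5, 0.9, 0.9),
}

def classify_expression(features):
    """Return the expression whose centroid is closest in Euclidean distance."""
    return min(CENTROIDS,
               key=lambda label: math.dist(features, CENTROIDS[label]))

label = classify_expression((0.85, 0.25, 0.55))  # lies near the "happy" centroid
```

The same feature space can drive synthesis in the opposite direction: interpolating from a neutral vector toward an expression centroid gives an ECA a graded version of that expression.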

