Keeping an eye on gestures: Visual perception of gestures in face-to-face communication

1999 ◽  
Vol 7 (1) ◽  
pp. 35-63 ◽  
Author(s):  
Marianne Gullberg ◽  
Kenneth Holmqvist

Since listeners usually look at the speaker's face, gestural information has to be absorbed through peripheral visual perception. In the literature, it has been suggested that listeners look at gestures under certain circumstances: 1) when the articulation of the gesture is peripheral; 2) when the speech channel is insufficient for comprehension; and 3) when the speaker him- or herself indicates that the gesture is worthy of attention. The research reported here employs eye tracking techniques to study the perception of gestures in face-to-face interaction. The improved control over the listener's visual channel allows us to test the validity of the above claims. We present preliminary findings substantiating claims 1 and 3, and relate them to theoretical proposals in the literature and to the issue of how visual and cognitive attention are related.

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
James Trujillo ◽  
Asli Özyürek ◽  
Judith Holler ◽  
Linda Drijvers

In everyday conversation, we are often challenged with communicating in non-ideal settings, such as in noise. Increased speech intensity and larger mouth movements are used to overcome noise in constrained settings (the Lombard effect). How we adapt to noise in face-to-face interaction, the natural environment of human language use, where manual gestures are ubiquitous, is currently unknown. We asked Dutch adults to wear headphones with varying levels of multi-talker babble while attempting to communicate action verbs to one another. Using quantitative motion capture and acoustic analyses, we found that (1) noise is associated with increased speech intensity and enhanced gesture kinematics and mouth movements, and (2) acoustic modulation only occurs when gestures are not present, while kinematic modulation occurs regardless of co-occurring speech. Thus, in face-to-face encounters the Lombard effect is not constrained to speech but is a multimodal phenomenon where the visual channel carries most of the communicative burden.


2014 ◽  
Vol 2 (3) ◽  
pp. 343-359 ◽  
Author(s):  
O. Kaminska ◽  
T. Foulsham

2019 ◽  
Author(s):  
João Vitor Macedo Romera ◽  
Rafael Nobre Orsi ◽  
Rodrigo Filev Maia ◽  
Carlos Eduardo Thomaz

This work investigates reading patterns based on the effects of Meares-Irlen Syndrome (SMI), a visual-perception deficit that indirectly affects our cognitive system. The most common symptoms of SMI in reading tasks are visual stress, a sensation of moving letters, and distortions in the text. We computationally simulated these effects and, using eye-tracking data from a number of participants, were able to linearly classify each effect with high accuracy.
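The "moving letters" effect the abstract describes can be simulated by perturbing each character's rendered position. The sketch below is a minimal, hypothetical illustration of that idea; the function name, offset range, and data layout are assumptions, not the authors' implementation.

```python
import random

def jitter_positions(text, max_offset=2, seed=42):
    """Simulate a 'moving letters' percept: return (char, dx, dy) tuples,
    where dx/dy are small random pixel offsets applied per letter.
    A renderer would draw each glyph displaced by its offset on every frame."""
    rng = random.Random(seed)
    return [
        (ch, rng.randint(-max_offset, max_offset),
             rng.randint(-max_offset, max_offset))
        for ch in text
    ]

# One jittered "frame" of a word; re-seeding per frame animates the letters.
frame = jitter_positions("reading")
print(frame)
```

Classifying which simulated effect a reader was exposed to would then reduce to fitting a linear model on eye-movement features (fixation durations, regressions, saccade lengths) recorded under each condition.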


2021 ◽  
Author(s):  
Zhong Zhao ◽  
Haiming Tang ◽  
Xiaobin Zhang ◽  
Xingda Qu ◽  
Jianping Lu

BACKGROUND Abnormal gaze behavior is a prominent feature of autism spectrum disorder (ASD). Previous eye tracking studies had participants watch images (i.e., pictures, videos and webpages), and the application of machine learning (ML) to these data showed promising results in identifying individuals with ASD. However, gaze behavior in face-to-face interaction differs from that in image viewing tasks, and no study has investigated whether natural social gaze behavior could accurately identify ASD. OBJECTIVE The objective of this study was to examine whether, and which, area of interest (AOI)-based features extracted from natural social gaze behavior could identify ASD. METHODS Both children with ASD and typically developing (TD) children were eye-tracked while engaged in a face-to-face conversation with an interviewer. Four ML classifiers (support vector machine, SVM; linear discriminant analysis, LDA; decision tree, DT; and random forest, RF) were used to determine the maximum classification accuracy and the corresponding features. RESULTS A maximum classification accuracy of 84.62% was achieved with three classifiers (LDA, DT and RF). Results showed that the mouth AOI, but not the eyes AOI, was a powerful feature for detecting ASD. CONCLUSIONS Natural gaze behavior can be leveraged to identify ASD, suggesting that ASD might be objectively screened with eye tracking technology in everyday social interaction. In addition, the comparison between our findings and previous ones suggests that the eye tracking features that identify ASD may be culture dependent and context sensitive.
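The four-classifier comparison described in METHODS can be sketched with scikit-learn. Everything below is illustrative: the synthetic "AOI fixation proportion" features, sample size, and class profiles are invented stand-ins for the study's real gaze data, used only to show the evaluation pattern.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 52  # illustrative sample size, not the study's

# Synthetic features: proportion of fixation time on [mouth, eyes, rest-of-face]
# AOIs. Dirichlet draws guarantee each row sums to 1, like real proportions.
X_td = rng.dirichlet([4, 3, 3], size=n // 2)   # hypothetical "TD-like" profile
X_asd = rng.dirichlet([2, 3, 5], size=n // 2)  # hypothetical "ASD-like" profile
X = np.vstack([X_td, X_asd])
y = np.array([0] * (n // 2) + [1] * (n // 2))  # 0 = TD, 1 = ASD

classifiers = {
    "SVM": SVC(kernel="linear"),
    "LDA": LinearDiscriminantAnalysis(),
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
}

# Compare classifiers by mean cross-validated accuracy, as in the study design.
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2%}")
```

Feature importance for an AOI (e.g. the mouth) could then be probed by dropping that column and re-running the comparison, or by inspecting the RF's `feature_importances_`.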


2016 ◽  
Vol 8 (2) ◽  
pp. 74
Author(s):  
Gufran Ahmad

Research studies on eye movements in information processing tasks, such as scene perception, have recently advanced our understanding of the underlying visual perception mechanisms and of human cognitive dynamics. In addition, business applications of eye tracking continually reveal new trends grounded in practical scenarios. In this study, we conducted a series of eye tracking experiments to test our hypothesis that, during scene perception, eye gazes guided by the associative relevance found within the context of a scene significantly support the process of decision making. Eye movement data collected from participants who viewed artistic scenes showed that gaze tracks traversed the existing associative relevance among the elements of a scene during decision making. This experimental evidence supported our hypothesis that eye gazes based on associative relevance assist decision making during scene perception.
