Erzählen inszenieren (Staging Narration)

2020 ◽  
Vol 104 (4) ◽  
pp. 59-81
Author(s):  
Maximilian Krug

Narrating is a crucial activity in theatre rehearsals. Through this activity, narratives are performed, expanded, reinterpreted, or even completely improvised. The communicative practices that theatre professionals use to develop a play as a theatrical narrative have rarely been researched, in either linguistics or theatre studies. This paper therefore addresses how actors, directors, and other members of a theatre production collectively develop monologues as self-contained narratives within a play. The research focuses on how narrators and listeners, as an interactional ensemble, use multimodal actions to realize such monologues. Surprisingly, the co-narrators do not appear to imagine their future audience but construct the narrations in situ with and for the members present. This observation becomes especially evident when mobile eye-tracking glasses measure the co-narrators' gaze behavior: members of a theatre rehearsal perform different activities (e.g., improvising, reading, prompting, instructing, discussing, monitoring) according to local interactional requirements. This paper illustrates the procedures with which theatre-makers produce monologues as multimodal narratives and highlights the differences that distinguish such narratives in theatre from spontaneous everyday storytelling.

2021 ◽  
Vol 11 (12) ◽  
pp. 5546
Author(s):  
Florian Heilmann ◽  
Kerstin Witte

Visual anticipation is essential for performance in sports. This review provides information on the differences between stimulus presentations and motor responses in eye-tracking studies and considers virtual reality (VR), a new possibility for presenting stimuli. A systematic literature search on PubMed, ScienceDirect, IEEE Xplore, and SURF was conducted. The number of studies examining the influence of stimulus presentation (in situ, video) is small but still sufficient to describe differences in gaze behavior. The seven reviewed studies indicate that stimulus presentations can cause differences in gaze behavior. Further research should focus on displaying game situations via VR. The advantages of a scientific approach using VR are experimental control and repeatability. In addition, game situations could be standardized and movement responses could be included in the analysis.


2021 ◽  
Vol 12 ◽  
Author(s):  
Ulrich Max Schaller ◽  
Monica Biscaldi ◽  
Anna Burkhardt ◽  
Christian Fleischhaker ◽  
Michael Herbert ◽  
...  

Face perception and emotion categorization are widely investigated under laboratory conditions that are devoid of real social interaction. Using mobile eye-tracking glasses in a standardized diagnostic setting while applying the Autism Diagnostic Observation Schedule (ADOS-2), we had the opportunity to record the gaze behavior of children and adolescents with and without Autism Spectrum Conditions (ASCs) during social interaction. The objective was to investigate differences in eye-gaze behavior among three groups of children and adolescents, either (1) with ASC, (2) with an unconfirmed diagnosis of ASC, or (3) with neurotypical development (NTD), during social interaction with an adult interviewer in a standard diagnostic situation using the ADOS-2. In a case-control study, we used mobile eye-tracking glasses in an ecologically valid and highly standardized diagnostic interview to investigate suspected cases of ASC. After completion of the ASC diagnostic gold standard including the ADOS-2, the participants were assigned to two groups based on their diagnosis (ASC vs. non-ASC) and compared with a matched group of neurotypically developed controls. The primary outcome measure was the percentage of total dwell time assessed for different areas of interest (AOIs) with regard to the face and body of a diagnostic interviewer and the surrounding space. Overall, 65 children and adolescents within an age range of 8.3–17.9 years were included in the study. The data revealed significant group differences, especially in the central-face area. Previous investigations under laboratory conditions emphasized preferential attention to the eye region during face perception to describe differences between ASC and NTD. In this study, using an ecologically valid setting within a standard diagnostic procedure, the results indicate that neurotypically developed controls seem to process faces and facial expressions in a holistic manner originating from the central-face region. Conversely, participants on the autism spectrum seem to avoid the central-face region and show unsystematic gaze behavior, not using the preferred landing position in the central-face region as the Archimedean point of face perception. This study uses a new approach, and it will be important to replicate these preliminary findings in future research.
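The primary outcome measure above, the percentage of total dwell time per area of interest (AOI), can be sketched as follows. This is a minimal illustration, not the study's analysis pipeline; the fixation records and AOI labels are invented for the example.

```python
# Sketch: percentage of total dwell time per AOI, computed from a list of
# fixation records. Each record pairs an AOI label with a dwell duration.
def dwell_percentages(fixations):
    """fixations: list of (aoi_label, duration_ms) tuples.
    Returns {aoi_label: percentage of total dwell time}."""
    totals = {}
    for aoi, dur in fixations:
        totals[aoi] = totals.get(aoi, 0.0) + dur
    grand_total = sum(totals.values())
    return {aoi: 100.0 * t / grand_total for aoi, t in totals.items()}

# Hypothetical fixation data for one participant (durations in ms).
fixations = [
    ("central_face", 420), ("eyes", 180),
    ("body", 250), ("surrounding", 150),
]
print(dwell_percentages(fixations))
# e.g. central_face accounts for 42% of the total dwell time here
```

Group comparisons would then be run on these per-participant percentages for each AOI.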


2015 ◽  
Vol 68 (1) ◽  
pp. 95-101 ◽  
Author(s):  
Erik Wästlund ◽  
Tobias Otterbring ◽  
Anders Gustafsson ◽  
Poja Shams

2020 ◽  
Author(s):  
David Harris ◽  
Mark Wilson ◽  
Tim Holmes ◽  
Toby de Burgh ◽  
Samuel James Vine

Head-mounted eye tracking has been fundamental for developing an understanding of sporting expertise, as the way in which performers sample visual information from the environment is a major determinant of successful performance. There is, however, a long running tension between the desire to study realistic, in-situ gaze behaviour and the difficulties of acquiring accurate ocular measurements in dynamic and fast-moving sporting tasks. Here, we describe how immersive technologies, such as virtual reality, offer an increasingly compelling approach for conducting eye movement research in sport. The possibility of studying gaze behaviour in representative and realistic environments, but with high levels of experimental control, could enable significant strides forward for eye tracking in sport and improve understanding of how eye movements underpin sporting skills. By providing a rationale for virtual reality as an optimal environment for eye tracking research, as well as outlining practical considerations related to hardware, software and data analysis, we hope to guide researchers and practitioners in the use of this approach.


2021 ◽  
pp. 1-16
Author(s):  
Leigha A. MacNeill ◽  
Xiaoxue Fu ◽  
Kristin A. Buss ◽  
Koraly Pérez-Edgar

Temperamental behavioral inhibition (BI) is a robust endophenotype for anxiety characterized by increased sensitivity to novelty. Controlling parenting can reinforce children's wariness by rewarding signs of distress. Fine-grained, dynamic measures are needed to better understand both how children perceive their parent's behaviors and the mechanisms supporting evident relations between parenting and socioemotional functioning. The current study examined dyadic attractor patterns (average mean durations) with state space grids, using children's attention patterns (captured via mobile eye tracking) and parental behavior (positive reinforcement, teaching, directives, intrusion), as functions of child BI and parent anxiety. Forty 5- to 7-year-old children and their primary caregivers completed a set of challenging puzzles, during which the child wore a head-mounted eye tracker. Child BI was positively correlated with the proportion of the parent's time spent teaching. Child age was negatively related, and parent anxiety level was positively related, to parent-focused/controlling parenting attractor strength. There was a significant interaction between parent anxiety level and child age predicting parent-focused/controlling parenting attractor strength. This study is a first step toward examining the co-occurrence of parenting behavior and child attention in the context of child BI and parental anxiety levels.
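The state space grid measure described above treats each moment of the interaction as a cell defined jointly by the child's attention target and the parent's behavior; attractor strength is commonly proxied by the mean duration of visits to a cell. A minimal sketch, with hypothetical state labels and an invented event stream (this is not the study's coding scheme):

```python
# Sketch: mean visit duration per state space grid cell. A "visit" is a
# maximal run of consecutive events in the same (child, parent) cell.
from collections import defaultdict

def mean_visit_durations(events):
    """events: chronological list of (child_state, parent_state, duration_s).
    Returns {(child_state, parent_state): mean visit duration in seconds}."""
    visits = defaultdict(list)
    prev_cell, acc = None, 0.0
    for child, parent, dur in events:
        cell = (child, parent)
        if cell == prev_cell:
            acc += dur  # same cell: extend the current visit
        else:
            if prev_cell is not None:
                visits[prev_cell].append(acc)  # close the previous visit
            prev_cell, acc = cell, dur
    if prev_cell is not None:
        visits[prev_cell].append(acc)  # close the final visit
    return {cell: sum(d) / len(d) for cell, d in visits.items()}

# Hypothetical dyadic stream: child attention target x parent behavior.
events = [
    ("parent", "directive", 2.0), ("parent", "directive", 1.0),
    ("puzzle", "teaching", 4.0), ("parent", "directive", 3.0),
]
print(mean_visit_durations(events))
# the ("parent", "directive") cell is visited twice, 3.0 s each on average
```

Longer mean durations in a cell indicate a stronger pull toward that dyadic state.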


2017 ◽  
Vol 1 (suppl_1) ◽  
pp. 1363-1363
Author(s):  
D.M. Isaacowitz ◽  
K.M. Livingstone ◽  
M.S. El-Nasr

2021 ◽  
Author(s):  
Zhong Zhao ◽  
Haiming Tang ◽  
Xiaobin Zhang ◽  
Xingda Qu ◽  
Jianping Lu

BACKGROUND: Abnormal gaze behavior is a prominent feature of autism spectrum disorder (ASD). Previous eye-tracking studies had participants watch images (i.e., pictures, videos, and webpages), and applying machine learning (ML) to these data showed promising results in identifying individuals with ASD. Given that gaze behavior in face-to-face interaction differs from that in image-viewing tasks, no study had yet investigated whether natural social gaze behavior could accurately identify ASD.
OBJECTIVE: The objective of this study was to examine whether, and which, area of interest (AOI)-based features extracted from natural social gaze behavior could identify ASD.
METHODS: Children with ASD and children with typical development (TD) were eye-tracked while engaged in a face-to-face conversation with an interviewer. Four ML classifiers (support vector machine, SVM; linear discriminant analysis, LDA; decision tree, DT; and random forest, RF) were used to determine the maximum classification accuracy and the corresponding features.
RESULTS: A maximum classification accuracy of 84.62% was achieved with three classifiers (LDA, DT, and RF). Results showed that the mouth AOI, but not the eyes AOI, was a powerful feature for detecting ASD.
CONCLUSIONS: Natural gaze behavior could be leveraged to identify ASD, suggesting that ASD might be objectively screened with eye-tracking technology in everyday social interaction. In addition, the comparison between our findings and previous ones suggests that the eye-tracking features that identify ASD may be culture dependent and context sensitive.
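The pipeline described above (AOI-based features fed to a classifier) can be sketched in a stripped-down form. Everything here is a hypothetical illustration: the AOI labels, the dwell data, the 0.25 cutoff, and the direction of the decision rule are all invented, and the toy decision stump merely stands in for the four classifiers (SVM, LDA, DT, RF) the study actually used.

```python
# Sketch: AOI-based gaze features plus a toy screening rule on the mouth
# AOI, which the study above reported as its most discriminative feature.
def aoi_features(fixations):
    """fixations: list of (aoi_label, duration_ms) tuples.
    Returns {aoi_label: dwell proportion of total viewing time}."""
    totals = {}
    for aoi, dur in fixations:
        totals[aoi] = totals.get(aoi, 0.0) + dur
    grand_total = sum(totals.values())
    return {aoi: t / grand_total for aoi, t in totals.items()}

def classify_mouth_stump(features, cutoff=0.25):
    """Toy decision stump: flag as 'ASD' when mouth dwell proportion is
    below an arbitrary cutoff. A real pipeline would train SVM/LDA/DT/RF
    on the full feature vector instead."""
    return "ASD" if features.get("mouth", 0.0) < cutoff else "TD"

# Two hypothetical participants (dwell durations in ms).
p1 = aoi_features([("mouth", 100), ("eyes", 500), ("other", 400)])
p2 = aoi_features([("mouth", 400), ("eyes", 300), ("other", 300)])
print(classify_mouth_stump(p1), classify_mouth_stump(p2))
# → ASD TD
```

In practice the cutoff (or classifier) would be fit on labeled training data and evaluated with cross-validation, which is how accuracy figures like the 84.62% above are obtained.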

