61.2: Eye Movements During Visual and Auditory Task Performance

2004 ◽  
Vol 35 (1) ◽  
pp. 1582 ◽  
Author(s):  
Erik Viirre ◽  
Karl Van Orden ◽  
Shawn Wing ◽  
Bradley Chase ◽  
Christopher Pribe ◽  
...  
Author(s):  
Michael G Lenné ◽  
Benjamin L Hoggan ◽  
Justin Fidock ◽  
Geoff Stuart ◽  
Eugene Aidman

2020 ◽  
Vol 127 (3) ◽  
pp. 571-586
Author(s):  
Ikumi Tochikura ◽  
Daisuke Sato ◽  
Daiki Imoto ◽  
Atsuo Nuruki ◽  
Koya Yamashiro ◽  
...  

Previous studies have reported that baseball players have higher than average visual information processing abilities and outstanding motor control. Because the speed and position of both the ball and the batter are constantly changing, skilled players acquire highly accurate visual information processing and decision-making. This study sought to clarify how eye movements are associated with baseball players’ superior coincident-timing task performance. We recruited 15 right-handed baseball players and 15 age-matched track and field athletes. In a computer-based coincident-timing task, we instructed participants to stop a moving on-screen target by pressing a button at a designated point. Targets moved in both directions at angular velocities of 100, 83, 71, 63, 56, 50, and 46 deg/s, presented in random order. Participants completed 168 repetitions (42 reps × 4 sets) of the coincident-timing task while we recorded their eye movements with a pupil-centre corneal-reflection eye tracker. Mixed-design analysis of variance revealed group effects favoring baseball players, who showed lower absolute timing error, as predicted from prior visual-processing and decision-making research with baseball players. However, in contrast to prior research, we found significantly shorter smooth-pursuit onset latency in elite baseball players, and no significant group differences in saccade onset and offset latencies. This may be explained by our paradigm, in which moving targets were randomly presented at various velocities from the left and right. Our data show baseball players’ superior coincident timing when making decisions and movements based on visual information, even under laboratory conditions with randomly moving targets.
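As a concrete illustration of the study's dependent measure, here is a minimal sketch of per-trial absolute timing error scoring. The travel distance (20 deg) and the function names are assumptions for illustration only; the abstract does not specify them.

```python
# Illustrative sketch only: the abstract reports absolute timing error for a
# target moving at a known angular velocity, but gives neither the travel
# distance nor scoring details. The 20-deg distance below is an assumption.

def arrival_time(angular_distance_deg: float, velocity_deg_s: float) -> float:
    """Time (s) for the target to reach the designated stop point."""
    return angular_distance_deg / velocity_deg_s

def absolute_timing_error(press_time_s: float,
                          angular_distance_deg: float,
                          velocity_deg_s: float) -> float:
    """Absolute difference (s) between the button press and the coincidence moment."""
    return abs(press_time_s - arrival_time(angular_distance_deg, velocity_deg_s))

# At the slowest velocity used (46 deg/s), a 20-deg travel arrives after
# about 0.435 s; a press at 0.400 s gives roughly 0.035 s of error.
error = absolute_timing_error(0.400, 20.0, 46.0)
```

Under this assumed distance, the seven velocities (46 to 100 deg/s) would span arrival times of roughly 0.2 to 0.43 s, which is why randomizing velocity and direction forces trial-by-trial visual estimation rather than rote timing.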


2020 ◽  
Vol 34 (6) ◽  
pp. 1430-1443
Author(s):  
Selina N. Emhardt ◽  
Margot van Wermeskerken ◽  
Katharina Scheiter ◽  
Tamara van Gog

2011 ◽  
Vol 31 (10) ◽  
pp. 3853-3861 ◽  
Author(s):  
J. D. Thorne ◽  
M. De Vos ◽  
F. C. Viola ◽  
S. Debener

2018 ◽  
Author(s):  
Elina A. K. Jacobs ◽  
Nicholas A. Steinmetz ◽  
Matteo Carandini ◽  
Kenneth D. Harris

Neocortical activity varies between states of “synchronization” and “desynchronization”, with desynchronized states believed to occur specifically in regions engaged by the task. To disambiguate whether desynchronization is linked to task performance or engagement, we trained mice on tasks in which incorrect responses due to disengagement (neglect) differed from inaccurate task performance (incorrect choices). Using widefield calcium imaging to measure cortical state across many areas simultaneously, we found that desynchronization was correlated with engagement rather than accuracy. Consistent with this link between desynchronization and engagement, we found that rewards had a long-lasting desynchronizing effect. To determine whether engagement-related changes in cortical state depended on the sensory modality, we trained mice on visual and auditory task versions and found that desynchronization was similar in both and more pronounced in somatomotor than either sensory cortex. We conclude that variations in cortical state are predominantly global and closely relate to variations in task engagement.


2018 ◽  
Author(s):  
Rachel N. Denison ◽  
Shlomit Yuval-Greenberg ◽  
Marisa Carrasco

Abstract Our visual input is constantly changing, but not all moments are equally relevant. Temporal attention, the prioritization of visual information at specific points in time, increases perceptual sensitivity at behaviorally relevant times. The dynamic processes underlying this increase are unclear. During fixation, humans make small eye movements called microsaccades, and inhibiting microsaccades improves perception of brief stimuli. Here we asked whether temporal attention changes the pattern of microsaccades in anticipation of brief stimuli. Human observers (female and male) judged brief stimuli presented within a short sequence. They were given either an informative precue to attend to one of the stimuli, which was likely to be probed, or an uninformative (neutral) precue. We found strong microsaccadic inhibition before the stimulus sequence, likely due to its predictable onset. Critically, this anticipatory inhibition was stronger when the first target in the sequence (T1) was precued (task-relevant) than when the precue was uninformative. Moreover, the timing of the last microsaccade before T1 and the first microsaccade after T1 shifted, such that both occurred earlier when T1 was precued than when the precue was uninformative. Finally, the timing of the nearest pre- and post-T1 microsaccades affected task performance. Directing voluntary temporal attention therefore impacts microsaccades, helping to stabilize fixation at the most relevant moments, over and above the effect of predictability. Just as saccading to a relevant stimulus can be an overt correlate of the allocation of spatial attention, precisely timed gaze stabilization can be an overt correlate of the allocation of temporal attention.

Significance statement: We pay attention at moments in time when a relevant event is likely to occur. Such temporal attention improves our visual perception, but how it does so is not well understood. Here we discovered a new behavioral correlate of voluntary, or goal-directed, temporal attention. We found that the pattern of small fixational eye movements called microsaccades changes around behaviorally relevant moments in a way that stabilizes the position of the eyes. Microsaccades during a brief visual stimulus can impair perception of that stimulus. Therefore, such fixation stabilization may contribute to the improvement of visual perception at attended times. This link suggests that in addition to cortical areas, subcortical areas mediating eye movements may be recruited with temporal attention.
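Microsaccades in eye-position data are commonly detected with a velocity-threshold algorithm in the style of Engbert & Kliegl (2003). The abstract does not state which method these authors used, so the following is a generic illustrative sketch; the parameter values (`lam`, `min_samples`) and the 5-sample velocity window are conventional defaults, not details from the paper.

```python
import numpy as np

def detect_microsaccades(x, y, fs, lam=6.0, min_samples=3):
    """Velocity-threshold microsaccade detection (after Engbert & Kliegl, 2003).

    x, y : gaze position traces (deg); fs : sampling rate (Hz).
    Returns a list of (onset_index, offset_index) sample pairs.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    # Smoothed velocity over a 5-sample window: v[i] ~ (x[i+2]+x[i+1]-x[i-1]-x[i-2]) / (6*dt)
    vx = np.zeros_like(x)
    vy = np.zeros_like(y)
    vx[2:-2] = (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) * fs / 6.0
    vy[2:-2] = (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) * fs / 6.0

    # Robust (median-based) velocity SD per component sets an elliptic threshold.
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    hot = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0

    # Keep runs of at least `min_samples` consecutive above-threshold samples.
    events, start = [], None
    for i, h in enumerate(hot):
        if h and start is None:
            start = i
        elif not h and start is not None:
            if i - start >= min_samples:
                events.append((start, i - 1))
            start = None
    if start is not None and len(hot) - start >= min_samples:
        events.append((start, len(hot) - 1))
    return events
```

Onset and offset indices from such a detector are what pre- and post-target microsaccade latencies, like those reported around T1, would be computed from.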


2020 ◽  
Vol 34 (1) ◽  
pp. 17-47
Author(s):  
Minke J. de Boer ◽  
Deniz Başkent ◽  
Frans W. Cornelissen

Abstract The majority of emotional expressions used in daily communication are multimodal and dynamic in nature. Consequently, one would expect that human observers utilize specific perceptual strategies to process emotions and to handle their multimodal and dynamic nature. However, our present knowledge of these strategies is scarce, primarily because most studies on emotion perception have not fully covered this variation, instead using static and/or unimodal stimuli with few emotion categories. To resolve this knowledge gap, the present study examined how dynamic emotional auditory and visual information is integrated into a unified percept. Since there is a broad spectrum of possible forms of integration, both eye movements and accuracy of emotion identification were evaluated while observers performed an emotion identification task in one of three conditions: audio-only, visual-only video, or audiovisual video. In terms of adaptations of perceptual strategies, eye movement results showed a shift in fixations toward the eyes and away from the nose and mouth when audio was added. Notably, in terms of task performance, audio-only performance was generally significantly worse than video-only and audiovisual performance, whereas the latter two conditions often did not differ. These results suggest that individuals flexibly and momentarily adapt their perceptual strategies to changes in the available information for emotion recognition, and that these changes can be comprehensively quantified with eye tracking.


2016 ◽  
Vol 36 (43) ◽  
pp. 11097-11106 ◽  
Author(s):  
G. von Trapp ◽  
B. N. Buran ◽  
K. Sen ◽  
M. N. Semple ◽  
D. H. Sanes

2020 ◽  
Vol 10 (13) ◽  
pp. 4508 ◽  
Author(s):  
Armel Quentin Tchanou ◽  
Pierre-Majorique Léger ◽  
Jared Boasen ◽  
Sylvain Senecal ◽  
Jad Adam Taher ◽  
...  

Gaze convergence of multiuser eye movements during simultaneous collaborative use of a shared system interface has been proposed as an important, albeit sparsely explored, construct in the human-computer interaction literature. Here, we propose a novel index for measuring the gaze convergence of user dyads and address its validity in two consecutive eye-tracking studies. Eye-tracking data from user dyads were recorded synchronously while they performed tasks simultaneously on shared system interfaces. Results support the validity of the proposed gaze convergence index for measuring the gaze convergence of dyads. Moreover, as expected, the index was positively associated with dyad task performance and negatively associated with dyad cognitive load. These results suggest both theoretical and practical utility for applications such as synchronized gaze-convergence displays in diverse settings. Further research, particularly into the construct’s nomological network, is warranted.
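The abstract does not define the index's formula. Purely for illustration, one simple hypothetical formulation would score convergence as the fraction of time-aligned gaze samples from the two users that fall within a chosen radius of each other on the shared display; the function name, the input format, and the radius parameter are all assumptions, not the authors' actual index.

```python
import math

def gaze_convergence(gaze_a, gaze_b, radius):
    """Hypothetical dyad gaze-convergence index (NOT the authors' definition).

    gaze_a, gaze_b : equal-length lists of time-aligned (x, y) gaze points,
                     one list per dyad member, in shared screen coordinates.
    radius         : distance (same units) within which two simultaneous
                     gaze points count as convergent.
    Returns the fraction of samples that are convergent (1.0 = fully convergent).
    """
    if not gaze_a or len(gaze_a) != len(gaze_b):
        raise ValueError("traces must be non-empty and time-aligned")
    close = sum(
        1
        for (xa, ya), (xb, yb) in zip(gaze_a, gaze_b)
        if math.hypot(xa - xb, ya - yb) <= radius
    )
    return close / len(gaze_a)
```

An index of this shape is bounded in [0, 1], which makes the reported correlations (positive with task performance, negative with cognitive load) straightforward to compute against per-dyad scores.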

