Using Eye-Tracking to Study Audio-Visual Perceptual Integration

Perception ◽  
10.1068/p5731 ◽  
2007 ◽  
Vol 36 (9) ◽  
pp. 1391-1395 ◽  
Author(s):  
Mei Xiao ◽  
May Wong ◽  
Michelle Umali ◽  
Marc Pomplun

Perceptual integration of audio-visual stimuli is fundamental to our everyday conscious experience. Eye-movement analysis may be a suitable tool for studying such integration, since eye movements respond to auditory as well as visual input. Previous studies have shown that additional auditory cues in visual-search tasks can guide eye movements more efficiently and reduce their latency. However, these auditory cues were task-relevant since they indicated the target position and onset time. Therefore, the observed effects may have been due to subjects using the cues as additional information to maximize their performance, without perceptually integrating them with the visual displays. Here, we combine a visual-tracking task with a continuous, task-irrelevant sound from a stationary source to demonstrate that audio-visual perceptual integration affects low-level oculomotor mechanisms. Auditory stimuli of constant, increasing, or decreasing pitch were presented. All sound categories induced more smooth-pursuit eye movement than silence, with the greatest effect occurring with stimuli of increasing pitch. A possible explanation is that integration of the visual scene with continuous sound creates the perception of continuous visual motion. Increasing pitch may amplify this effect through its common association with accelerating motion.

Author(s):  
Gavindya Jayawardena ◽  
Sampath Jayarathna

Eye-tracking experiments involve areas of interest (AOIs) for the analysis of eye-gaze data. While tools exist to delineate AOIs and extract eye movement data, they may require users to manually draw AOI boundaries on eye-tracking stimuli or to define AOIs with markers. This paper introduces two novel techniques to dynamically filter eye movement data from AOIs for the analysis of eye metrics at multiple levels of granularity. The authors incorporate pre-trained object detectors and object instance segmentation models for offline detection of dynamic AOIs in video streams. This research presents the implementation and evaluation of object detectors and object instance segmentation models to find the best model to integrate into a real-time eye movement analysis pipeline. The authors filter gaze data that falls within the polygonal boundaries of detected dynamic AOIs and apply an object detector to find bounding boxes in a public dataset. The results indicate that the dynamic AOIs generated by object detectors capture 60% of eye movements, while object instance segmentation models capture 30% of eye movements.
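The gaze-filtering step described above can be sketched in a few lines, assuming gaze samples carry frame indices and detected AOIs arrive as per-frame bounding boxes. All function names and the toy data below are illustrative, not taken from the authors' pipeline:

```python
# Sketch of gaze filtering against dynamic AOIs, assuming each detection is
# an axis-aligned bounding box (label, x0, y0, x1, y1) from an object detector.

def filter_gaze_by_aois(gaze_samples, aois_per_frame):
    """Keep only gaze samples that land inside a detected AOI box.

    gaze_samples: list of (frame_idx, x, y) tuples.
    aois_per_frame: dict mapping frame_idx -> list of (label, x0, y0, x1, y1).
    Returns a list of (frame_idx, x, y, label) for samples inside an AOI.
    """
    hits = []
    for frame_idx, x, y in gaze_samples:
        for label, x0, y0, x1, y1 in aois_per_frame.get(frame_idx, []):
            if x0 <= x <= x1 and y0 <= y <= y1:
                hits.append((frame_idx, x, y, label))
                break  # count each sample at most once
    return hits

# Toy data: two frames, one moving "car" box; one off-AOI sample.
gaze = [(0, 50, 50), (0, 200, 200), (1, 60, 55)]
aois = {0: [("car", 40, 40, 100, 100)], 1: [("car", 45, 45, 105, 105)]}
inside = filter_gaze_by_aois(gaze, aois)
coverage = len(inside) / len(gaze)  # fraction of gaze captured by dynamic AOIs
```

A coverage ratio computed this way corresponds to the paper's "percentage of eye movements captured" comparison; polygonal segmentation masks would replace the box test with a point-in-polygon test.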


2002 ◽  
Vol 87 (2) ◽  
pp. 802-818 ◽  
Author(s):  
Masaki Tanaka ◽  
Stephen G. Lisberger

Periarcuate frontal cortex is involved in the control of smooth pursuit eye movements, but its role remains unclear. To better understand the control of pursuit by the “frontal pursuit area” (FPA), we applied electrical microstimulation when the monkeys were performing a variety of oculomotor tasks. In agreement with previous studies, electrical stimulation consisting of a train of 50-μA pulses at 333 Hz during fixation of a stationary target elicited smooth eye movements with a short latency (∼26 ms). The size of the elicited smooth eye movements was enhanced when the stimulation pulses were delivered during the maintenance of pursuit. The enhancement increased as a function of ongoing pursuit speed and was greater during pursuit in the same versus opposite direction of the eye movements evoked at a site. If stimulation was delivered during pursuit in eight different directions, the elicited eye velocity was fit best by a model incorporating two stimulation effects: a directional signal that drives eye velocity and an increase in the gain of ongoing pursuit eye speed in all directions. Separate experiments tested the effect of stimulation on the response to specific image motions. Stimulation consisted of a train of pulses at 100 or 200 Hz delivered during fixation so that only small smooth eye movements were elicited. If the stationary target was perturbed briefly during microstimulation, normally weak eye movement responses showed strong enhancement. If delivered at the initiation of pursuit, the same microstimulation caused enhancement of the presaccadic initiation of pursuit for steps of target velocity that moved the target either away from the position of fixation or in the direction of the eye movement caused by stimulation at the site. Stimulation in the FPA increased the latency of saccades to stationary or moving targets. Our results show that the FPA has two kinds of effects on the pursuit system. 
One drives smooth eye velocity in a fixed direction and is subject to on-line gain control by ongoing pursuit. The other causes enhancement of both the speed of ongoing pursuit and the responses to visual motion in a way that is not strongly selective for the direction of pursuit. Enhancement may operate either at a single site or at multiple sites. We conclude that the FPA plays an important role in on-line gain control for pursuit as well as possibly delivering commands for the direction and speed of smooth eye motion.
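The two-effect model described above can be sketched as a fixed directional drive summed with a gain term applied to ongoing pursuit velocity. The numerical parameters below are hypothetical, chosen only to illustrate the direction-dependent enhancement:

```python
import math

def evoked_eye_velocity(v_stim, v_pursuit, gain_boost):
    """Two-effect sketch of FPA microstimulation (illustrative parameters):
    a fixed directional drive plus a gain increase applied to ongoing
    pursuit. Velocities are 2-D (horizontal, vertical) tuples in deg/s."""
    return (v_stim[0] + gain_boost * v_pursuit[0],
            v_stim[1] + gain_boost * v_pursuit[1])

# Stimulation at this hypothetical site drives a rightward 5 deg/s movement.
v_stim = (5.0, 0.0)

# Enhancement: pursuit in the same direction as the evoked movement yields a
# larger evoked speed than pursuit in the opposite direction.
same = evoked_eye_velocity(v_stim, (10.0, 0.0), gain_boost=0.3)
opposite = evoked_eye_velocity(v_stim, (-10.0, 0.0), gain_boost=0.3)
speed_same = math.hypot(*same)
speed_opp = math.hypot(*opposite)
```

In this toy form the directional term is direction-specific while the gain term scales pursuit in whatever direction it is already going, matching the two effects the fits distinguish.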


1997 ◽  
Vol 14 (2) ◽  
pp. 323-338 ◽  
Author(s):  
Vincent P. Ferrera ◽  
Stephen G. Lisberger

As a step toward understanding the mechanism by which targets are selected for smooth-pursuit eye movements, we examined the behavior of the pursuit system when monkeys were presented with two discrete moving visual targets. Two rhesus monkeys were trained to select a small moving target identified by its color in the presence of a moving distractor of another color. Smooth-pursuit eye movements were quantified in terms of the latency of the eye movement and the initial eye acceleration profile. We have previously shown that the latency of smooth pursuit, which is normally around 100 ms, can be extended to 150 ms or shortened to 85 ms depending on whether there is a distractor moving in the opposite or same direction, respectively, relative to the direction of the target. We have now measured this effect for a 360 deg range of distractor directions, and distractor speeds of 5–45 deg/s. We have also examined the effect of varying the spatial separation and temporal asynchrony between target and distractor. The results indicate that the effect of the distractor on the latency of pursuit depends on its direction of motion, and its spatial and temporal proximity to the target, but depends very little on the speed of the distractor. Furthermore, under the conditions of these experiments, the direction of the eye movement that is emitted in response to two competing moving stimuli is not a vectorial combination of the stimulus motions, but is solely determined by the direction of the target. The results are consistent with a competitive model for smooth-pursuit target selection and suggest that the competition takes place at a stage of the pursuit pathway that is between visual-motion processing and motor-response preparation.
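The winner-take-all outcome reported here can be contrasted with the vector-averaging alternative in a minimal sketch. The direction vectors are toy values, and neither function comes from the authors' analysis:

```python
# Two competing predictions for pursuit direction with a target plus a
# distractor: blend the two motions, or let the selected target fully win.

def vector_average(d1, d2):
    """Direction predicted if the two stimulus motions were blended."""
    return ((d1[0] + d2[0]) / 2.0, (d1[1] + d2[1]) / 2.0)

def winner_take_all(target_dir, distractor_dir):
    """Direction predicted if the target wins the competition outright."""
    return target_dir

target = (1.0, 0.0)      # rightward target motion
distractor = (0.0, 1.0)  # upward distractor motion

blended = vector_average(target, distractor)    # oblique prediction
selected = winner_take_all(target, distractor)  # purely rightward prediction
```

The data in this abstract match the second prediction: the distractor shifts pursuit latency but not the direction of the emitted eye movement.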


2017 ◽  
Author(s):  
Uday K. Jagadisan ◽  
Neeraj J. Gandhi

The trigeminal blink reflex can be evoked by delivering an air puff to the eye. If timed appropriately, e.g., during motor preparation, the small, loopy blink-related eye movement (BREM) associated with eyelid closure disinhibits the saccadic system and reduces the reaction time of planned eye movements. The BREM and intended eye movement overlap temporally, thus a mathematical formulation is required to objectively extract saccade features – onset time and velocity profile – from the combined movement. While it has been assumed that the interactions are nonlinear, we show that blink-triggered movements can be modeled as a linear combination of a typical BREM and a normal saccade, crucially, with an imposed delay between the two components. Saccades reconstructed with this approach are largely similar to control movements in their temporal and spatial profiles. Furthermore, activity profiles of saccade-related bursts in superior colliculus neurons for the recovered saccades closely match those for normal saccades. Thus, blink perturbations, if properly accounted for, offer a non-invasive tool to probe the behavioral and neural signatures of sensory-to-motor transformations.

New and noteworthy: The trigeminal blink reflex is a brief noninvasive perturbation that disinhibits the saccadic system and provides a behavioral readout of the latent motor preparation process. The saccade, however, is combined with a loopy blink-related eye movement. Here, we provide a mathematical formulation to extract the saccade from the combined movement. Thus, blink perturbations, when properly accounted for, offer a non-invasive tool to probe the behavioral and neural signatures of sensory-to-motor transformations.
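The linear decomposition proposed here can be sketched with toy velocity traces: sum a BREM template and a delayed saccade, then recover the saccade by subtracting the template and realigning. The waveforms and the delay value below are illustrative stand-ins, not the paper's fitted templates:

```python
# Sketch of a linear BREM + delayed-saccade model of a blink-triggered
# movement, and the corresponding recovery of the saccade component.

def combine(brem, saccade, delay):
    """Sum BREM and saccade velocity traces, saccade delayed by `delay` samples."""
    n = max(len(brem), delay + len(saccade))
    out = [0.0] * n
    for i, v in enumerate(brem):
        out[i] += v
    for i, v in enumerate(saccade):
        out[delay + i] += v
    return out

def recover_saccade(combined, brem, delay, length):
    """Subtract the BREM template, then realign to recover the saccade trace."""
    residual = [c - (brem[i] if i < len(brem) else 0.0)
                for i, c in enumerate(combined)]
    return residual[delay:delay + length]

brem = [0.0, 2.0, 4.0, 2.0, 0.0]          # small, slow blink-related trace
saccade = [0.0, 30.0, 60.0, 30.0, 0.0]    # fast saccadic velocity profile
combined = combine(brem, saccade, delay=2)
recovered = recover_saccade(combined, brem, delay=2, length=len(saccade))
```

In this toy form the recovered trace equals the original saccade exactly; with real data the templates and delay would be estimated by fitting, and the match is only approximate.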


1984 ◽  
Vol 7 (1) ◽  
pp. 53-66 ◽  
Author(s):  
Michiel P. van Oeffelen ◽  
Peter G. Vos

The present study reports the measurement of response latencies and the recording of eye movements in a task where children of about 5.5 years had to count arrangements of 1-8 dots in different configurations. Consistent with earlier findings, response latencies for numbers up to 5 suggested subitizing rather than counting strategies. Data from concomitant eye movement recordings clearly showed that even the processing of the small numbers required at least four fixations per response. Records of eye movements under the conditions of numbers of dots larger than n = 5 were found to reflect mixed strategies and not elementary one-by-one counting procedures. It is hypothesized that large processing times in comparison with adults were mainly due to interim verifications of results already established: children were, much more than adults, mentally loaded by the double task of storing partial results and processing new information at the same time.


1989 ◽  
Vol 61 (1) ◽  
pp. 173-185 ◽  
Author(s):  
S. G. Lisberger ◽  
T. A. Pavelko

1. The goal of our study was to determine the properties of the visual inputs for pursuit eye movements. In a previous study we presented horizontal target motion along the horizontal meridian and showed that targets were more effective if they moved across the center of the visual field. We have now analyzed the topographic weighting of the inputs for pursuit in greater detail, using targets that moved in all directions and across a wide area of the visual field. 2. Monkeys were rewarded for tracking targets that started at 48 positions in the visual field. The initial positions were spaced equally around 4 circles that were centered at the position of fixation and had radii of 3, 6, 9, and 12 degrees. Targets moved horizontally or vertically at 30 degrees/s. We measured the smooth eye acceleration in the first 80 ms after the initiation of pursuit, before there had been time for visual feedback to affect the position or velocity of the retinal images from the target. 3. For both horizontal and vertical target motion, there were major differences between the early and late intervals in the first 80 ms of pursuit. In the first 20 ms eye acceleration was largely independent of initial target position. In later intervals eye acceleration decreased sharply as a function of initial target eccentricity. The later intervals also showed a pronounced toward/away asymmetry such that the initiation of pursuit was more vigorous for target motion toward than for motion away from the horizontal or vertical meridian. 4. Comparison of the topographic organization of the middle temporal visual area (MT) with our data on pursuit suggests that the topography of cortical maps is smoothed when the visual signals are transmitted to the pursuit system. For example, the superior visual hemifield is underrepresented in cortical motion processing areas, but target motion in the superior and inferior visual hemifields is equally effective for the initiation of pursuit. 5. 
We investigated the directional organization of the visual inputs for pursuit by presenting targets that started 6 degrees eccentric and moved in 16 different directions. Horizontal target motion always evoked larger eye accelerations than did vertical target motion. Target motion in oblique directions evoked intermediate values of eye acceleration. 6. Our data show two classes of variation in pursuit performance. First, some subjects showed idiosyncratic variations that were restricted to one hemifield or one direction of target motion. We attribute these variations to differences among subjects in the physiology of visual pathways.


Author(s):  
Saptarshi Mandal ◽  
Ziho Kang ◽  
Angel Millan

Visualization approaches for eye movement analysis suffer from two limitations: (1) inability to handle the stochasticity of both the number and position of moving areas of interest (AOIs), and (2) absence of quantitative metrics to analyze eye movement data. We adapted the directed weighted network (DWN) and associated “centrality” metrics to support the visualization of complex eye movement data. A case study was performed using a realistic air traffic control task environment. Promising results were found, as we were able to identify the important targets (aircraft) interrogated by an air traffic controller based on different time frames. This case study serves as a foundation for developing effective data visualization methods and quantitative metrics for analyzing complex eye movements in a multi-element tracking task.
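The DWN construction described above can be sketched as follows: fixation transitions between AOIs become weighted directed edges, and a simple strength-based centrality ranks the AOIs. The AOI names, the transition sequence, and the particular centrality measure are illustrative assumptions, not the authors' exact metrics:

```python
from collections import defaultdict

def build_dwn(fixation_sequence):
    """Edge (a, b) weight = number of gaze transitions from AOI a to AOI b."""
    edges = defaultdict(int)
    for a, b in zip(fixation_sequence, fixation_sequence[1:]):
        if a != b:  # ignore refixations within the same AOI
            edges[(a, b)] += 1
    return dict(edges)

def weighted_degree_centrality(edges):
    """In-strength plus out-strength per node: a simple weighted centrality."""
    strength = defaultdict(int)
    for (a, b), w in edges.items():
        strength[a] += w
        strength[b] += w
    return dict(strength)

# Toy fixation sequence over three aircraft AOIs.
seq = ["AC1", "AC2", "AC1", "AC3", "AC1", "AC2"]
edges = build_dwn(seq)
centrality = weighted_degree_centrality(edges)
most_interrogated = max(centrality, key=centrality.get)
```

Ranking AOIs by such a score over sliding time windows is one way the "important targets per time frame" analysis could be realized.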


2007 ◽  
Vol 97 (5) ◽  
pp. 3473-3483 ◽  
Author(s):  
Naoko Inaba ◽  
Shigeru Shinomoto ◽  
Shigeru Yamane ◽  
Aya Takemura ◽  
Kenji Kawano

When a person tracks a small moving object, the visual images in the background of the visual scene move across his or her retina. However, it is possible to estimate the actual motion of the images despite this eye-movement-induced motion. To understand the neural mechanism that reconstructs a stable visual world independent of eye movements, we explored areas MT (middle temporal) and MST (medial superior temporal) in the monkey cortex, both of which are known to be essential for visual motion analysis. We recorded the responses of neurons to a moving textured image that appeared briefly on the screen while the monkeys were performing smooth pursuit or stationary fixation tasks. Although neurons in both areas exhibited significant responses to the motion of the textured image with directional selectivity, the responses of MST neurons were mostly correlated with the motion of the image on the screen independent of pursuit eye movement, whereas the responses of MT neurons were mostly correlated with the motion of the image on the retina. Thus these MST neurons were more likely than MT neurons to distinguish between external and self-induced motion. The results are consistent with the idea that MST neurons code for visual motion in the external world while compensating for the counter-rotation of retinal images due to pursuit eye movements.
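The coordinate relationship underlying this comparison (retinal motion equals screen motion minus eye velocity) can be sketched with illustrative numbers; the "MST-like" compensation below is a schematic assumption, not a model of the recorded neurons:

```python
# Velocities are 1-D (positive = rightward), in deg/s; values are toy numbers.

def retinal_velocity(screen_velocity, eye_velocity):
    """Motion of an image on the retina during pursuit."""
    return screen_velocity - eye_velocity

# Pursuit at 10 deg/s across a background that is stationary on the screen.
eye_v = 10.0
screen_v = 0.0
retinal_v = retinal_velocity(screen_v, eye_v)  # self-induced retinal motion

# An MT-like response tracks retinal_v; an MST-like response that adds back
# the eye velocity recovers the motion on the screen.
mst_estimate = retinal_v + eye_v  # reports no external motion, as is correct
```

This is the sense in which MST responses were "correlated with motion on the screen" while MT responses were "correlated with motion on the retina."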


2020 ◽  
Author(s):  
Xiuyun Wu ◽  
Austin C. Rothwell ◽  
Miriam Spering ◽  
Anna Montagnini

Smooth pursuit eye movements and visual motion perception rely on the integration of current sensory signals with past experience. Experience shapes our expectation of current visual events and can drive eye movement responses made in anticipation of a target, such as anticipatory pursuit. Previous research revealed consistent effects of expectation on anticipatory pursuit—eye movements follow the expected target direction or speed—and contrasting effects on motion perception, but most studies considered either eye movement or perceptual responses. The current study directly compared effects of direction expectation on perception and anticipatory pursuit within the same direction discrimination task to investigate whether both types of responses are affected similarly or differently. Observers (n = 10) viewed high-coherence random-dot kinematograms (RDKs) moving rightward and leftward with a probability of 50, 70, or 90% in a given block of trials to build up an expectation of motion direction. They were asked to judge the motion direction of interleaved low-coherence RDKs (0-15%). Perceptual judgments were compared to changes in anticipatory pursuit eye movements as a function of probability. Results show that anticipatory pursuit velocity scaled with probability and followed direction expectation (attraction bias), whereas perceptual judgments were biased opposite to direction expectation (repulsion bias). Control experiments suggest that the repulsion bias in perception was not caused by retinal slip induced by anticipatory pursuit, or by motion adaptation. We conclude that direction expectation can be processed differently for perception and anticipatory pursuit.
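The attraction bias in anticipatory pursuit can be sketched as a velocity that scales with the signed direction expectation. The linear form and the scaling constant below are assumptions for illustration, not the authors' fitted model:

```python
# Toy sketch: anticipatory eye velocity scaling with direction probability.
# Positive values = rightward; v_max is a hypothetical scaling constant.

def anticipatory_velocity(p_rightward, v_max=5.0):
    """Anticipatory pursuit velocity (deg/s) as a linear function of the
    signed expectation of target direction, (2p - 1)."""
    return v_max * (2.0 * p_rightward - 1.0)

v50 = anticipatory_velocity(0.5)  # no directional expectation
v70 = anticipatory_velocity(0.7)  # modest rightward anticipation
v90 = anticipatory_velocity(0.9)  # strongest rightward anticipation
```

The contrast in the abstract is that perception in the same blocks shifted in the opposite direction (a repulsion bias), so a single signed-expectation signal cannot describe both responses.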


1998 ◽  
Vol 79 (4) ◽  
pp. 1918-1930 ◽  
Author(s):  
Stephen G. Lisberger

Lisberger, Stephen G. Postsaccadic enhancement of initiation of smooth pursuit eye movements in monkeys. J. Neurophysiol. 79: 1918–1930, 1998. Step-ramp target motion evokes a characteristic sequence of presaccadic smooth eye movement in the direction of the target ramp, a catch-up saccade to bring eye position close to the position of the moving target, and postsaccadic eye velocities that nearly match target velocity. I have analyzed this sequence of eye movements in monkeys to reveal a strong postsaccadic enhancement of pursuit eye velocity and to document the conditions that lead to that enhancement. Smooth eye velocity was measured in the last 10 ms before and the first 10 ms after the first saccade evoked by step-ramp target motion. Plots of eye velocity as a function of time after the onset of the target ramp revealed that eye velocity at a given time was much higher if measured after versus before the saccade. Postsaccadic enhancement of pursuit was recorded consistently when the target stepped 3° eccentric on the horizontal axis and moved upward, downward, or away from the position of fixation. To determine whether postsaccadic enhancement of pursuit was invoked by smear of the visual scene during a saccade, I recorded the effect of simulated saccades on the presaccadic eye velocity for step-ramp target motion. The 3° simulated saccade, which consisted of motion of a textured background at 150°/s for 20 ms, failed to cause any enhancement of presaccadic eye velocity. By using a strategically selected set of oblique target steps with horizontal ramp target motion, I found clear enhancement for saccades in all directions, even those that were orthogonal to target motion. When the size of the target step was varied by up to 15° along the horizontal meridian, postsaccadic eye velocity did not depend strongly either on the initial target position or on whether the target moved toward or away from the position of fixation.
In contrast, earlier studies and data in this paper show that presaccadic eye velocity is much stronger when the target is close to the center of the visual field and when the target moves toward versus away from the position of fixation. I suggest that postsaccadic enhancement of pursuit reflects activation, by saccades, of a switch that regulates the strength of transmission through the visual-motor pathways for pursuit. Targets can cause strong visual motion signals but still evoke low presaccadic eye velocities if they are ineffective at activating the pursuit system.
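The gain-switch interpretation above can be sketched as a visual motion signal scaled by a transmission gain that saccades transiently boost. The gain values below are hypothetical, chosen only to make the contrast concrete:

```python
# Toy sketch of postsaccadic gain control on visuo-motor transmission for
# pursuit: the same retinal motion signal produces a weak response before a
# saccade and an enhanced response just after one.

def pursuit_response(visual_motion_signal, just_saccaded,
                     base_gain=0.2, post_saccadic_gain=1.0):
    """Eye velocity response = transmission gain * visual motion signal."""
    gain = post_saccadic_gain if just_saccaded else base_gain
    return gain * visual_motion_signal

signal = 15.0  # identical retinal motion signal in both conditions (deg/s)
pre = pursuit_response(signal, just_saccaded=False)   # presaccadic response
post = pursuit_response(signal, just_saccaded=True)   # postsaccadic response
```

This captures the paper's conclusion that a target can generate a strong visual motion signal yet evoke little presaccadic eye velocity if transmission through the pursuit pathway is gated down.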

