Temporal dynamics of heading perception and identification of scene-relative object motion from optic flow

2019 ◽  
Vol 19 (10) ◽  
pp. 236c
Author(s):  
Li Li ◽  
Mingyang Xie
PLoS ONE ◽  
2015 ◽  
Vol 10 (5) ◽  
pp. e0126265 ◽  
Author(s):  
Yu-Jen Lee ◽  
H. Olof Jönsson ◽  
Karin Nordström

2020 ◽  
Vol 20 (8) ◽  
pp. 18
Author(s):  
Krischan Koerfer ◽  
Markus Lappe

1997 ◽  
Vol 14 (5) ◽  
pp. 879-895 ◽  
Author(s):  
Helen Sherk ◽  
Kathleen Mulligan ◽  
Jong-Nam Kim

During locomotion, observers respond to objects in the environment that may represent obstacles to avoid or landmarks for navigation. Although much is known about how visual cortical neurons respond to stimulus objects moving against a blank background, nothing is known about their responses when objects are embedded in optic flow fields (the patterns of motion seen during locomotion). We recorded from cells in the lateral suprasylvian visual area (LS) of the cat, an area probably analogous to area MT. In our first experiments, optic flow simulations mimicked the view of a cat trotting across a plain covered with small balls; a black bar lying on the balls served as a target object. In subsequent experiments, optic flow simulations were composed of natural elements, with target objects representing bushes, rocks, and variants of these. Cells did not respond to the target bar in the presence of optic flow backgrounds, although they did respond to it in the absence of a background. However, 273/423 cells responded to at least one of the taller, naturalistic objects embedded in optic flow simulations. These responses might represent a form of image segmentation, in that cells detected objects against a complex background. Surprisingly, the responsiveness of cells to objects in optic flow fields was not correlated with preferred direction as measured with a moving bar or whole-field texture. Because the direction of object motion was determined solely by receptive-field location, it often differed considerably from a cell's preferred direction. About a quarter of the cells responded well to objects in optic flow movies but more weakly or not at all to bars moving in the same direction as the object, suggesting that the optic flow background modified or suppressed direction selectivity.
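The trotting-cat stimulus described above amounts to projecting scene elements through a pinhole camera under observer translation. As a minimal sketch of that idea (not the authors' stimulus code; the pinhole model, focal length, and sign conventions are assumptions), the instantaneous image velocity of a point at depth Z under pure camera translation is:

```python
def flow_at(x, y, Z, T, f=1.0):
    """Instantaneous image velocity of a point at image position (x, y)
    and depth Z under camera translation T = (Tx, Ty, Tz); pinhole
    model with focal length f, rotation-free (gaze fixed ahead)."""
    Tx, Ty, Tz = T
    u = (-f * Tx + x * Tz) / Z
    v = (-f * Ty + y * Tz) / Z
    return u, v

# Forward translation at 1 m/s: flow radiates from the image center
# (the focus of expansion), and nearer elements move faster.
u, v = flow_at(0.3, -0.2, Z=2.0, T=(0.0, 0.0, 1.0))
```

For forward translation the flow field expands radially from the heading point, which is why ground elements near the observer (small Z) sweep past quickly while distant ones barely move.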


2016 ◽  
Vol 115 (1) ◽  
pp. 286-300 ◽  
Author(s):  
Oliver W. Layton ◽  
Brett R. Fajen

Many forms of locomotion rely on the ability to accurately perceive one's direction of locomotion (i.e., heading) based on optic flow. Although accurate in rigid environments, heading judgments may be biased when independently moving objects are present. The aim of this study was to systematically investigate the conditions in which moving objects influence heading perception, with a focus on the temporal dynamics and the mechanisms underlying this bias. Subjects viewed stimuli simulating linear self-motion in the presence of a moving object and judged their direction of heading. Experiments 1 and 2 revealed that heading perception is biased when the object crosses or almost crosses the observer's future path toward the end of the trial, but not when the object crosses earlier in the trial. Nonetheless, heading perception is not based entirely on the instantaneous optic flow toward the end of the trial. This was demonstrated in Experiment 3 by varying the portion of the earlier part of the trial leading up to the last frame that was presented to subjects. When the stimulus duration was long enough to include the part of the trial before the moving object crossed the observer's path, heading judgments were less biased. The findings suggest that heading perception is affected by the temporal evolution of optic flow. The time course of dorsal medial superior temporal area (MSTd) neuron responses may play a crucial role in perceiving heading in the presence of moving objects, a property not captured by many existing models.
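Heading recovery from optic flow, as studied in this abstract, is often formalized as locating the focus of expansion (FOE) of the flow field. The following is an illustrative least-squares sketch (an assumption-laden toy, not the MSTd-based model the abstract refers to): each flow vector defines a line through its image point, and for pure translation all such lines intersect at the heading.

```python
import math

def estimate_foe(points, flows):
    """Least-squares focus of expansion: find the point f minimizing
    the perpendicular distance to each line through p along its flow
    vector v, via the 2x2 normal equations."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), (vx, vy) in zip(points, flows):
        norm = math.hypot(vx, vy)
        if norm == 0.0:
            continue  # zero flow carries no direction information
        nx, ny = -vy / norm, vx / norm      # unit normal to the flow vector
        a11 += nx * nx; a12 += nx * ny; a22 += ny * ny
        d = nx * px + ny * py               # constraint: n . f = n . p
        b1 += nx * d; b2 += ny * d
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Purely radial flow expanding from a simulated heading at (0.2, -0.1):
heading = (0.2, -0.1)
pts = [(x / 10, y / 10) for x in range(-5, 6) for y in range(-5, 6)
       if (x / 10, y / 10) != heading]
vel = [(x - heading[0], y - heading[1]) for x, y in pts]
fx, fy = estimate_foe(pts, vel)
```

An independently moving object contributes flow vectors whose lines do not pass through the true FOE, which is one simple way such an estimate becomes biased, consistent with the bias the abstract reports.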


Vision ◽  
2019 ◽  
Vol 3 (2) ◽  
pp. 13
Author(s):  
Pearl Guterman ◽  
Robert Allison

When the head is tilted, an objectively vertical line viewed in isolation is typically perceived as tilted. We explored whether this shift also occurs when viewing global motion displays perceived as either object-motion or self-motion. Observers stood upright and lay left-side down while viewing (1) a static line, (2) a random-dot display of 2-D (planar) motion, or (3) a random-dot display of 3-D (volumetric) global motion. On each trial, the line orientation or motion direction was tilted from the gravitational vertical, and observers indicated whether the tilt was clockwise or counter-clockwise from the perceived vertical. Psychometric functions were fit to the data and shifts in the point of subjective verticality (PSV) were measured. When the whole body was tilted, the perceived tilt of both a static line and the direction of optic flow were biased in the direction of the body tilt, demonstrating the so-called A-effect. However, we found significantly larger shifts for the static line than for volumetric global motion, as well as larger shifts for volumetric than for planar displays. The A-effect was larger when the motion was experienced as self-motion than when it was experienced as object-motion, and discrimination thresholds were more precise in the self-motion conditions. The different magnitudes of the A-effect for the line and motion conditions, and for object-motion versus self-motion, may reflect differences in the combination of idiotropic (body) and vestibular signals, particularly in the case of vection, which occurs despite visual-vestibular conflict.
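The psychometric fitting described above can be sketched as follows. This is a hedged illustration on synthetic data: the cumulative-Gaussian form, the grid-search fit, and the -4 degree shift are assumptions for demonstration, not values from the study. The PSV is read off as the 50%-clockwise point of the fitted function.

```python
import math

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian: probability of a 'clockwise' response
    for a stimulus tilted x degrees from gravitational vertical."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_psv(tilts, p_cw):
    """Grid-search least-squares fit; returns (mu, sigma), where mu
    is the point of subjective verticality (PSV) and sigma indexes
    discrimination precision."""
    best = (float("inf"), 0.0, 1.0)
    for mu10 in range(-100, 101):          # mu in [-10, 10] deg, 0.1 deg steps
        for sig10 in range(5, 101):        # sigma in [0.5, 10] deg
            mu, sigma = mu10 / 10, sig10 / 10
            err = sum((cum_gauss(x, mu, sigma) - p) ** 2
                      for x, p in zip(tilts, p_cw))
            if err < best[0]:
                best = (err, mu, sigma)
    return best[1], best[2]

# Synthetic observer with a -4 deg A-effect shift and 2 deg noise:
tilts = [-10, -8, -6, -4, -2, 0, 2, 4, 6, 8, 10]
probs = [cum_gauss(x, -4.0, 2.0) for x in tilts]
mu, sigma = fit_psv(tilts, probs)
```

A shift of the fitted mu away from zero in the direction of body tilt is the A-effect the abstract measures; a smaller sigma corresponds to the more precise thresholds reported for the self-motion conditions.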


i-Perception ◽  
2017 ◽  
Vol 8 (3) ◽  
pp. 204166951770820 ◽  
Author(s):  
Diederick C. Niehorster ◽  
Li Li

How do we perceive object motion during self-motion using visual information alone? Previous studies have reported that the visual system can use optic flow to identify and globally subtract the retinal motion component resulting from self-motion to recover scene-relative object motion, a process called flow parsing. In this article, we developed a retinal motion nulling method to directly measure and quantify the magnitude of flow parsing (i.e., the flow parsing gain) in various scenarios, in order to examine the accuracy and tuning of flow parsing for the visual perception of object motion during self-motion. We found that flow parsing gains were below unity for all displays in all experiments, and that increasing self-motion or object motion speed did not alter the flow parsing gain. We conclude that visual information alone is not sufficient for the accurate perception of scene-relative motion during self-motion. Although flow parsing performs global subtraction, its accuracy also depends on local motion information in the retinal vicinity of the moving object. These results can be used to inform and validate computational models of flow parsing.
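The flow parsing gain reported above can be expressed as the fraction of the self-motion retinal component that the visual system subtracts from an object's retinal motion. A brief arithmetic sketch (the speed values are illustrative, not data from the article):

```python
def perceived_scene_motion(retinal_object_motion, self_motion_component, gain):
    """Flow parsing as partial global subtraction: the perceived
    scene-relative motion removes only `gain` of the self-motion
    component from the object's retinal motion. A gain of 1 would be
    complete, accurate subtraction; the article reports gains below 1."""
    return retinal_object_motion - gain * self_motion_component

# A stationary object during observer translation: its retinal motion
# equals the self-motion component (say 3 deg/s). With a gain below
# unity, a residual motion is wrongly attributed to the object:
residual = perceived_scene_motion(3.0, 3.0, 0.8)
```

With a gain of 0.8, a physically stationary object is left with 0.6 deg/s of perceived scene-relative motion, which is the kind of systematic misperception a below-unity gain implies.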


1996 ◽  
Vol 82 (2) ◽  
pp. 627-635 ◽  
Author(s):  
Shinji Nakamura

To investigate the effects of background stimulation on eye-movement information (EMI), the perceived slowing of target motion during pursuit eye movement (the Aubert-Fleischl paradox) was analyzed. In the experiment, a striped pattern with various brightness contrasts and spatial frequencies served as the background stimulus, allowing its attributes to be manipulated systematically. Analysis showed that the retinal-image motion of the background stimulus (optic flow) affected eye-movement information, and that these effects became stronger when high-contrast, low-spatial-frequency stripes were presented as the background. In conclusion, optic flow is one source of eye-movement information for determining real object motion, and its effectiveness depends on the attributes of the background stimulus.
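A common linear account of the Aubert-Fleischl paradox treats perceived target speed as retinal slip plus an eye-movement signal whose gain is below unity. The sketch below illustrates that account (a textbook-style model and illustrative gain value, not necessarily the analysis used in this article):

```python
def perceived_target_speed(retinal_slip, eye_velocity, emi_gain):
    """Linear account of the Aubert-Fleischl effect: perceived speed
    combines retinal slip with an eye-movement signal (EMI) whose gain
    is below unity, so a pursued target looks slower than a target
    viewed with stationary eyes."""
    return retinal_slip + emi_gain * eye_velocity

# Perfect pursuit of a 10 deg/s target: retinal slip is ~0, so an
# eye-movement gain of 0.8 yields a perceived speed of only 8 deg/s.
speed = perceived_target_speed(0.0, 10.0, 0.8)
```

On this account, background optic flow would modulate the effective EMI gain, which is one way to frame the abstract's finding that high-contrast, low-spatial-frequency backgrounds strengthen the influence of optic flow on eye-movement information.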


2011 ◽  
Vol 278 (1719) ◽  
pp. 2840-2847 ◽  
Author(s):  
F. J. Calabro ◽  
S. Soto-Faraco ◽  
L. M. Vaina

In humans, as well as most animal species, perception of object motion is critical to successful interaction with the surrounding environment. Yet, as the observer also moves, the retinal projections of the various motion components add to each other and extracting accurate object motion becomes computationally challenging. Recent psychophysical studies have demonstrated that observers use a flow-parsing mechanism to estimate and subtract self-motion from the optic flow field. We investigated whether concurrent acoustic cues for motion can facilitate visual flow parsing, thereby enhancing the detection of moving objects during simulated self-motion. Participants identified an object (the target) that moved either forward or backward within a visual scene containing nine identical textured objects simulating forward observer translation. We found that spatially co-localized, directionally congruent, moving auditory stimuli enhanced object motion detection. Interestingly, subjects who performed poorly on the visual-only task benefited more from the addition of moving auditory stimuli. When auditory stimuli were not co-localized to the visual target, improvements in detection rates were weak. Taken together, these results suggest that parsing object motion from self-motion-induced optic flow can operate on multisensory object representations.

