An improved LPTC neural model for background motion direction estimation

Author(s):  
Hongxin Wang ◽  
Jigen Peng ◽  
Shigang Yue
2013 ◽  
Vol 846-847 ◽  
pp. 1106-1110
Author(s):  
Guo Qing Yang ◽  
Rong Yi Cui

Taking the wavelet-decomposed approximation image as its main object of study, this paper proposes a direction estimation method for moving objects. First, the approximation image for each video frame is obtained via wavelet decomposition; motion estimation is then performed on the approximation image to obtain motion vectors. Finally, the motion vectors are expressed in polar-coordinate form in order to count the number of motion vectors falling within specified angular sectors and to compute the information entropy of the motion directions. Experimental results show that the proposed method removes the effect of noise and that the estimated directions are consistent with the actual motion directions.
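The pipeline described in this abstract (wavelet approximation, motion vectors on the approximation image, polar-coordinate direction histogram, entropy) can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the one-level Haar approximation, exhaustive block matching, function names, and block/search sizes are all assumptions chosen for a self-contained example.

```python
import numpy as np

def haar_approx(frame):
    """One-level Haar approximation: average non-overlapping 2x2 blocks
    (illustrative stand-in for a full wavelet decomposition)."""
    h, w = frame.shape
    f = frame[:h - h % 2, :w - w % 2].astype(float)
    return (f[0::2, 0::2] + f[1::2, 0::2] + f[0::2, 1::2] + f[1::2, 1::2]) / 4.0

def block_motion(prev, curr, block=8, search=4):
    """Exhaustive block-matching motion estimation on the approximation
    images; returns one (dy, dx) vector per block."""
    vectors = []
    H, W = prev.shape
    for y in range(0, H - block + 1, block):
        for x in range(0, W - block + 1, block):
            ref = prev[y:y + block, x:x + block]
            best, best_v = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > H or xx + block > W:
                        continue
                    cand = curr[yy:yy + block, xx:xx + block]
                    err = np.abs(ref - cand).sum()
                    if err < best:
                        best, best_v = err, (dy, dx)
            vectors.append(best_v)
    return np.array(vectors)

def direction_entropy(vectors, n_bins=8):
    """Bin motion-vector angles into n_bins polar sectors and return
    (sector counts, Shannon entropy of the direction distribution).
    Note: dy is the image-row component, so positive dy points down."""
    moving = vectors[np.any(vectors != 0, axis=1)]
    if len(moving) == 0:
        return np.zeros(n_bins, dtype=int), 0.0
    angles = np.arctan2(moving[:, 0], moving[:, 1]) % (2 * np.pi)
    counts, _ = np.histogram(angles, bins=n_bins, range=(0, 2 * np.pi))
    p = counts[counts > 0] / counts.sum()
    return counts, float(-(p * np.log2(p)).sum())
```

With a textured frame shifted rightward between two frames, the dominant histogram sector is the rightward one, and a single consistent direction yields low entropy.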


2001 ◽  
Vol 13 (1) ◽  
pp. 102-120 ◽  
Author(s):  
Christopher Pack ◽  
Stephen Grossberg ◽  
Ennio Mingolla

Smooth pursuit eye movements (SPEMs) are eye rotations that are used to maintain fixation on a moving target. Such rotations complicate the interpretation of the retinal image, because they nullify the retinal motion of the target while generating retinal motion of stationary objects in the background. This poses a problem for the oculomotor system, which must track the stabilized target image while suppressing the optokinetic reflex, which would move the eye in the direction of the retinal background motion (opposite to the direction in which the target is moving). Similarly, the perceptual system must estimate the actual direction and speed of moving objects in spite of the confounding effects of the eye rotation. This paper proposes a neural model to account for the ability of primates to accomplish these tasks. The model simulates the neurophysiological properties of cell types found in the superior temporal sulcus of the macaque monkey, specifically the medial superior temporal (MST) region. These cells process signals related to target motion and background motion, and receive an efference copy of eye velocity during pursuit movements. The model focuses on the interactions between cells in the ventral and dorsal subdivisions of MST, which are hypothesized to process target velocity and background motion, respectively. The model shows how these signals can be combined to account for behavioral data about pursuit maintenance and for perceptual data from human studies, including the Aubert-Fleischl phenomenon and the Filehne illusion, thereby clarifying the functional significance of neurophysiological data about these MST cell properties. It is suggested that the connectivity used in the model may represent a general strategy used by the brain in analyzing the visual world.
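The core computation behind the two illusions mentioned here can be illustrated with a standard textbook account (not the paper's MST circuit itself): perceived head-centric velocity is retinal motion plus an efference copy of eye velocity, and an efference gain below 1 reproduces both effects. The function name and the gain value of 0.8 are illustrative assumptions.

```python
def perceived_velocity(retinal_motion, eye_velocity, efference_gain=0.8):
    """Head-centric velocity estimate in deg/s: retinal motion plus a
    (possibly under-weighted) efference copy of eye velocity.
    efference_gain=0.8 is an illustrative value, not a fitted parameter."""
    return retinal_motion + efference_gain * eye_velocity

# Stationary background during 10 deg/s rightward pursuit: retinal motion
# is -10 deg/s (opposite the eye), so the background appears to drift
# slowly against the pursuit direction (Filehne illusion).
filehne = perceived_velocity(-10.0, 10.0)

# Pursued target: retinal motion is ~0, so perceived target speed falls
# short of the true 10 deg/s (Aubert-Fleischl phenomenon).
aubert_fleischl = perceived_velocity(0.0, 10.0)
```

With a gain of exactly 1 both illusions vanish, which is one way of stating the functional puzzle the model addresses.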


2014 ◽  
Vol 112 (5) ◽  
pp. 1074-1081 ◽  
Author(s):  
David Souto ◽  
Dirk Kerzel

Involuntary ocular tracking responses to background motion offer a window on the dynamics of motion computations. In contrast to spatial attention, we know little about the role of feature-based attention in determining this ocular response. To probe feature-based effects of background motion on involuntary eye movements, we presented human observers with a balanced background perturbation. Two clouds of dots moved in opposite vertical directions while observers tracked a target moving in the horizontal direction. Additionally, they had to discriminate a change in the direction of motion (±10° from vertical) of one of the clouds. A vertical ocular following response occurred in response to the motion of the attended cloud. When motion selection was based on both the motion direction and the color of the dots, the peak velocity of the tracking response was 30% of that elicited in a single task with only one direction of background motion. In two other experiments, we tested the effect of the perturbation when motion selection was based on color alone, by having motion direction vary unpredictably, or on motion direction alone. Although the gain of pursuit in the horizontal direction was significantly reduced in all experiments, indicating a trade-off between perceptual and oculomotor tasks, ocular responses to perturbations were observed only when selection was based on both motion direction and color. It appears that selection by motion direction can only be effective for driving ocular tracking when the relevant elements can be segregated before motion onset.
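The two quantities this abstract reports, horizontal pursuit gain and the perturbation response expressed relative to a single-task baseline, are simple ratios. A minimal sketch, with hypothetical function names and example values (the 30% figure is from the abstract; the velocity traces are invented):

```python
import numpy as np

def pursuit_gain(eye_velocity, target_velocity):
    """Pursuit gain: mean eye velocity divided by target velocity.
    A gain of 1.0 means perfect tracking."""
    return float(np.mean(eye_velocity) / target_velocity)

def perturbation_response_ratio(dual_task_peak, single_task_peak):
    """Peak vertical ocular response in the dual task, relative to the
    single-task baseline (the abstract reports roughly 0.3)."""
    return dual_task_peak / single_task_peak

# Hypothetical example: eye lags a 10 deg/s target slightly,
# and the dual-task vertical response peaks at 30% of baseline.
gain = pursuit_gain(np.array([9.0, 9.5, 9.2, 9.3]), 10.0)
ratio = perturbation_response_ratio(1.5, 5.0)
```

Reduced gain in all conditions is what signals the perceptual-oculomotor trade-off; the ratio isolates the feature-based attention effect.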


2018 ◽  
Vol 18 (10) ◽  
pp. 130
Author(s):  
Kathryn Bonnen ◽  
Thaddeus Czuba ◽  
Jake Whritner ◽  
Austin Kuo ◽  
Alexander Huk ◽  
...  
