The role of ocular muscle proprioception in visual localization of targets

Science, 1990, Vol. 249 (4964), pp. 58-61
Author(s): G. Gauthier, D. Nommay, J. Vercher

Brain, 1990, Vol. 113 (6), pp. 1857-1871
Author(s): Gabriel M. Gauthier, Danielle Nommay, Jean-Louis Vercher

1995, Vol. 23 (4), pp. 423-435
Author(s): Gabriel M. Gauthier, Jean-Louis Vercher, Jean Blouin

1997, Vol. 37 (6), pp. 769-774
Author(s): Paul van Donkelaar, Gabriel M. Gauthier, Jean Blouin, Jean-Louis Vercher

1995, pp. 550-553
Author(s): Gabriel M. Gauthier, Jean-Louis Vercher, Jean Blouin

Author(s): P. Trusheim, C. Heipke

Abstract. Localization is one of the first steps in navigation. Due in particular to the rapid development of automated driving, precise and reliable localization has become essential. In this paper, we report an investigation of the use of dynamic ground control points (GCPs) for visual localization in an automotive environment. Instead of having fixed positions, dynamic GCPs move together with the camera. As a measure of quality, we employ the precision of the bundle adjustment results. In our experiments, we simulate and investigate different realistic traffic scenarios. After investigating the role of tie points, we compare an approach using dynamic GCPs to one using static GCPs, to answer the question of how a comparable precision can be reached for visual localization. We show that in our scenario, in which two dynamic GCPs move together with the camera, results similar to those obtained with a number of static GCPs distributed over the whole trajectory are indeed achieved. In another experiment, we take a closer look at sliding-window bundle adjustment. Sliding windows make it possible to work with an arbitrarily large number of images while still obtaining near-real-time results. We investigate this approach in combination with dynamic GCPs and vary the number of images per window.
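The sliding-window scheme described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' implementation: the image sequence is partitioned into overlapping fixed-size windows, and a placeholder per-window "adjustment" stands in for the nonlinear refinement of camera poses and points in a real bundle adjustment. The function names and window parameters are hypothetical.

```python
# Sketch of sliding-window processing for bundle adjustment (assumed
# scheme, not the paper's code): bounding the number of images per
# window keeps each adjustment cheap enough for near-real-time use.

def sliding_windows(n_images, window_size, step):
    """Yield (start, end) index ranges that cover n_images images,
    advancing by `step` so consecutive windows overlap."""
    start = 0
    while start < n_images:
        end = min(start + window_size, n_images)
        yield (start, end)
        if end == n_images:
            break
        start += step

def adjust_window(observations):
    """Placeholder for a per-window bundle adjustment. Here it is just
    a least-squares mean of scalar observations, standing in for the
    refinement of poses and 3D points from image measurements."""
    return sum(observations) / len(observations)

# Hypothetical usage: 10 "images" (scalar observations for brevity),
# windows of 4 images advancing by 2, i.e. 50% overlap.
obs = [float(i) for i in range(10)]
results = [adjust_window(obs[s:e])
           for s, e in sliding_windows(len(obs), 4, 2)]
# Four windows are produced: (0,4), (2,6), (4,8), (6,10).
```

The overlap between windows is what lets estimates propagate along the trajectory; varying `window_size` trades precision against runtime, which mirrors the paper's experiment on the number of images per window.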


Perception ◽  
1989 ◽  
Vol 18 (1) ◽  
pp. 93-104 ◽  
Author(s):  
Stefan Mateeff ◽  
Joachim Hohnsbein

Subjects used eye movements to pursue a light target that moved from left to right with a velocity of 15 deg/s. The stimulus was a sudden five-fold decrease in target intensity during the movement. The subject's task was to localize the stimulus relative to either a single stationary background point or the midpoint between two points (28 deg apart) placed 0.5 deg above the target path. The stimulus was usually mislocated in the direction of eye movement; the mislocation was affected by the spatial adjacency between background and stimulus. When an auditory, rather than a visual, stimulus was presented during tracking, the target position at the time of stimulus presentation was visually mislocated in the direction opposite to that of the eye movement. The effect of adjacency between background and target remained the same. The involvement of processes of subject-relative and object-relative visual perception is discussed.

