Perisaccadic Stereopsis from Zero Retinal Disparity

2010 ◽  
Vol 10 (7) ◽  
pp. 331-331
Author(s):  
Z.-L. Zhang ◽  
C. Cantor ◽  
C. Schor
1993 ◽  
Vol 2 (1) ◽  
pp. 34-43 ◽  
Author(s):  
Larry F. Hodges ◽  
Elizabeth Thorpe Davis

We examine the relationship among the different geometries implicit in a stereoscopic virtual environment. In particular, we examine in detail the relationship of retinal disparity, fixation point, binocular visual direction, and screen parallax. We introduce the concept of a volumetric spatial unit called a stereoscopic voxel. Due to the shape of stereoscopic voxels, apparent depth of points in space may be affected by their horizontal placement.
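The screen-parallax geometry the abstract refers to can be sketched with a similar-triangles calculation. The function below, its parameter names, and the 65 mm interpupillary distance are illustrative assumptions, not taken from Hodges and Davis:

```python
def screen_parallax(ipd_mm, screen_dist_mm, point_dist_mm):
    """Horizontal screen parallax for a point on the midline.

    Positive parallax: the point lies behind the screen (uncrossed
    disparity); negative: in front of it (crossed). By similar
    triangles, parallax approaches the IPD as the point recedes.
    """
    return ipd_mm * (point_dist_mm - screen_dist_mm) / point_dist_mm

# A point on the screen plane needs zero parallax:
print(screen_parallax(65, 600, 600))   # 0.0
# A point twice as far as the screen needs half the IPD:
print(screen_parallax(65, 600, 1200))  # 32.5
```

Because parallax is quantized to whole pixels on a real display, equal parallax steps map to unequal depth steps, which is one way to motivate the paper's volumetric "stereoscopic voxel".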


1989 ◽  
Vol 33 (2) ◽  
pp. 23-27 ◽  
Author(s):  
Thomas C. Way

Sixteen military pilots flew simulated air-to-air and air-to-ground missions in a simulated fighter-attack cockpit. Three of the five color CRTs in the cockpit were capable of displaying retinal disparity and the major independent variable was presence or absence of disparity. Performance, workload, and opinion data were collected. A second objective of the study was to continue development of the display formats, which had evolved through earlier projects. The disparity results and the recommended format revisions are presented.


Perception ◽  
1994 ◽  
Vol 23 (9) ◽  
pp. 1037-1048 ◽  
Author(s):  
Sachio Nakamizo ◽  
Koichi Shimono ◽  
Michiaki Kondo ◽  
Hiroshi Ono

Visual directions of the two stimuli in Panum's limiting case with different interstimulus and convergence distances confirmed the predictions of the reformulated Wells-Hering laws of visual direction. In experiment 1, six observers each converged on the midpoint of the interstimulus axis at 30, 60, and 90 cm from the eyes and adjusted a probe on the fixation plane to be in the same visual direction as each stimulus. The visual direction of the far stimulus was always nonveridical, whereas that of the near stimulus was veridical only when its retinal disparity was small. In experiment 2, three observers each converged on the intersection of the midsagittal plane and (a) the frontoparallel plane of the near stimulus, (b) that of the midpoint between the two stimuli, or (c) that of the far stimulus. The midpoint of the interstimulus axis was 60 cm from the eyes. The visual direction of the far stimulus was veridical only with convergence at the far plane. The visual direction of the near stimulus was veridical with convergence at the near plane and, only when its retinal disparity was small, with convergence at the two other planes.
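A minimal sketch of the direction prediction being tested: under the Wells-Hering laws, the perceived visual direction of a fused binocular target is, to a first approximation, the average of its two oculocentric directions referred to a cyclopean eye midway between the eyes. The function and the example angles are illustrative, not values from the paper:

```python
def cyclopean_direction(left_eye_deg, right_eye_deg):
    """Predicted visual direction (deg, negative = left) of a fused
    binocular target: the mean of its two oculocentric directions."""
    return (left_eye_deg + right_eye_deg) / 2.0

# A target imaged 2 deg left of the fovea in one eye and 1 deg right
# in the other (carrying 3 deg of disparity) is predicted midway,
# at 0.5 deg left -- generally nonveridical for targets off the
# fixation plane, which is what the experiments probe:
print(cyclopean_direction(-2.0, 1.0))  # -0.5
```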


Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 152-152
Author(s):  
K Susami ◽  
H Kaneko ◽  
H Ashida

While an object is moving in depth, its retinal disparity and its size change cooperatively with viewing distance. We examined the effect of this cooperative relation between the two cues to motion in depth (changes in disparity and changes in size) with a Wheatstone stereoscopic display. In experiment 1, we used a stereoscopic stimulus whose disparity and size were modulated sinusoidally and independently, at slightly different frequencies (0.7 Hz vs 0.8 Hz, and vice versa), so that cooperative and uncooperative phase relations between the two cues alternated. The subjects held down a response key whenever the stimulus was clearly perceived to be moving in depth. In general, the impression of motion in depth was clear when the two cues were modulated in similar phase, but not when they were in different phase. In experiment 2, we measured the perceived distance of a stimulus moving in depth when the two cues were modulated in the same phase and in counterphase. The perceived distance increased when the two cues moved in the same phase. We found that not only each cue on its own, but also the cooperative change of the two cues, affected the perception of motion in depth. These results suggest that the cooperative interaction of the two cues, that is, their relative phase, is important for the perception of motion in depth.
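The "cooperative relation" the abstract appeals to can be illustrated with standard viewing geometry: for a real object receding along the midline, angular size and binocular subtense (and hence absolute disparity) shrink together with distance. The symbols and the 0.10 m object size and 0.065 m interocular separation below are illustrative assumptions:

```python
import math

def angular_size(obj_size_m, dist_m):
    """Angle (rad) subtended by an object of linear size obj_size_m."""
    return 2 * math.atan(obj_size_m / (2 * dist_m))

def binocular_subtense(interocular_m, dist_m):
    """Convergence angle (rad) for a point at dist_m on the midline;
    absolute disparity tracks changes in this angle."""
    return 2 * math.atan(interocular_m / (2 * dist_m))

# As the object recedes from 0.5 m to 1.0 m, both cues shrink
# together -- the cooperative change the experiments manipulate:
for d in (0.5, 1.0):
    print(d, angular_size(0.10, d), binocular_subtense(0.065, d))
```

Driving the two cues at different frequencies, as in experiment 1, breaks exactly this co-variation during the out-of-phase intervals.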


2008 ◽  
Vol 8 (16) ◽  
pp. 3-3 ◽  
Author(s):  
G. Blohm ◽  
A. Z. Khan ◽  
L. Ren ◽  
K. M. Schreiber ◽  
J. D. Crawford

1980 ◽  
Vol 32 (3) ◽  
pp. 387-395 ◽  
Author(s):  
M. J. Morgan ◽  
Roger Ward

Brief apparent motion sequences were introduced into a dynamic visual dot display by spatially shifting selected dots between successive frames. This causes the display to look as if it is drifting continuously in one direction. When such a display is observed with an interocular delay, the drifting dots appear to be displaced in depth, even though there is no conventional retinal disparity in the display. We found that the magnitude of this depth shift increased with the duration of the apparent motion sequences. With sequences of five or more frames' duration the depth effect was very similar to that which would have been predicted for a continuously moving target. With briefer sequences the size of the depth effect decreased rapidly. We suggest that apparent motion cascades form the basis of Tyler's dynamic visual noise stereophenomenon, and we question his "random spatial disparity" hypothesis.
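The prediction for a continuously moving target mentioned above follows the classical Pulfrich geometry: an interocular delay converts motion into an equivalent disparity. The function and the example speed and delay are illustrative, not values from Morgan and Ward:

```python
def equivalent_disparity(speed_deg_per_s, delay_s):
    """Disparity (deg) mimicked by viewing a target moving at a given
    speed with one eye's image delayed: the delayed eye sees the
    target where it was delay_s seconds earlier."""
    return speed_deg_per_s * delay_s

# A dot drifting at 5 deg/s viewed with a 20 ms interocular delay
# behaves like a static target carrying about 0.1 deg of disparity:
print(equivalent_disparity(5.0, 0.020))
```

The finding that brief apparent-motion cascades approach this prediction only at five or more frames is what motivates the authors' challenge to the "random spatial disparity" account.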


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Arvind Chandna ◽  
Jeremy Badler ◽  
Devashish Singh ◽  
Scott Watamaniuk ◽  
Stephen Heinen

To view approaching objects clearly, the eyes rotate inward (vergence) and the intraocular lenses focus (accommodation). Current ocular control models assume both eyes are driven by a unitary vergence command and a unitary accommodation command that causally interact. The models typically describe discrete gaze shifts to non-accommodative targets performed under laboratory conditions. We probe these unitary signals using a physical stimulus moving in depth on the midline while recording vergence and accommodation simultaneously from both eyes in normal observers. Under monocular viewing, retinal disparity is removed, leaving only monocular cues for interpreting the object's motion in depth. The viewing eye always followed the target's motion. However, the occluded eye did not follow the target and, surprisingly, rotated out of phase with it. In contrast, accommodation in both eyes was synchronized with the target under monocular viewing. The results challenge existing theories of a unitary vergence command and of a causal accommodation-vergence linkage.


1968 ◽  
Vol 26 (2) ◽  
pp. 367-370 ◽  
Author(s):  
Robert H. Cormack ◽  
Ruth Arger

Necker cube reversal rate and dominance of cube orientation were measured for 60 Ss under three conditions of disparity, both with and without steady fixation. Passive instructions were given. Retinal disparity mildly increased the dominance of the orientation consonant with the disparity. This effect was greater without a fixation point. Retinal disparity affected the reversal rate, but the presence or absence of a fixation point did not.


2018 ◽  
Author(s):  
Ronny Rosner ◽  
Joss von Hadeln ◽  
Ghaith Tarawneh ◽  
Jenny C. A. Read

A puzzle for neuroscience, and for robotics, is how insects achieve surprisingly complex behaviours with such tiny brains [1,2]. One example is depth perception via binocular stereopsis in the praying mantis, a predatory insect. Praying mantids use stereopsis, the computation of distance from disparities between the two retinal images, to trigger a raptorial strike of their forelegs [3,4] when prey is within reach. The neuronal basis of this ability is entirely unknown. From behavioural evidence, one view is that the mantis brain must measure retinal disparity locally across a range of distances and eccentricities [4-7], much like disparity-tuned neurons in vertebrate visual cortex [8]. Sceptics argue that this "retinal disparity hypothesis" implies far too many specialised neurons for such a tiny brain [9]. Here we present the first evidence that individual neurons in the praying mantis brain are indeed tuned to specific disparities and eccentricities, and thus to locations in 3D space. This disparity information is transmitted to the central brain by neurons connecting peripheral visual areas in both hemispheres, as well as by a unilateral neuron type. Like disparity-tuned cortical cells in vertebrates, the responses of these mantis neurons are consistent with linear summation of binocular inputs followed by an output nonlinearity [10]. Additionally, centrifugal neurons project disparity information back from the central brain to early visual areas, possibly for gain modulation or 3D spatial attention. Thus, our study not only proves the retinal disparity hypothesis for insects, it reveals feedback connections hitherto undiscovered in any animal species.
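The response model the abstract describes, linear summation of binocular inputs followed by an output nonlinearity, can be sketched in its simplest form with a half-squaring nonlinearity, as is common for models of disparity-tuned cortical neurons. The function and the example drives are illustrative, not fitted to the mantis data:

```python
def binocular_response(left_drive, right_drive):
    """Linear binocular summation followed by a static output
    nonlinearity (half-squaring): rectify the sum, then square it."""
    linear_sum = left_drive + right_drive
    return max(0.0, linear_sum) ** 2

# Matched (in-phase) inputs are amplified supra-additively relative
# to either eye alone, which is what gives such units their
# selectivity for particular disparities:
print(binocular_response(1.0, 1.0))   # 4.0
print(binocular_response(1.0, -1.0))  # 0.0
print(binocular_response(1.0, 0.0))   # 1.0
```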

