Optic flow and self-motion information during real-world locomotion

2017 ◽  
Vol 17 (10) ◽  
pp. 211
Author(s):  
Jonathan Matthis ◽  
Karl Muller ◽  
Kathryn Bonnen ◽  
Mary Hayhoe
2021 ◽  
Author(s):  
Yue Zhang ◽  
Ruoyu Huang ◽  
Wiebke Nörenberg ◽  
Aristides Arrenberg

The perception of optic flow is essential for any visually guided behavior of a moving animal. To mechanistically predict behavior and understand the emergence of self-motion perception in vertebrate brains, it is necessary to systematically characterize the motion receptive fields (RFs) of optic flow processing neurons. Here, we present the fine-scale RFs of thousands of motion-sensitive neurons studied in the diencephalon and the midbrain of zebrafish. We found neurons that serve as linear filters and robustly encode directional and speed information of translation-induced optic flow. These neurons are topographically arranged in the pretectum according to translation direction. The unambiguous encoding of translation enables the decomposition of translational and rotational self-motion information from mixed optic flow. In behavioral experiments, we demonstrated the predicted decomposition in the optokinetic and optomotor responses. Together, our study reveals the algorithm and the neural implementation for self-motion estimation in a vertebrate visual system.
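The decomposition described above can be illustrated with a toy model: on the unit sphere, instantaneous optic flow is linear in the translation and rotation parameters once depth is fixed, so a mixed field sampled at many directions can be split by least squares. Everything below (directions, depths, motion values) is an assumption for illustration, not the study's stimuli or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sampling of viewing directions over the full visual sphere
d = rng.normal(size=(500, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)

def skew(v):
    # Matrix form of the cross product: skew(v) @ x == np.cross(v, x)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def optic_flow(d, T, omega, Z=1.0):
    # Instantaneous flow on the unit sphere: a depth-scaled translational
    # term plus a depth-independent rotational term (uniform depth assumed).
    trans = (d * (d @ T)[:, None] - T) / Z
    rot = np.cross(d, omega)  # equals -omega x d
    return trans + rot

T_true = np.array([0.3, 0.0, 1.0])    # translation (forward + sideways)
w_true = np.array([0.0, 0.1, 0.05])   # rotation (rad/s)
mixed = optic_flow(d, T_true, w_true)

# Each flow sample is linear in (T, omega), so the mixed field can be
# decomposed by least squares. Depth is fixed at Z = 1 here; in general
# only T/Z is recoverable (the usual scale ambiguity of visual odometry).
A = np.zeros((3 * len(d), 6))
for i, di in enumerate(d):
    A[3*i:3*i+3, :3] = np.outer(di, di) - np.eye(3)  # translation columns
    A[3*i:3*i+3, 3:] = skew(di)                      # rotation columns
params = np.linalg.lstsq(A, mixed.ravel(), rcond=None)[0]
T_hat, w_hat = params[:3], params[3:]
```

With a full-sphere sample the translational and rotational fields are linearly independent, so the recovery is exact up to numerical precision.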


2001 ◽  
Vol 85 (2) ◽  
pp. 724-734 ◽  
Author(s):  
Holger G. Krapp ◽  
Roland Hengstenberg ◽  
Martin Egelhaaf

Integrating binocular motion information tunes wide-field direction-selective neurons in the fly optic lobe to respond preferentially to specific optic flow fields. This is shown by measuring the local preferred directions (LPDs) and local motion sensitivities (LMSs) at many positions within the receptive fields of three types of anatomically identifiable lobula plate tangential neurons: the three horizontal system (HS) neurons, the two centrifugal horizontal (CH) neurons, and three heterolateral connecting elements. The latter impart to two of the HS and to both CH neurons a sensitivity to motion from the contralateral visual field. Thus in two HS neurons and both CH neurons, the response field comprises part of the ipsi- and contralateral visual hemispheres. The distributions of LPDs within the binocular response fields of each neuron show marked similarities to the optic flow fields created by particular types of self-movements of the fly. Based on the characteristic distributions of local preferred directions and motion sensitivities within the response fields, the functional role of the respective neurons in the context of behaviorally relevant processing of visual wide-field motion is discussed.
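The matched-filter interpretation of such response fields can be sketched numerically: a neuron whose local preferred directions (LPDs) align with the flow field of one self-motion will correlate strongly with that field's template and weakly with others. The templates here are random stand-ins, not measured fly flow fields.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200

def unit(v):
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# Hypothetical local flow directions induced by two candidate self-motions,
# sampled at n positions in the visual field (purely illustrative fields)
roll_template = unit(rng.normal(size=(n, 2)))
lift_template = unit(rng.normal(size=(n, 2)))

# A neuron whose local preferred directions resemble the "roll" field
lpd = unit(roll_template + 0.3 * rng.normal(size=(n, 2)))

def matched_filter(lpd, template):
    # Mean cosine similarity between LPDs and a self-motion flow template
    return float(np.mean(np.sum(lpd * template, axis=1)))
```

Under this sketch, `matched_filter(lpd, roll_template)` is near 1 while the mismatched template scores near 0, which is the sense in which such neurons are "tuned" to particular optic flow fields.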


2010 ◽  
Vol 103 (4) ◽  
pp. 1865-1873 ◽  
Author(s):  
Tao Zhang ◽  
Kenneth H. Britten

The ventral intraparietal area (VIP) of the macaque monkey is thought to be involved in judging heading direction based on optic flow. We recorded neuronal discharges in VIP while monkeys were performing a two-alternative, forced-choice heading discrimination task to relate quantitatively the activity of VIP neurons to monkeys' perceptual choices. Most VIP neurons were responsive to simulated heading stimuli and were tuned such that their responses changed across a range of forward trajectories. Using receiver operating characteristic (ROC) analysis, we found that most VIP neurons were less sensitive to small heading changes than was the monkey, although a minority of neurons were equally sensitive. Pursuit eye movements modestly yet significantly increased both neuronal and behavioral thresholds by approximately the same amount. Our results support the view that VIP activity is involved in self-motion judgments.
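The ROC comparison of neuronal and behavioral sensitivity can be sketched as follows. The spike counts and headings are hypothetical, and `roc_area` is an illustrative stand-in for the analysis, computing the probability that an ideal observer reading the neuron would choose correctly.

```python
import numpy as np

rng = np.random.default_rng(1)

def roc_area(a, b):
    # Area under the ROC curve: probability that a random draw from b
    # exceeds one from a (ties count half), by brute-force comparison.
    a = np.asarray(a)[:, None]
    b = np.asarray(b)[None, :]
    return float((b > a).mean() + 0.5 * (b == a).mean())

# Hypothetical spike counts on trials of two nearby simulated headings
resp_a = rng.poisson(20, size=200)  # e.g. heading 2 deg left of straight ahead
resp_b = rng.poisson(24, size=200)  # e.g. heading 2 deg right

neurometric = roc_area(resp_a, resp_b)  # ideal-observer discrimination
```

Sweeping the heading separation and finding where this area crosses a criterion (e.g. 0.82) yields a neurometric threshold that can be compared with the monkey's psychometric threshold, which is the quantitative comparison the study makes.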


2010 ◽  
Vol 8 (6) ◽  
pp. 1155
Author(s):  
J. Saunders ◽  
F. Durgin

i-Perception ◽  
2021 ◽  
Vol 12 (6) ◽  
pp. 204166952110557
Author(s):  
Diederick C. Niehorster

The concept of optic flow, a global pattern of visual motion that is both caused by and signals self-motion, is canonically ascribed to James Gibson's 1950 book “The Perception of the Visual World”. There have, however, been several other developments of this concept, chiefly by Gwilym Grindley and Edward Calvert. Based on rarely referenced scientific literature and archival research, this article describes the development of the concept of optic flow by the aforementioned authors and several others. The article furthermore presents the available evidence for interactions between these authors, focusing on whether parts of Gibson's proposal were derived from the work of Grindley or Calvert. While Grindley's work may have made Gibson aware of the geometrical facts of optic flow, Gibson's work is not derivative of Grindley's. It is furthermore shown that Gibson learned of Calvert's work only in 1956, almost a decade after Gibson first published his proposal. In conclusion, the development of the concept of optic flow presents an intriguing example of convergent thought in the progress of science.


2020 ◽  
Author(s):  
Marvin Chancán

Visual navigation tasks in real-world environments often require both self-motion and place recognition feedback. While deep reinforcement learning has shown success in solving these perception and decision-making problems in an end-to-end manner, such algorithms require large amounts of experience to learn navigation policies from high-dimensional data, which is generally impractical for real robots due to sample complexity. In this paper, we address these problems with two main contributions. First, we leverage place recognition and deep learning techniques, combined with goal destination feedback, to generate compact, bimodal image representations that can then be used to learn control policies effectively from a small amount of experience. Second, we present an interactive framework, CityLearn, that enables, for the first time, training and deployment of navigation algorithms across city-sized, realistic environments with extreme visual appearance changes. CityLearn features more than 10 benchmark datasets, often used in visual place recognition and autonomous driving research, including over 100 recorded traversals across 60 cities around the world. We evaluate our approach on two CityLearn environments, training our navigation policy on a single traversal. Results show that our method can be over two orders of magnitude faster than when using raw images, and can also generalize across extreme visual changes, including day-to-night and summer-to-winter transitions.
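The "compact, bimodal image representation" idea can be sketched in a few lines: a learned place-recognition embedding of the current frame is concatenated with goal-destination feedback, and a policy operates on that small vector instead of raw pixels. All names, sizes, and the toy policy below are assumptions for illustration, not CityLearn's actual interface.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for a place-recognition network's output for the current frame
place_embedding = rng.normal(size=64)

# Goal-destination feedback encoded as a one-hot cue (assumed encoding)
goal_feedback = np.zeros(16)
goal_feedback[5] = 1.0

# The bimodal observation: far smaller than a raw image
obs = np.concatenate([place_embedding, goal_feedback])

# A toy linear policy over the compact input, with 3 discrete actions
W = rng.normal(scale=0.1, size=(3, obs.size))
action = int(np.argmax(W @ obs))
```

The sample-efficiency claim rests on this compression: a policy over an 80-dimensional vector needs far less experience than one over raw high-dimensional frames.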


2018 ◽  
Vol 115 (7) ◽  
pp. E1637-E1646 ◽  
Author(s):  
Tale L. Bjerknes ◽  
Nenitha C. Dagslott ◽  
Edvard I. Moser ◽  
May-Britt Moser

Place cells in the hippocampus and grid cells in the medial entorhinal cortex rely on self-motion information and path integration for spatially confined firing. Place cells can be observed in young rats as soon as they leave their nest at around 2.5 wk of postnatal life. In contrast, the regularly spaced firing of grid cells develops only after weaning, during the fourth week. In the present study, we sought to determine whether place cells are able to integrate self-motion information before maturation of the grid-cell system. Place cells were recorded on a 200-cm linear track while preweaning, postweaning, and adult rats ran on successive trials from a start wall to a box at the end of a linear track. The position of the start wall was altered in the middle of the trial sequence. When recordings were made in complete darkness, place cells maintained fields at a fixed distance from the start wall regardless of the age of the animal. When lights were on, place fields were determined primarily by external landmarks, except at the very beginning of the track. This shift was observed in both young and adult animals. The results suggest that preweaning rats are able to calculate distances based on information from self-motion before the grid-cell system has matured to its full extent.
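The path-integration computation implied here is just a running integral of self-motion: distance from the start wall accumulates from speed, so a field anchored to that integral moves with the wall. The numbers below are hypothetical, with the track length matching the 200 cm track in the study.

```python
import numpy as np

# Self-motion odometry sketch: distance from the start wall is the
# running integral of speed (hypothetical samples, 200 cm track)
dt = 0.05                      # s per sample
speed = np.full(80, 50.0)      # cm/s, constant running speed
distance = np.cumsum(speed * dt)

# A place field anchored by path integration fires at a fixed
# integrated distance, wherever the start wall is placed
field_center = 120.0           # cm (hypothetical)
in_field = np.abs(distance - field_center) < 10.0
```

Shifting the start wall shifts where `distance` begins accumulating, so in darkness the field moves with the wall, exactly the signature the recordings show.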


2020 ◽  
Vol 117 (27) ◽  
pp. 16065-16071 ◽  
Author(s):  
Yuli Wu ◽  
Kepu Chen ◽  
Yuting Ye ◽  
Tao Zhang ◽  
Wen Zhou

Human navigation relies on inputs to our paired eyes and ears. Although we also have two nasal passages, there has been little empirical indication that internostril differences yield directionality in human olfaction without involving the trigeminal system. By using optic flow that captures the pattern of apparent motion of surface elements in a visual scene, we demonstrate through formal psychophysical testing that a moderate binaral concentration disparity of a nontrigeminal odorant consistently biases recipients’ perceived direction of self-motion toward the higher-concentration side, even though they cannot verbalize which nostril smells a stronger odor. We further show that the effect depends on the internostril ratio of odor concentrations, not the numeric difference in concentration between the two nostrils. Taken together, our findings provide behavioral evidence that humans smell in stereo and subconsciously utilize stereo olfactory cues in spatial navigation.
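The ratio-versus-difference distinction is simple arithmetic worth making concrete: two nostril pairs can share the same numeric concentration difference yet differ in ratio, and it is the ratio that the reported bias follows. The concentration values below are hypothetical units, not the study's stimuli.

```python
# Internostril comparison in hypothetical concentration units
def conc_ratio(a, b):
    return max(a, b) / min(a, b)

def conc_diff(a, b):
    return abs(a - b)

low_pair = (2.0, 4.0)     # difference 2, ratio 2.0
high_pair = (10.0, 12.0)  # difference 2, ratio 1.2
```

Under the study's finding, `low_pair` would bias perceived self-motion more strongly than `high_pair` despite the identical numeric difference.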


1998 ◽  
Vol 79 (3) ◽  
pp. 1461-1480 ◽  
Author(s):  
Markus Lappe ◽  
Martin Pekel ◽  
Klaus-Peter Hoffmann

Lappe, Markus, Martin Pekel, and Klaus-Peter Hoffmann. Optokinetic eye movements elicited by radial optic flow in the macaque monkey. J. Neurophysiol. 79: 1461–1480, 1998. We recorded spontaneous eye movements elicited by radial optic flow in three macaque monkeys using the scleral search coil technique. Computer-generated stimuli simulated forward or backward motion of the monkey with respect to a number of small illuminated dots arranged on a virtual ground plane. We wanted to see whether optokinetic eye movements are induced by radial optic flow stimuli that simulate self-movement, quantify their parameters, and consider their effects on the processing of optic flow. A regular pattern of interchanging fast and slow eye movements with a frequency of 2 Hz was observed. When we shifted the horizontal position of the focus of expansion (FOE) during simulated forward motion (expansional optic flow), median horizontal eye position also shifted in the same direction but only by a smaller amount; for simulated backward motion (contractional optic flow), median eye position shifted in the opposite direction. We relate this to a change in Schlagfeld typically observed in optokinetic nystagmus. Direction and speed of slow phase eye movements were compared with the local flow field motion in gaze direction (the foveal flow). Eye movement direction matched well the foveal motion. Small systematic deviations could be attributed to an integration of the global motion pattern. Eye speed on average did not match foveal stimulus speed, as the median gain was only ∼0.5–0.6. The gain was always lower for expanding than for contracting stimuli. We analyzed the time course of the eye movement immediately after each saccade. We found remarkable differences in the initial development of gain and directional following for expansion and contraction. For expansion, directional following and gain were initially poor and strongly influenced by the ongoing eye movement before the saccade. 
This was not the case for contraction. These differences also can be linked to properties of the optokinetic system. We conclude that optokinetic eye movements can be elicited by radial optic flow fields simulating self-motion. These eye movements are linked to the parafoveal flow field, i.e., the motion in the direction of gaze. In the retinal projection of the optic flow, such eye movements superimpose retinal slip. This results in complex retinal motion patterns, especially because the gain of the eye movement is small and variable. This observation has special relevance for mechanisms that determine self-motion from retinal flow fields. It is necessary to consider the influence of eye movements in optic flow analysis, but our results suggest that direction and speed of an eye movement should be treated differently.
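The two quantities central to this analysis, speed gain and directional following, can be sketched as below. The velocities are hypothetical samples with a built-in gain near the ~0.5-0.6 reported above; this is an illustration of the measures, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 2D velocities (deg/s): foveal flow on each trial, and a
# slow-phase eye velocity that tracks it imperfectly with gain ~0.55
stim = rng.uniform(5.0, 20.0, size=(100, 2))
eye = 0.55 * stim + rng.normal(scale=0.5, size=(100, 2))

# Speed gain: median ratio of eye speed to foveal stimulus speed
speed_gain = float(np.median(np.linalg.norm(eye, axis=1) /
                             np.linalg.norm(stim, axis=1)))

# Directional following: angle between eye movement and foveal flow per trial
cosang = np.sum(eye * stim, axis=1) / (np.linalg.norm(eye, axis=1) *
                                       np.linalg.norm(stim, axis=1))
direction_error_deg = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
```

A gain well below 1 with small directional error is exactly the pattern reported: direction matches foveal flow closely while speed does not, which is why the two must be treated differently when reconstructing self-motion from retinal flow.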

