Navigation Depends on Enduring Allocentric Representations

2004 ◽  
Author(s):  
Weimin Mou ◽  
Timothy P. McNamara

2011 ◽  
Vol 366 (1564) ◽  
pp. 596-610 ◽  
Author(s):  
Benjamin W. Tatler ◽  
Michael F. Land

One of the paradoxes of vision is that the world as it appears to us and the image on the retina at any moment are not much like each other. The visual world seems to be extensive and continuous across time. However, the manner in which we sample the visual environment is neither extensive nor continuous. How does the brain reconcile these differences? Here, we consider existing evidence from both static and dynamic viewing paradigms together with the logical requirements of any representational scheme that would be able to support active behaviour. While static scene viewing paradigms favour extensive, but perhaps abstracted, memory representations, dynamic settings suggest sparser and task-selective representation. We suggest that in dynamic settings where movement within extended environments is required to complete a task, visual input combines with egocentric and allocentric representations to allow efficient behaviour. The egocentric model serves as a coding scheme in which actions can be planned, but also offers a potential means of providing the perceptual stability that we experience.


2020 ◽  
Vol 6 (8) ◽  
pp. eaaz2322 ◽  
Author(s):  
Andrew S. Alexander ◽  
Lucas C. Carstensen ◽  
James R. Hinman ◽  
Florian Raudies ◽  
G. William Chapman ◽  
...  

The retrosplenial cortex is reciprocally connected with multiple structures implicated in spatial cognition, and damage to the region itself produces numerous spatial impairments. Here, we sought to characterize spatial correlates of neurons within the region during free exploration in two-dimensional environments. We report that a large percentage of retrosplenial cortex neurons have spatial receptive fields that are active when environmental boundaries are positioned at a specific orientation and distance relative to the animal itself. We demonstrate that this vector-based location signal is encoded in egocentric coordinates, is localized to the dysgranular retrosplenial subregion, is independent of self-motion, and is context invariant. Further, we identify a subpopulation of neurons with this response property that are synchronized with the hippocampal theta oscillation. Accordingly, the current work identifies a robust egocentric spatial code in retrosplenial cortex that can facilitate spatial coordinate system transformations and support the anchoring, generation, and utilization of allocentric representations.
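The egocentric boundary-vector code described above implies a simple coordinate transformation: a boundary encoded at a given bearing and distance relative to the animal can be mapped into allocentric (arena-frame) coordinates once the animal's position and head direction are known. A minimal sketch of that transform (an illustration of the geometry only, not the authors' analysis code; all names are hypothetical):

```python
import math

def egocentric_to_allocentric(pos, heading, bearing, distance):
    """Map an egocentric boundary vector into allocentric coordinates.

    pos      -- (x, y) animal position in the arena frame
    heading  -- allocentric head direction, in radians
    bearing  -- egocentric angle to the boundary, in radians (0 = straight ahead)
    distance -- egocentric distance to the boundary
    """
    x, y = pos
    # Rotating the egocentric vector by the head direction places it
    # in the arena frame; translating by position anchors it there.
    angle = heading + bearing
    return (x + distance * math.cos(angle), y + distance * math.sin(angle))
```

For example, an animal at (1, 0) facing "north" (heading π/2) with a boundary straight ahead at distance 2 would place that boundary at the allocentric point (1, 2).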


PLoS Biology ◽  
2017 ◽  
Vol 15 (6) ◽  
pp. e2001878 ◽  
Author(s):  
Stephen M. Town ◽  
W. Owen Brimijoin ◽  
Jennifer K. Bizley

2019 ◽  
Author(s):  
Andrew S. Alexander ◽  
Lucas C. Carstensen ◽  
James R. Hinman ◽  
Florian Raudies ◽  
G. William Chapman ◽  
...  

The retrosplenial cortex is reciprocally connected with a majority of structures implicated in spatial cognition and damage to the region itself produces numerous spatial impairments. However, in many ways the retrosplenial cortex remains understudied. Here, we sought to characterize spatial correlates of neurons within the region during free exploration in two-dimensional environments. We report that a large percentage of retrosplenial cortex neurons have spatial receptive fields that are active when environmental boundaries are positioned at a specific orientation and distance relative to the animal itself. We demonstrate that this vector-based location signal is encoded in egocentric coordinates, localized to the dysgranular retrosplenial sub-region, independent of self-motion, and context invariant. Further, we identify a sub-population of neurons with this response property that are synchronized with the hippocampal theta oscillation. Accordingly, the current work identifies a robust egocentric spatial code in retrosplenial cortex that can facilitate spatial coordinate system transformations and support the anchoring, generation, and utilization of allocentric representations.


2002 ◽  
Vol 147 (4) ◽  
pp. 426-436 ◽  
Author(s):  
Carrozzo M. ◽  
Stratta F. ◽  
McIntyre J. ◽  
Lacquaniti F.

Behaviour ◽  
2015 ◽  
Vol 152 (3-4) ◽  
pp. 375-406 ◽  
Author(s):  
Alexandra G. Rosati

Primates must solve complex spatial problems when foraging, such as finding patchy resources and navigating between different locations. However, the nature of the cognitive representations supporting these types of behaviors is currently unclear. In humans, there has been great debate concerning the relative importance of egocentric representations (which are viewer-dependent) versus allocentric representations (which are based on aspects of the external environment). Comparative studies of nonhuman apes can illuminate which aspects of human spatial cognition are shared with other primates, versus which aspects are unique to our lineage. The current studies therefore examined spatial cognitive development in one of our closest living relatives, bonobos (Pan paniscus), across contexts. The first study assessed how younger bonobos encode locations in a place-response task in which apes first learn that one of two locations is consistently baited with a reward, and then must approach the two locations from a flipped perspective. The second study examined how a larger age sample of bonobos responded to a spatial relations task in which they first experience that one location is baited, and then can generalize this learning to a new set of targets. Results indicated that while bonobos exhibited a predominantly allocentric strategy in the first study, they consistently exhibited an egocentric strategy in the second. Together, these results show that bonobos can use both strategies to encode spatial information, and illuminate the complementary contributions to cognition made by egocentric and allocentric representations.


2020 ◽  
pp. 1-18 ◽  
Author(s):  
Sam C. Berens ◽  
Bárður H. Joensen ◽  
Aidan J. Horner

Scene-selective regions of the human brain form allocentric representations of locations in our environment. These representations are independent of heading direction and allow us to know where we are regardless of our direction of travel. However, we know little about how these location-based representations are formed. Using fMRI representational similarity analysis and linear mixed models, we tracked the emergence of location-based representations in scene-selective brain regions. We estimated patterns of activity for two distinct scenes, taken before and after participants learnt they were from the same location. During a learning phase, we presented participants with two types of panoramic videos: (1) an overlap video condition displaying two distinct scenes (0° and 180°) from the same location and (2) a no-overlap video displaying two distinct scenes from different locations (which served as a control condition). In the parahippocampal cortex (PHC) and retrosplenial cortex (RSC), representations of scenes from the same location became more similar to each other only after they had been shown in the overlap condition, suggesting the emergence of viewpoint-independent location-based representations. Whereas these representations emerged in the PHC regardless of task performance, RSC representations only emerged for locations where participants could behaviorally identify the two scenes as belonging to the same location. The results suggest that we can track the emergence of location-based representations in the PHC and RSC in a single fMRI experiment. Further, they support computational models that propose the RSC plays a key role in transforming viewpoint-independent representations into behaviorally relevant representations of specific viewpoints.
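The representational similarity logic used here can be sketched numerically: a location-based representation is inferred when the voxel activity patterns evoked by two views of the same location become more correlated after learning. A minimal illustration with synthetic data (variable names and values are hypothetical, not taken from the study):

```python
import numpy as np

def pattern_similarity(a, b):
    """Pearson correlation between two voxel activity patterns --
    the similarity measure commonly used in fMRI RSA."""
    return float(np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1])

# Synthetic voxel patterns for the 0-degree and 180-degree scenes.
# After the overlap condition, a rise in this correlation (relative to
# pre-learning and to no-overlap controls) would indicate the emergence
# of a viewpoint-independent, location-based representation.
scene_0deg = [0.8, 0.1, 0.5, 0.9]
scene_180deg = [0.7, 0.2, 0.4, 1.0]
similarity = pattern_similarity(scene_0deg, scene_180deg)
```

In the actual analysis, such pairwise similarities would be entered into linear mixed models across participants, regions, and learning conditions.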

