Spatiotopic perceptual maps in humans: evidence from motion adaptation

2012 · Vol 279 (1740) · pp. 3091-3097
Author(s): Marco Turi, David Burr

How our perceptual experience of the world remains stable and continuous despite frequent repositioning eye movements is very much a mystery. One possibility is that our brain actively constructs a spatiotopic representation of the world, which is anchored in external—or at least head-centred—coordinates. In this study, we show that the positional motion aftereffect (the change in apparent position after adaptation to motion) is spatially selective in external rather than retinal coordinates, whereas the classic motion aftereffect (the illusion of motion after prolonged inspection of a moving source) is selective in retinotopic coordinates. The results provide clear evidence for a spatiotopic map in humans: one which can be influenced by image motion.

2003 · Vol 65 (7) · pp. 1011-1018
Author(s): David Whitney, Patrick Cavanagh

2011 · Vol 366 (1564) · pp. 504-515
Author(s): David C. Burr, Maria Concetta Morrone

How our perceptual experience of the world remains stable and continuous in the face of continuous rapid eye movements remains a mystery. This review discusses some recent progress towards understanding the neural and psychophysical processes that accompany these eye movements. We first report recent evidence from imaging studies in humans showing that many brain regions are tuned in spatiotopic coordinates, but only for items that are actively attended. We then describe a series of experiments measuring the spatial and temporal phenomena that occur around the time of saccades, and discuss how these could be related to visual stability. Finally, we introduce the concept of the spatio-temporal receptive field to describe the local spatiotopicity exhibited by many neurons when the eyes move.


2018
Author(s): David Barner

Why did humans develop precise systems for measuring experience, like numbers, clocks, and calendars? I argue that precise representational systems were constructed by earlier generations of humans because they recognized that their noisy perceptual systems were not capturing distinctions that existed in the world. Abstract symbolic systems did not arise from perceptual representations, but instead were constructed to describe and explain perceptual experience. By analogy, I argue that when children learn number words, they do not rely on noisy perceptual systems, but instead acquire these words as units in a broader system of procedures, whose meanings are ultimately defined by logical relations to one another, not perception.


Designs · 2021 · Vol 5 (1) · pp. 8
Author(s): Pyrrhon Amathes, Paul Christodoulides

Photography can be used for pleasure and art but can also be used in many disciplines of science, because it captures the details of the moment and can serve as a proving tool due to the information it preserves. During the period of the Apollo program (1969 to 1972), the National Aeronautics and Space Administration (NASA) successfully landed humans on the Moon and showed hundreds of photos to the world presenting the travel and landings. This paper uses computer simulations and geometry to examine the authenticity of one such photo, namely Apollo 17 photo GPN-2000-00113. In addition, a novel approach is employed by creating an experimental scene to illustrate details and provide measurements. The crucial factors on which the geometrical analysis relies are locked in the photograph and are: (a) the apparent position of the Earth relative to the illustrated flag and (b) the point to which the shadow of the astronaut taking the photo reaches, in relation to the flagpole. The analysis and experimental data show geometrical and time mismatches, proving that the photo is a composite.
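The abstract does not reproduce the paper's measurements, but the basic shadow-geometry relation such an analysis rests on is simple: on flat ground with no atmosphere, a vertical object of height h lit by the Sun at elevation angle e casts a shadow of length h / tan(e). A minimal sketch, using hypothetical numbers (the astronaut height and sun elevation below are illustrative assumptions, not the paper's data):

```python
import math

def shadow_length(object_height_m: float, sun_elevation_deg: float) -> float:
    """Length of the shadow cast on flat ground by a vertical object.

    With sunlight arriving at elevation angle e above the horizon,
    similar triangles give: shadow = height / tan(e). On the Moon
    there is no atmospheric refraction to correct for.
    """
    return object_height_m / math.tan(math.radians(sun_elevation_deg))

# Hypothetical values for illustration only: a 1.8 m figure under
# a 15-degree sun elevation.
print(round(shadow_length(1.8, 15.0), 2))  # prints 6.72
```

Comparing a shadow length predicted this way against where the shadow actually falls in a photograph (relative to a fixed landmark such as the flagpole) is the kind of consistency check the paper's geometrical analysis performs.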


Author(s):  
James Deery

Abstract: For some, the states and processes involved in the realisation of phenomenal consciousness are not confined to within the organismic boundaries of the experiencing subject. Instead, the sub-personal basis of perceptual experience can, and does, extend beyond the brain and body to implicate environmental elements through one's interaction with the world. These claims are met by proponents of predictive processing, who propose that perception and imagination should be understood as a product of the same internal mechanisms. On this view, as visually imagining is not considered to be world-involving, it is assumed that world-involvement must not be essential for perception, and thus internalism about the sub-personal basis is true. However, the argument for internalism from the unity of perception and imagination relies for its strength on a questionable conception of the relationship between the two experiential states. I argue that proponents of the predictive approach are guilty of harbouring an implicit commitment to the common kind assumption, which does not follow trivially from their framework: that is, the assumption that perception and imagination are of the same fundamental kind of mental event. I will argue that there are plausible alternative ways of conceiving of this relationship without drawing internalist metaphysical conclusions from their psychological theory. Thus, the internalist owes the debate clarification of this relationship and further argumentation to secure their position.


2015 · Vol 114 (5) · pp. 2637-2648
Author(s): Fabrice Arcizet, Koorosh Mirpour, Daniel J. Foster, Caroline J. Charpentier, James W. Bisley

When looking around at the world, we can only attend to a limited number of locations. The lateral intraparietal area (LIP) is thought to play a role in guiding both covert attention and eye movements. In this study, we tested the involvement of LIP in both mechanisms with a change detection task. In the task, animals had to indicate whether an element changed during a blank in the trial by making a saccade to it. If no element changed, they had to maintain fixation. We examined how the animal's behavior was biased by LIP activity prior to the presentation of the stimulus the animal had to respond to. When the activity was high, the animal was more likely to make an eye movement toward the stimulus, even if there was no change; when the activity was low, the animal either had a slower reaction time or maintained fixation, even if a change occurred. We conclude that LIP activity is involved in both covert and overt attention, but when decisions about eye movements are to be made, this role takes precedence over guiding covert attention.


10.1167/7.6.9 · 2007 · Vol 7 (6) · pp. 9
Author(s): Lore Thaler, James T. Todd, Miriam Spering, Karl R. Gegenfurtner

2006 · Vol 16 (1-2) · pp. 1-22
Author(s): Junko Fukushima, Teppei Akao, Sergei Kurkin, Chris R.S. Kaneko, Kikuro Fukushima

To see clearly when a target is moving slowly, primates with high-acuity foveae use smooth-pursuit and vergence eye movements. The former rotates both eyes in the same direction to track target motion in frontal planes, while the latter rotates the left and right eyes in opposite directions to track target motion in depth. Together, these two systems pursue targets precisely and maintain their images on the foveae of both eyes. During head movements, both systems must interact with the vestibular system to minimize slip of the retinal images. The primate frontal cortex contains two pursuit-related areas: the caudal part of the frontal eye fields (FEF) and the supplementary eye fields (SEF). Evoked potential studies have demonstrated vestibular projections to both areas, and pursuit neurons in both areas respond to vestibular stimulation. The majority of FEF pursuit neurons code parameters of pursuit such as pursuit and vergence eye velocity, gaze velocity, and retinal image motion for target velocity in frontal and depth planes. Moreover, vestibular inputs contribute to the predictive pursuit responses of FEF neurons. In contrast, the majority of SEF pursuit neurons do not code pursuit metrics, and many SEF neurons are reported to be active in more complex tasks. These results suggest that FEF and SEF pursuit neurons are involved in different aspects of vestibular-pursuit interactions and that eye velocity coding of SEF pursuit neurons is specialized for the task condition.


1997 · Vol 17 (20) · pp. 7941-7953
Author(s): M. Concetta Morrone, John Ross, David C. Burr
