An Investigation in the Role of the Visual Target in Stabilometry.

2000 ◽  
Vol 59 (6) ◽  
pp. 568-573 ◽  
Author(s):  
Keiko Soma ◽  
Takanobu Kunihiro ◽  
Akio Yoshida
10.1038/9219 ◽  
1999 ◽  
Vol 2 (6) ◽  
pp. 563-567 ◽  
Author(s):  
M. Desmurget ◽  
C. M. Epstein ◽  
R. S. Turner ◽  
C. Prablanc ◽  
G. E. Alexander ◽  
...  

1993 ◽  
Vol 70 (4) ◽  
pp. 1578-1584 ◽  
Author(s):  
P. DiZio ◽  
C. E. Lathan ◽  
J. R. Lackner

1. In the oculobrachial illusion, a target light attached to the unseen stationary hand is perceived as moving and changing spatial position when illusory motion of the forearm is elicited by brachial muscle vibration. Our goal was to see whether we could induce apparent motion and displacement of two retinally fixed targets in opposite directions by the use of oculobrachial illusions.
2. We vibrated both biceps brachii, generating illusory movements of the two forearms in opposite directions, and measured any associated changes in perceived distance between target lights on the unseen stationary hands. The stability of visual fixation of one of the targets was also measured.
3. The seen distance between the stationary targets increased significantly when vibration induced an illusory increase in felt distance between the hands, both with binocular and monocular viewing.
4. Subjects maintained fixation accuracy equally well during vibration-induced illusory increases in visual target separation and in a no-vibration control condition. Fixation errors were not correlated with the extent or direction of illusory visual separation.
5. These findings indicate that brachial muscle spindle signals can contribute to an independent representation of felt target location in head-centric coordinates that can be interrelated with a visual representation of target location generated by retinal and oculomotor signals.
6. A model of how these representations are interrelated is proposed, and its relation to other intersensory interactions is discussed.


2010 ◽  
Vol 48 (11) ◽  
pp. 3365-3370 ◽  
Author(s):  
Elkan G. Akyürek ◽  
Angela Dinkelbach ◽  
Anna Schubö ◽  
Hermann J. Müller

2016 ◽  
Vol 9 (5) ◽  
Author(s):  
Aleksandra Kroll ◽  
Monika Mak ◽  
Jerzy Samochowiec

Reaction times are often used as an indicator of the efficiency of the processes in the central nervous system. While extensive research has been conducted on the possible response time correlates, the role of eye movements in visual tasks is yet unclear. Here we report data to support the role of eye movements during visual choice reaction time training. Participant performance, reaction times, and total session duration improved. Eye movements showed expected changes in saccade amplitude and resulted in improvement in visual target searching.


Perception ◽  
1997 ◽  
Vol 26 (1_suppl) ◽  
pp. 291-291
Author(s):  
G I Novikov

The role of subcortical levels—the lateral geniculate body (LGB) and superior colliculus (SC) of cat—in the control of foveation eye movements is described by a model based on our own electrophysiological data. These data include the characteristics of eye movements elicited by local electrical microstimulation of neuronal structures in the LGB and the SC. The model represents a multilevel system forming the program of foveation eye movements by performing the following actions in temporal sequence: determination of the position of the visual target in retinotopic coordinates, determination of its craniotopic coordinates, and determination of the direction and velocity of the moving visual target. I discuss algorithms and neuronal mechanisms (including electrophysiological data on single neurons and neuronal populations) of subcortical levels of the cat visual system taking part in foveation eye-movement control for stationary and moving visual objects, as well as the role of directional and orientation properties of receptive fields of subcortical neurons in this control.
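The abstract gives no equations, but the temporal sequence it describes (retinotopic position → craniotopic position → direction and velocity of the moving target) can be illustrated with a minimal sketch. This is not Novikov's model: it assumes simple additive 2-D angular geometry (head-centred angle = eye-in-head angle + retinal eccentricity), and all names (`EyeState`, `retinotopic_to_craniotopic`, `target_velocity`) are hypothetical.

```python
from dataclasses import dataclass
import math


@dataclass
class EyeState:
    """Eye-in-head orientation, in degrees (hypothetical representation)."""
    azimuth: float
    elevation: float


def retinotopic_to_craniotopic(target_ret, eye):
    """Step 1 -> 2: convert retinal eccentricity (deg) to a head-centred
    (craniotopic) angle by adding the current eye-in-head orientation."""
    rx, ry = target_ret
    return (eye.azimuth + rx, eye.elevation + ry)


def target_velocity(p_prev, p_curr, dt):
    """Step 3: direction (deg, counter-clockwise from rightward) and speed
    (deg/s) of the target from two successive craniotopic samples."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    speed = math.hypot(dx, dy) / dt
    direction = math.degrees(math.atan2(dy, dx))
    return direction, speed


# Example: target 5 deg right of the fovea while the eye looks 10 deg right,
# sampled again 100 ms later after target and eye have moved.
p0 = retinotopic_to_craniotopic((5.0, 0.0), EyeState(10.0, 0.0))   # (15.0, 0.0)
p1 = retinotopic_to_craniotopic((5.0, 2.0), EyeState(12.0, 0.0))   # (17.0, 2.0)
direction, speed = target_velocity(p0, p1, dt=0.1)                 # 45.0 deg, ~28.3 deg/s
```

The point of the sketch is only the ordering of the computations: a craniotopic representation must be formed before target direction and velocity can be estimated, which is what makes the eye-in-head signal indispensable to the program.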


Perception ◽  
10.1068/p3126 ◽  
2001 ◽  
Vol 30 (7) ◽  
pp. 795-810 ◽  
Author(s):  
Melanie C Doyle ◽  
Robert J Snowden

Can auditory signals influence the processing of visual information? The present study examined the effects of simple auditory signals (clicks and noise bursts) whose onset was simultaneous with that of the visual target, but which provided no information about the target. It was found that such a signal enhances performance in the visual task: the accessory sound reduced response times for target identification with no cost to accuracy. The spatial location of the sound (whether central to the display or at the target location) did not modify this facilitation. Furthermore, the same pattern of facilitation was evident whether the observer fixated centrally or moved their eyes to the target. The results were not altered by changes in the contrast (and therefore visibility) of the visual stimulus or by the perceived utility of the spatial location of the sound. We speculate that the auditory signal may promote attentional ‘disengagement’ and that, as a result, observers are able to process the visual target sooner when sound accompanies the display relative to when visual information is presented alone.


2016 ◽  
Vol 116 (6) ◽  
pp. 2586-2593 ◽  
Author(s):  
Jing Chen ◽  
Matteo Valsecchi ◽  
Karl R. Gegenfurtner

When human observers track the movements of their own hand with their gaze, the eyes can start moving before the finger (i.e., anticipatory smooth pursuit). The signals driving anticipation could come from motor commands during finger motor execution or from motor intention and decision processes associated with self-initiated movements. For the present study, we built a mechanical device that could move a visual target either in the same direction as the participant's hand or in the opposite direction. Gaze pursuit of the target showed stronger anticipation if it moved in the same direction as the hand compared with the opposite direction, as evidenced by decreased pursuit latency, increased positional lead of the eye relative to target, increased pursuit gain, decreased saccade rate, and decreased delay at the movement reversal. Some degree of anticipation occurred for incongruent pursuit, indicating that there is a role for higher-level movement prediction in pursuit anticipation. The fact that anticipation was larger when target and finger moved in the same direction provides evidence for a direct coupling between finger and eye motor commands.


Vision ◽  
2019 ◽  
Vol 3 (4) ◽  
pp. 49
Author(s):  
Sabine Born

Across saccades, small displacements of a visual target are harder to detect and their directions more difficult to discriminate than during steady fixation. Prominent theories of this effect, known as saccadic suppression of displacement, propose that it is due to a bias to assume object stability across saccades. Recent studies comparing the saccadic effect to masking effects suggest that suppression of displacement is not saccade-specific. Further evidence for this account is presented from two experiments where participants judged the size of displacements on a continuous scale in saccade and mask conditions, with and without blanking. Saccades and masks both reduced the proportion of correctly perceived displacements and increased the proportion of missed displacements. Blanking improved performance in both conditions by reducing the proportion of missed displacements. Thus, if suppression of displacement reflects a bias for stability, it is not a saccade-specific bias, but a more general stability assumption revealed under conditions of impoverished vision. Specifically, I discuss the potentially decisive role of motion or other transient signals for displacement perception. Without transients or motion, the quality of relative position signals is poor, and saccadic and mask-induced suppression of displacement reflects performance when the decision has to be made on these signals alone. Blanking may improve those position signals by providing a transient onset or a longer time to encode the pre-saccadic target position.


1998 ◽  
Vol 87 (1) ◽  
pp. 3-18 ◽  
Author(s):  
Dawn G. Blasko ◽  
Michael D. Hall

Three experiments investigated the role of prosody in the comprehension of auditory sentences. In Exp. 1 an analysis of three novice talkers and one expert talker verified the production parameters of one type of syntactic ambiguity and showed that pitch cues were more prominent than duration cues. In Exp. 2, 16 listeners used prosodic information to make reliably consistent decisions about phrase boundaries. In Exp. 3, 40 participants listened to sentences in which prosody was inconsistent with later morphosyntactic information, indicated their understanding, and then judged whether a visual target was related to the meaning of the sentence. Inconsistent prosody slowed comprehension and contributed to slower, less accurate judgments of sentence meaning. This suggests that prosodic information contributes to the perception of spoken language and can affect comprehension even when the syntactic structure indicated by prosody is contradicted by subsequent morphosyntactic information.

