Evidence for Distinct, Differentially Adaptable Sensorimotor Transformations for Reaches to Visual and Proprioceptive Targets

2007 ◽  
Vol 98 (3) ◽  
pp. 1815-1819 ◽  
Author(s):  
Pierre-Michel Bernier ◽  
Gabriel M. Gauthier ◽  
Jean Blouin

Recent evidence suggests that planning a reaching movement entails similar stages and common networks irrespective of whether the target location is defined through visual or proprioceptive cues. Here we test whether the transformations that convert the sensory information regarding target location into the required motor output are common for both types of reaches. To do so, we adaptively modified these sensorimotor transformations through exposure to displacing prisms and hypothesized that if they are common to both types of reaches, the aftereffects observed for reaches to visual targets would generalize to reaches to a proprioceptive target. Subjects (n = 16) were divided into two groups that differed with respect to the sensory modality of the targets (visual or proprioceptive) used in the pre- and posttests. The adaptation phase was identical for both groups and consisted of movements toward visual targets while wearing 10.5° horizontally displacing prisms. We observed large aftereffects consistent with the magnitude of the prism-induced shift when reaching toward visual targets in the posttest, but no significant aftereffects for movements toward the proprioceptive target. These results provide evidence that distinct, differentially adaptable sensorimotor transformations underlie the planning of reaches to visual and proprioceptive targets.

2021 ◽  
Author(s):  
Muzahid Islam ◽  
Sudhakar Deeti ◽  
Zakia Mahmudah ◽  
J. Frances Kamhi ◽  
Ken Cheng

ABSTRACT
Many animals navigate in structurally complex environments that require them to detour around physical barriers they encounter. While many studies in animal cognition suggest that animals adeptly avoid obstacles, it is unclear whether a new route is learned to navigate around these barriers and, if so, what sensory information is used to do so. We investigated detour learning ability in the Australian bull ant, Myrmecia midas, which primarily uses visual landmarks to navigate. We first placed a barrier on the ants’ natural path to their foraging tree. Initially, 46% of foragers were unsuccessful in detouring around the obstacle. On subsequent trips, the ants became more successful and established a new route. We observed up to eight successful foraging trips detouring around the barrier. When we subsequently changed the position of the barrier, made a new gap in the middle of the obstacle, or removed the barrier altogether, ants mostly maintained their learned motor routine, detouring along a similar path as before, suggesting that foragers were not relying on barrier cues and had therefore learned a new route around the obstacle. In additional trials, when foragers encountered new olfactory or tactile cues, or when the visual environment was blocked, their navigation was profoundly disrupted. These results suggest that changing sensory information, even in modalities that foragers do not usually need for navigation, drastically affects the foragers’ ability to successfully navigate.
Subject Category: Neuroscience and Cognition


2018 ◽  
Vol 119 (5) ◽  
pp. 1981-1992 ◽  
Author(s):  
Laura Mikula ◽  
Valérie Gaveau ◽  
Laure Pisella ◽  
Aarlenne Z. Khan ◽  
Gunnar Blohm

When reaching to an object, information about the target location as well as the initial hand position is required to program the motor plan for the arm. The initial hand position can be determined by proprioceptive information as well as visual information, if available. Bayes-optimal integration posits that we utilize all information available, with greater weighting on the sense that is more reliable, thus generally weighting visual information more than the usually less reliable proprioceptive information. The criterion by which information is weighted has not been explicitly investigated; it has been assumed that the weights are based on task- and effector-dependent sensory reliability requiring an explicit neuronal representation of variability. However, the weights could also be determined implicitly through learned modality-specific integration weights and not on effector-dependent reliability. While the former hypothesis predicts different proprioceptive weights for left and right hands, e.g., due to different reliabilities of dominant vs. nondominant hand proprioception, we would expect the same integration weights if the latter hypothesis was true. We found that the proprioceptive weights for the left and right hands were extremely consistent regardless of differences in sensory variability for the two hands as measured in two separate complementary tasks. Thus we propose that proprioceptive weights during reaching are learned across both hands, with high interindividual range but independent of each hand’s specific proprioceptive variability. NEW & NOTEWORTHY How visual and proprioceptive information about the hand are integrated to plan a reaching movement is still debated. The goal of this study was to clarify how the weights assigned to vision and proprioception during multisensory integration are determined. We found evidence that the integration weights are modality specific rather than based on the sensory reliabilities of the effectors.
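The reliability-weighted integration described above follows the standard Bayes-optimal cue-combination rule, in which each cue is weighted by its inverse variance. A minimal numerical sketch (this is not code from the study; the variances and variable names are illustrative assumptions):

```python
# Sketch of Bayes-optimal (reliability-weighted) cue integration.
# Not code from the study; the variances below are illustrative assumptions.

def integrate(x_vis, var_vis, x_prop, var_prop):
    """Combine visual and proprioceptive hand-position estimates.

    Each cue is weighted by its inverse variance (its reliability),
    so the more reliable cue dominates the integrated estimate.
    """
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_prop)
    w_prop = 1 - w_vis
    x_hat = w_vis * x_vis + w_prop * x_prop
    # The integrated estimate is more reliable than either cue alone:
    var_hat = 1 / (1 / var_vis + 1 / var_prop)
    return x_hat, var_hat, w_vis

# Vision twice as reliable (half the variance) of proprioception:
x_hat, var_hat, w_vis = integrate(x_vis=0.0, var_vis=1.0, x_prop=3.0, var_prop=2.0)
```

The study's contrast is between this effector-dependent rule, where `var_prop` would differ between the left and right hands, and learned modality-specific weights, where `w_vis` is fixed per modality regardless of each hand's measured variability; the authors' data favor the latter.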


2018 ◽  
Vol 5 (2) ◽  
pp. 171785 ◽  
Author(s):  
Martin F. Strube-Bloss ◽  
Wolfgang Rössler

Flowers attract pollinating insects like honeybees by sophisticated compositions of olfactory and visual cues. Using honeybees as a model to study olfactory–visual integration at the neuronal level, we focused on mushroom body (MB) output neurons (MBON). From a neuronal circuit perspective, MBONs represent a prominent level of sensory-modality convergence in the insect brain. We established an experimental design allowing electrophysiological characterization of olfactory, visual, as well as olfactory–visual induced activation of individual MBONs. Despite the obvious convergence of olfactory and visual pathways in the MB, we found numerous unimodal MBONs. However, a substantial proportion of MBONs (32%) responded to both modalities and thus integrated olfactory–visual information across MB input layers. In these neurons, representation of the olfactory–visual compound was significantly increased compared with that of single components, suggesting an additive, but nonlinear integration. Population analyses of olfactory–visual MBONs revealed three categories: (i) olfactory, (ii) visual and (iii) olfactory–visual compound stimuli. Interestingly, no significant differentiation was apparent regarding different stimulus qualities within these categories. We conclude that encoding of stimulus quality within a modality is largely completed at the level of MB input, and information at the MB output is integrated across modalities to efficiently categorize sensory information for downstream behavioural decision processing.


2020 ◽  
Vol 117 (39) ◽  
pp. 24590-24598
Author(s):  
Freek van Ede ◽  
Alexander G. Board ◽  
Anna C. Nobre

Adaptive behavior relies on the selection of relevant sensory information from both the external environment and internal memory representations. In understanding external selection, a classic distinction is made between voluntary (goal-directed) and involuntary (stimulus-driven) guidance of attention. We have developed a task—the anti-retrocue task—to separate and examine voluntary and involuntary guidance of attention to internal representations in visual working memory. We show that both voluntary and involuntary factors influence memory performance but do so in distinct ways. Moreover, by tracking gaze biases linked to attentional focusing in memory, we provide direct evidence for an involuntary “retro-capture” effect whereby external stimuli involuntarily trigger the selection of feature-matching internal representations. We show that stimulus-driven and goal-directed influences compete for selection in memory, and that the balance of this competition—as reflected in oculomotor signatures of internal attention—predicts the quality of ensuing memory-guided behavior. Thus, goal-directed and stimulus-driven factors together determine the fate not only of perception, but also of internal representations in working memory.


2014 ◽  
Vol 112 (9) ◽  
pp. 2290-2301 ◽  
Author(s):  
Jean Blouin ◽  
Anahid H. Saradjian ◽  
Nicolas Lebar ◽  
Alain Guillaume ◽  
Laurence Mouchnino

Behavioral studies have suggested that the brain uses a visual estimate of the hand to plan reaching movements toward visual targets and somatosensory inputs in the case of somatosensory targets. However, neural correlates for distinct coding of the hand according to the sensory modality of the target have not yet been identified. Here we tested the twofold hypothesis that the somatosensory input from the reaching hand is facilitated and inhibited, respectively, when planning movements toward somatosensory (unseen fingers) or visual targets. The weight of the somatosensory inputs was assessed by measuring the amplitude of the somatosensory evoked potential (SEP) resulting from vibration of the reaching finger during movement planning. The target sensory modality had no significant effect on SEP amplitude. However, Spearman's analyses showed significant correlations between the SEPs and reaching errors. When planning movements toward proprioceptive targets without visual feedback of the reaching hand, participants showing the greater SEPs were those who produced the smaller directional errors. Inversely, participants showing the smaller SEPs when planning movements toward visual targets with visual feedback of the reaching hand were those who produced the smaller directional errors. No significant correlation was found between the SEPs and radial or amplitude errors. Our results indicate that the sensory strategy for planning movements is highly flexible among individuals and also for a given sensory context. Most importantly, they provide neural bases for the suggestion that optimization of movement planning requires the target and the reaching hand to both be represented in the same sensory modality.


2002 ◽  
Vol 19 (4) ◽  
pp. 207-219 ◽  
Author(s):  
Eynat Gal ◽  
Murray Dyck ◽  
Anne Passmore

Abstract
This study was designed to test whether there is a functional relationship between sensory stimulation and stereotyped movements (SM). Four children with autism and intellectual disability (according to DSM-IV criteria) who showed stereotyped movements were studied. The Short Sensory Profile was used to define whether a child perceived stimulation within each sensory modality as aversive, attractive, or neutral. The Stereotyped and Self-Injurious Movements Interview was used to identify each child's repetitive movements. Children were then exposed to sensory stimuli that were neutral, aversive or attractive. Results indicate that children: (a) initiate or increase stereotyped movements immediately following the onset of an aversive stimulus, (b) terminate or decrease stereotyped movements following the onset of an attractive stimulus and (c) initiate or increase stereotyped movements during periods of neutral stimulation. We conclude that stereotyped movements are functionally related to sensory stimulation; individuals who frequently engage in stereotyped movements may do so in order to cope with under-stimulation and aversive over-stimulation.


2016 ◽  
Vol 115 (6) ◽  
pp. 3162-3173 ◽  
Author(s):  
Valeria C. Caruso ◽  
Daniel S. Pages ◽  
Marc A. Sommer ◽  
Jennifer M. Groh

Saccadic eye movements can be elicited by more than one type of sensory stimulus. This implies substantial transformations of signals originating in different sense organs as they reach a common motor output pathway. In this study, we compared the prevalence and magnitude of auditory- and visually evoked activity in a structure implicated in oculomotor processing, the primate frontal eye fields (FEF). We recorded from 324 single neurons while 2 monkeys performed delayed saccades to visual or auditory targets. We found that 64% of FEF neurons were active on presentation of auditory targets and 87% were active during auditory-guided saccades, compared with 75 and 84% for visual targets and saccades. As saccade onset approached, the average level of population activity in the FEF became indistinguishable on visual and auditory trials. FEF activity was better correlated with the movement vector than with the target location for both modalities. In summary, the large proportion of auditory-responsive neurons in the FEF, the similarity between visual and auditory activity levels at the time of the saccade, and the strong correlation between the activity and the saccade vector suggest that auditory signals undergo tailoring to match roughly the strength of visual signals present in the FEF, facilitating accessing of a common motor output pathway.


2019 ◽  
Author(s):  
Michael J. Crosse ◽  
John J. Foxe ◽  
Sophie Molholm

Abstract
Children with autism spectrum disorder (ASD) are often impaired in their ability to cope with and process multisensory information, which may contribute to some of the social and communicative deficits that are prevalent in this population. Amelioration of such deficits in adolescence has been observed for ecologically relevant stimuli such as speech. However, it is not yet known whether this recovery generalizes to the processing of nonsocial stimuli such as the more basic beeps and flashes typically used in cognitive neuroscience research. We hypothesize that engagement of different neural processes and lack of environmental exposure to such artificial stimuli lead to protracted developmental trajectories in both neurotypical (NT) individuals and individuals with ASD, thus delaying the age at which we observe this “catch up”. Here, we test this hypothesis using a bisensory detection task, measuring human response times to randomly presented auditory, visual and audiovisual stimuli. By measuring the behavioral gain afforded by an audiovisual signal, we show that the multisensory deficit previously reported in children with ASD recovers in adulthood by the mid-twenties. In addition, we examine the effects of switching between sensory modalities and show that teenagers with ASD incur less of a behavioral cost than their NT peers. Computational modelling reveals that multisensory information interacts according to different rules in children and adults, and that sensory evidence is weighted differently too. In ASD, the weighting of sensory information and the allocation of attention during multisensory processing differ from those of NT individuals. Based on our findings, we propose a theoretical framework of multisensory development in NT and ASD individuals.
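The behavioral gain afforded by a redundant audiovisual signal is commonly quantified by testing the multisensory response-time distribution against the race-model bound (Miller's inequality); whether this matches the authors' exact analysis is an assumption, and the RT values below are invented for illustration:

```python
import numpy as np

# Sketch of a race-model test for multisensory gain (Miller's inequality):
# P(RT <= t | AV) should not exceed P(RT <= t | A) + P(RT <= t | V)
# if the two unisensory signals merely race independently.
# The reaction times (ms) below are invented for illustration.

def ecdf(rts, t):
    """Empirical cumulative probability of having responded by time t."""
    return float(np.mean(np.asarray(rts) <= t))

rt_a = [320, 340, 360, 380, 400]    # auditory-only RTs
rt_v = [310, 330, 350, 370, 390]    # visual-only RTs
rt_av = [260, 280, 300, 320, 340]   # audiovisual RTs

t = 300
bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
gain = ecdf(rt_av, t) - bound
# gain > 0 at some t indicates coactivation beyond the race-model bound.
```

In practice this comparison is made across the full range of t (and across participants), not at a single time point as in this sketch.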


Author(s):  
Léa Caya-Bissonnette

The underlying processes that allow for decision-making have been a question of interest for many neuroscientists. The lateral intraparietal cortex, or LIP, was shown to accumulate context and sensory information to compute a decision variable. The following review presents the work of Kumano, Suda and Uka, who studied the link between context and sensory information during decision-making. To do so, a monkey was trained to associate the color of a fixation dot with one of two tasks. The tasks consisted of indicating either the motion or the depth of the majority of the dots on a screen. The local field potential of LIP neurons was recorded, and the researchers found a role for context during stimulus presentation with regard to decision formation. The results have important implications for mental disorders involving malfunctions in decision processes.
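The decision variable computed in LIP is typically modeled as noisy sensory evidence accumulating to a bound. A minimal drift-diffusion sketch of that idea (parameters are illustrative and not fit to the reviewed study):

```python
import random

# Minimal drift-diffusion sketch of a decision variable accumulating
# noisy sensory evidence toward one of two bounds, as LIP activity is
# often modeled. All parameters are illustrative assumptions.

def decide(drift, bound=1.0, noise=0.1, dt=0.001, seed=1, max_steps=100_000):
    rng = random.Random(seed)
    dv, step = 0.0, 0
    while abs(dv) < bound and step < max_steps:
        # Task context (motion vs. depth) could be modeled as selecting
        # which evidence stream feeds the drift term.
        dv += drift * dt + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
        step += 1
    choice = 1 if dv >= bound else 0
    return choice, step * dt  # chosen bound and decision time (s)

# Strong positive evidence drives the accumulator to the upper bound:
choice, rt = decide(drift=0.8)
```

Varying `drift` trades decision speed against accuracy, which is the standard way such models link stimulus strength to both choice and reaction time.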


2018 ◽  
Author(s):  
Archana Venkataraman ◽  
Natalia Brody ◽  
Preethi Reddi ◽  
Jidong Guo ◽  
Donald Rainnie ◽  
...  

Fear expressed towards threat-associated stimuli is an adaptive behavioral response. In contrast, the generalization of fear responses toward non-threatening cues is maladaptive and a debilitating dimension of trauma- and anxiety-related disorders. Expressing fear to appropriate stimuli and suppressing fear generalization requires integration of relevant sensory information and motor output. While thalamic and sub-thalamic brain regions play important roles in sensorimotor integration, very little is known about the contribution of these regions to the phenomenon of fear generalization. In this study, we sought to determine whether fear generalization could be modulated by the zona incerta (ZI), a sub-thalamic brain region that influences sensory discrimination, defensive responses, and retrieval of fear memories. To do so, we combined differential intensity-based auditory fear conditioning protocols in mice with c-Fos immunohistochemistry and DREADD-based manipulation of neuronal activity in the ZI. c-Fos immunohistochemistry revealed an inverse relationship between ZI activation and fear generalization, with the ZI being less active in animals that generalized fear. In agreement with this relationship, chemogenetic inhibition of the ZI resulted in fear generalization, while chemogenetic activation of the ZI suppressed fear generalization. Furthermore, targeted stimulation of GABAergic cells in the ZI reduced fear generalization. To conclude, our data suggest that stimulation of the ZI could be used to treat fear generalization in the context of trauma- and anxiety-related disorders.

