Task-dependent recurrent dynamics in visual cortex

eLife ◽  
2017 ◽  
Vol 6 ◽  
Author(s):  
Satohiro Tajima ◽  
Kowa Koida ◽  
Chihiro I. Tajima ◽  
Hideyuki Suzuki ◽  
Kazuyuki Aihara ◽  
...  

Abstract
The capacity for flexible sensory-action association in animals has been related to context-dependent attractor dynamics outside the sensory cortices. Here, we report a line of evidence that flexibly modulated attractor dynamics during task switching are already present in the higher visual cortex in macaque monkeys. With a nonlinear decoding approach, we can extract the particular aspect of the neural population response that reflects the task-induced emergence of bistable attractor dynamics in a neural population, which could be obscured by standard unsupervised dimensionality reductions such as PCA. The dynamical modulation selectively increases the information relevant to task demands, indicating that such modulation is beneficial for perceptual decisions. A computational model that features nonlinear recurrent interaction among neurons with a task-dependent background input replicates the key properties observed in the experimental data. These results suggest that the context-dependent attractor dynamics involving the sensory cortex can underlie flexible perceptual abilities.
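The task-induced emergence of bistable attractor dynamics can be illustrated with a minimal one-dimensional rate model. This is a sketch for intuition only, not the authors' actual population model: here the task context is caricatured as a change in recurrent gain, whereas in the paper the modulation enters as a task-dependent background input to a nonlinear recurrent network. With gain above 1 two stable fixed points emerge; with gain below 1 the response stays graded.

```python
import numpy as np

def simulate(r0, gain, inp=0.0, dt=0.01, steps=5000):
    """Euler-integrate the rate equation dr/dt = -r + tanh(gain * r + inp)."""
    r = r0
    for _ in range(steps):
        r += dt * (-r + np.tanh(gain * r + inp))
    return r

# Categorization-like context (gain > 1): two stable fixed points emerge,
# so nearby initial states are pulled to opposite attractors.
pos = simulate(0.5, gain=2.0)
neg = simulate(-0.5, gain=2.0)

# Discrimination-like context (gain < 1): a single fixed point at zero,
# so responses remain graded rather than categorical.
flat = simulate(0.5, gain=0.5)
```

In the bistable regime the two runs settle near the fixed points of r = tanh(2r) (about ±0.96), while in the monostable regime the same initial condition decays to zero.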


2018 ◽  
Author(s):  
Andreea Lazar ◽  
Chris Lewis ◽  
Pascal Fries ◽  
Wolf Singer ◽  
Danko Nikolić

Summary
Sensory exposure alters the response properties of individual neurons in primary sensory cortices. However, it remains unclear how these changes affect stimulus encoding by populations of sensory cells. Here, recording from populations of neurons in cat primary visual cortex, we demonstrate that visual exposure enhances stimulus encoding and discrimination. We find that repeated presentation of brief, high-contrast shapes results in a stereotyped, biphasic population response consisting of a short-latency transient, followed by a late and extended period of reverberatory activity. Visual exposure selectively improves the stimulus specificity of the reverberatory activity, by increasing the magnitude and decreasing the trial-to-trial variability of the neuronal response. Critically, this improved stimulus encoding is distributed across the population and depends on precise temporal coordination. Our findings provide evidence for the existence of an exposure-driven optimization process that enhances the encoding power of neuronal populations in early visual cortex, thus potentially benefiting simple readouts at higher stages of visual processing.
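The two quantities in play here, trial-to-trial variability and stimulus specificity, have standard estimators: the Fano factor and d-prime. The sketch below applies them to made-up spike counts and is not the authors' analysis pipeline.

```python
import numpy as np

def fano(counts):
    # Trial-to-trial variability: spike-count variance over mean.
    return counts.var(ddof=1) / counts.mean()

def dprime(a, b):
    # Discriminability of two stimuli from one neuron's spike counts.
    return (a.mean() - b.mean()) / np.sqrt(0.5 * (a.var(ddof=1) + b.var(ddof=1)))

# Hypothetical late-window ("reverberatory") spike counts, one value per trial.
shape_a = np.array([8.0, 9.0, 10.0, 11.0, 12.0])
shape_b = np.array([14.0, 15.0, 16.0, 17.0, 18.0])

variability = fano(shape_a)            # 2.5 / 10 = 0.25
separation = dprime(shape_b, shape_a)  # 6 / sqrt(2.5)
```

Exposure-driven improvements as described above would show up as a lower Fano factor and a larger d-prime in the late response window.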


eLife ◽  
2016 ◽  
Vol 5 ◽  
Author(s):  
Roberto Vincis ◽  
Alfredo Fontanini

A growing body of literature has demonstrated that primary sensory cortices are not exclusively unimodal, but can respond to stimuli of different sensory modalities. However, several questions concerning the neural representation of cross-modal stimuli remain open. Indeed, it is poorly understood if cross-modal stimuli evoke unique or overlapping representations in a primary sensory cortex and whether learning can modulate these representations. Here we recorded single unit responses to auditory, visual, somatosensory, and olfactory stimuli in the gustatory cortex (GC) of alert rats before and after associative learning. We found that, in untrained rats, the majority of GC neurons were modulated by a single modality. Upon learning, both prevalence of cross-modal responsive neurons and their breadth of tuning increased, leading to a greater overlap of representations. Altogether, our results show that the gustatory cortex represents cross-modal stimuli according to their sensory identity, and that learning changes the overlap of cross-modal representations.
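"Breadth of tuning" and "overlap of representations" can be made concrete with a toy responsiveness matrix. The data and the Jaccard overlap measure below are hypothetical illustrations, not the authors' statistics.

```python
import numpy as np

# Hypothetical responsiveness matrix: rows = GC neurons, columns = modalities
# (auditory, visual, somatosensory, olfactory); True = significant response.
resp = np.array([
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 1],
], dtype=bool)

# Breadth of tuning: average number of modalities driving each neuron.
breadth = resp.sum(axis=1).mean()

def jaccard(a, b):
    # Overlap of two modality representations across the population.
    return (a & b).sum() / (a | b).sum()

overlap_av = jaccard(resp[:, 0], resp[:, 1])  # auditory vs. visual
```

On this account, learning would increase both numbers: more True entries per row (breadth) and more rows shared between columns (overlap).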


2020 ◽  
Vol 30 (8) ◽  
pp. 4662-4676
Author(s):  
Kevin J Monk ◽  
Simon Allard ◽  
Marshall G Hussain Shuler

Abstract The primary sensory cortex has historically been studied as a low-level feature detector, but has more recently been implicated in many higher-level cognitive functions. For instance, after an animal learns that a light predicts water at a fixed delay, neurons in the primary visual cortex (V1) can produce “reward timing activity” (i.e., spike modulation of various forms that relate the interval between the visual stimulus and expected reward). Local manipulations to V1 implicate it as a site of learning reward timing activity (as opposed to simply reporting timing information from another region via feedback input). However, the manner by which V1 then produces these representations is unknown. Here, we combine behavior, in vivo electrophysiology, and optogenetics to investigate the characteristics of and circuit mechanisms underlying V1 reward timing in the head-fixed mouse. We find that reward timing activity is present in mouse V1, that inhibitory interneurons participate in reward timing, and that these representations are consistent with a theorized network architecture. Together, these results deepen our understanding of V1 reward timing and the manner by which it is produced.
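One simple form that "reward timing activity" can take is a trial-averaged firing rate that peaks near the trained cue-reward interval, so the neuron's reported interval can be read off as the peak time. The numbers below are hypothetical, chosen only to illustrate the readout.

```python
import numpy as np

# Hypothetical trial-averaged firing rate (Hz) in 10 ms bins after the cue.
t = np.arange(0.0, 2.0, 0.01)   # seconds after visual stimulus onset
trained_delay = 1.2             # assumed cue-to-reward interval (s)

# Baseline plus a bump of activity centered on the expected reward time.
rate = 5.0 + 20.0 * np.exp(-((t - trained_delay) ** 2) / (2 * 0.1 ** 2))

# The neuron's "reported" interval: time of peak modulation.
reported = t[np.argmax(rate)]
```

Other reported forms (e.g., sustained activity until the expected reward) would need a different readout, but the principle of relating spike modulation to the cue-reward interval is the same.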


2012 ◽  
Vol 25 (0) ◽  
pp. 198
Author(s):  
Manuel R. Mercier ◽  
John J. Foxe ◽  
Ian C. Fiebelkorn ◽  
John S. Butler ◽  
Theodore H. Schwartz ◽  
...  

Investigations have traditionally focused on activity in the sensory cortices as a function of their respective sensory inputs. However, converging evidence from multisensory research has shown that neural activity in a given sensory region can be modulated by stimulation of other so-called ancillary sensory systems. Both electrophysiology and functional imaging support the occurrence of multisensory processing in human sensory cortex based on the latency of multisensory effects and their precise anatomical localization. Still, due to inherent methodological limitations, direct evidence of the precise mechanisms by which multisensory integration occurs within human sensory cortices is lacking. Using intracranial recordings in epileptic patients undergoing presurgical evaluation, we investigated the neurophysiological basis of multisensory integration in visual cortex. Subdural electrical brain activity was recorded while patients performed a simple detection task of randomly ordered Auditory alone (A), Visual alone (V) and Audio–Visual stimuli (AV). We then performed time-frequency analysis: first we investigated each condition separately to evaluate responses compared to baseline, then we indexed multisensory integration using both the maximum criterion model (AV vs. V) and the additive model (AV vs. A+V). Our results show that auditory input significantly modulates neuronal activity in visual cortex by resetting the phase of ongoing oscillatory activity. This in turn leads to multisensory integration when auditory and visual stimuli are simultaneously presented.
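The two integration criteria named above reduce to simple comparisons of evoked responses: the maximum criterion asks whether the audio-visual response exceeds the strongest unisensory response, and the additive model asks whether it deviates from the sum of the unisensory responses. The sketch below applies both to hypothetical per-trial amplitudes; the actual analysis was done on time-frequency representations with appropriate statistics.

```python
import numpy as np

# Hypothetical evoked-response amplitudes per trial (arbitrary units).
A  = np.array([1.0, 1.2, 0.9, 1.1])   # auditory alone
V  = np.array([2.0, 2.2, 1.9, 2.1])   # visual alone
AV = np.array([3.5, 3.6, 3.4, 3.7])   # simultaneous audio-visual

# Maximum criterion model (AV vs. V): multisensory response exceeds
# the strongest unisensory response.
max_criterion = AV.mean() > max(A.mean(), V.mean())

# Additive model (AV vs. A + V): multisensory response exceeds the
# sum of the unisensory responses (superadditivity).
superadditive = AV.mean() > (A.mean() + V.mean())
```

In practice both comparisons are made pointwise across time-frequency bins with correction for multiple comparisons, not on scalar means as here.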


eLife ◽  
2016 ◽  
Vol 5 ◽  
Author(s):  
Janelle MP Pakan ◽  
Scott C Lowe ◽  
Evelyn Dylda ◽  
Sander W Keemink ◽  
Stephen P Currie ◽  
...  

Cortical responses to sensory stimuli are modulated by behavioral state. In the primary visual cortex (V1), visual responses of pyramidal neurons increase during locomotion. This response gain was suggested to be mediated through inhibitory neurons, resulting in the disinhibition of pyramidal neurons. Using in vivo two-photon calcium imaging in layers 2/3 and 4 in mouse V1, we reveal that locomotion increases the activity of vasoactive intestinal peptide (VIP), somatostatin (SST) and parvalbumin (PV)-positive interneurons during visual stimulation, challenging the disinhibition model. In darkness, while most VIP and PV neurons remained locomotion responsive, SST and excitatory neurons were largely non-responsive. Context-dependent locomotion responses were found in each cell type, with the highest proportion among SST neurons. These findings establish that modulation of neuronal activity by locomotion is context-dependent and contest the generality of a disinhibitory circuit for gain control of sensory responses by behavioral state.


2017 ◽  
Author(s):  
Maria C. Dadarlat ◽  
Michael P. Stryker

Abstract
Neurons in mouse primary visual cortex (V1) are selective for particular properties of visual stimuli. Locomotion causes a change in cortical state that leaves their selectivity unchanged but strengthens their responses. Both locomotion and the change in cortical state are initiated by projections from the mesencephalic locomotor region (MLR), the latter through a disinhibitory circuit in V1. The function served by this change in cortical state is unknown. By recording simultaneously from a large number of single neurons in alert mice viewing moving gratings, we investigated the relationship between locomotion and the information contained within the neural population. We found that locomotion improved encoding of visual stimuli in V1 by two mechanisms. First, locomotion-induced increases in firing rates enhanced the mutual information between visual stimuli and single neuron responses over a fixed window of time. Second, stimulus discriminability was improved, even for fixed population firing rates, because of a decrease in noise correlations across the population during locomotion. These two mechanisms contributed differently to improvements in discriminability across cortical layers, with changes in firing rates most important in the upper layers and changes in noise correlations most important in layer V. Together, these changes resulted in a three- to five-fold reduction in the time needed to precisely encode grating direction and orientation. These results support the hypothesis that cortical state shifts during locomotion to accommodate an increased load on the visual system when mice are moving.

Significance Statement
This paper contains three novel findings about the representation of information in neurons within the primary visual cortex of the mouse. First, we show that locomotion reduces by at least a factor of three the time needed for information to accumulate in the visual cortex that allows the distinction of different visual stimuli. Second, we show that the effect of locomotion is to increase information in cells of all layers of the visual cortex. Third, we show that the means by which information is enhanced by locomotion differs between the upper layers, where the major effect is the increase in firing rates, and layer V, where the major effect is the reduction in noise correlations.
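The second mechanism, better discriminability at fixed firing rates through reduced noise correlations, can be sketched analytically for a two-neuron summed readout (illustrative numbers, not values fitted to the data): the signal in the summed response grows with both neurons' mean difference, while positively correlated noise inflates the variance of the sum.

```python
import numpy as np

def pop_dprime(dmu, sigma, rho):
    """Discriminability of a summed readout of two identical neurons.

    dmu:   per-neuron mean response difference between two stimuli
    sigma: per-neuron response standard deviation (noise)
    rho:   noise correlation between the two neurons
    """
    signal = 2 * dmu                        # mean difference of the sum
    noise_var = 2 * sigma**2 * (1 + rho)    # variance of the sum
    return signal / np.sqrt(noise_var)

# Identical firing statistics; only the noise correlation changes.
still = pop_dprime(dmu=1.0, sigma=1.0, rho=0.3)  # stationary
run   = pop_dprime(dmu=1.0, sigma=1.0, rho=0.1)  # locomotion: decorrelated
```

Lowering rho from 0.3 to 0.1 raises d-prime without any change in firing rates, which is the sense in which decorrelation alone improves stimulus discriminability.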

