Audio-Visual Multisensory Integration in Superior Parietal Lobule Revealed by Human Intracranial Recordings

2006 ◽  
Vol 96 (2) ◽  
pp. 721-729 ◽  
Author(s):  
Sophie Molholm ◽  
Pejman Sehatpour ◽  
Ashesh D. Mehta ◽  
Marina Shpaner ◽  
Manuel Gomez-Ramirez ◽  
...  

Intracranial recordings from three human subjects provide the first direct electrophysiological evidence for audio-visual multisensory processing in the human superior parietal lobule (SPL). Auditory and visual sensory inputs project to the same highly localized region of the parietal cortex, with auditory inputs arriving considerably earlier (30 ms) than visual inputs (75 ms). Multisensory integration in this region was assessed by comparing the response to simultaneous audio-visual stimulation with the algebraic sum of the responses to the constituent auditory and visual unisensory conditions. Significant integration effects, with almost identical morphology across the three subjects, began between 120 and 160 ms. These results are discussed in the context of the role of the SPL in supramodal spatial attention and sensory-motor transformations.
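The additive-model test described above (AV vs. A + V) can be made concrete with a short simulation. The waveforms and threshold below are illustrative assumptions, not the study's data; only the timing values (auditory ~30 ms, visual ~75 ms, integration emerging between 120 and 160 ms) come from the abstract.

```python
import numpy as np

fs = 1000                      # sampling rate, Hz
t = np.arange(0, 0.3, 1 / fs)  # a 0-300 ms epoch

def erp(onset, amp, width=0.02):
    """A toy evoked response: a Gaussian bump at `onset` seconds."""
    return amp * np.exp(-((t - onset) ** 2) / (2 * width ** 2))

a = erp(0.030, 1.0)            # auditory response, onset ~30 ms
v = erp(0.075, 0.8)            # visual response, onset ~75 ms

# Simulated AV response: the linear sum plus a superadditive interaction
# centered around 140 ms, mimicking the reported 120-160 ms effect.
interaction = erp(0.140, 0.5, width=0.015)
av = a + v + interaction

# The additive-model test: any deviation of AV from A + V indicates
# nonlinear multisensory integration.
residual = av - (a + v)
sig = np.abs(residual) > 0.1 * residual.max()
onset_ms = t[sig][0] * 1000    # first time point exceeding the toy threshold
```

In real data the same comparison would be run sample-by-sample on trial-averaged ERPs, with significance assessed statistically rather than by the fixed threshold used here.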

Cortex ◽  
2021 ◽  
Vol 135 ◽  
pp. 240-254 ◽  
Author(s):  
A. Banaszkiewicz ◽  
Ł. Bola ◽  
J. Matuszewski ◽  
M. Szczepanik ◽  
B. Kossowski ◽  
...  

2012 ◽  
Vol 71 (4) ◽  
pp. 488-501 ◽  
Author(s):  
Edmund T. Rolls

Complementary neuronal recordings and functional neuroimaging in human subjects show that the primary taste cortex in the anterior insula provides separate and combined representations of the taste, temperature and texture (including fat texture) of food in the mouth, independently of hunger and thus of reward value and pleasantness. One synapse on, in the orbitofrontal cortex (OFC), these sensory inputs are combined by learning, for some neurons, with olfactory and visual inputs; these neurons encode food reward in that they respond to food only when hungry, and in that their activations correlate with subjective pleasantness. Cognitive factors, including word-level descriptions, and attention modulate the representation of the reward value of food in the OFC and in a region to which it projects, the anterior cingulate cortex. Further, there are individual differences in the representation of the reward value of food in the OFC. It is argued that overeating and obesity are related, in many cases, to an increased reward value of the sensory inputs produced by foods, and to their modulation by cognition and attention, which override existing satiety signals. It is proposed that controlling all, rather than just one or several, of these factors that influence food reward and eating may be important in the prevention and treatment of overeating and obesity.


2004 ◽  
Vol 55 (5) ◽  
pp. 749-751 ◽  
Author(s):  
Olivier Felician ◽  
Patricia Romaiguère ◽  
Jean-Luc Anton ◽  
Bruno Nazarian ◽  
Muriel Roth ◽  
...  

PLoS ONE ◽  
2012 ◽  
Vol 7 (10) ◽  
pp. e46619 ◽  
Author(s):  
Joshua A. Granek ◽  
Laure Pisella ◽  
Annabelle Blangero ◽  
Yves Rossetti ◽  
Lauren E. Sergio


2021 ◽  
Vol 12 ◽  
Author(s):  
Hiroaki Mizuhara ◽  
Peter Uhlhaas

The sense of agency is the subjective feeling that one's own actions drive action outcomes. Previous studies have focused primarily on the temporal contingency between actions and sensory inputs as a possible mechanism for the sense of agency; the contribution of the integrity of visual inputs, however, has not been systematically addressed. In the current study, we developed a psychophysical task to examine the role of visual inputs, as well as temporal contingencies, in the sense of agency. Specifically, participants were required to track a target moving along a sinusoidal curve on a computer screen. The visual integrity of sensory inputs was manipulated by gradually occluding the computer cursor, and participants rated their sense of agency on a nine-point Likert scale. Temporal contingency was manipulated by varying the delay between finger movements on a touchpad and cursor movements. The results showed that the sense of agency was influenced by both visual integrity and temporal contingency. These results are discussed in the context of current models proposing that the sense of agency emerges from a comparison of visual inputs with motor commands.
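The comparator account mentioned in the discussion can be sketched in a few lines: an agency signal falls as the observed visual feedback diverges from the feedback predicted from the motor command. The trajectory, the error-to-score mapping, and the occlusion penalty below are toy assumptions standing in for the paper's task, not its analysis.

```python
import numpy as np

fs = 60                                # display refresh rate, Hz
t = np.arange(0, 5, 1 / fs)            # one 5 s tracking trial
motor = np.sin(2 * np.pi * 0.5 * t)    # finger trajectory on the touchpad

def observed_cursor(delay_s, visible_frac):
    """Cursor = delayed copy of the movement, occluded for part of the trial."""
    shift = int(delay_s * fs)
    cursor = np.roll(motor, shift)
    cursor[:shift] = 0.0               # no feedback before the delayed signal arrives
    mask = np.zeros_like(cursor)
    mask[: int(visible_frac * len(cursor))] = 1.0
    return cursor, mask

def agency_score(delay_s, visible_frac):
    """Higher when the prediction (the motor trace) matches visible feedback."""
    cursor, mask = observed_cursor(delay_s, visible_frac)
    err = np.sum(mask * (cursor - motor) ** 2) / max(mask.sum(), 1)
    visibility_penalty = 1.0 - visible_frac   # degraded input lowers the score
    return 1.0 / (1.0 + err + visibility_penalty)

high = agency_score(delay_s=0.0, visible_frac=1.0)     # veridical feedback
delayed = agency_score(delay_s=0.4, visible_frac=1.0)  # temporal-contingency cost
occluded = agency_score(delay_s=0.0, visible_frac=0.3) # visual-integrity cost
```

The point of the sketch is only that both manipulations reduce the same comparator-based score, mirroring the finding that delay and occlusion each lowered agency ratings.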


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Thomas Walther ◽  
Nicolas Diekmann ◽  
Sandhiya Vijayabaskaran ◽  
José R. Donoso ◽  
Denise Manahan-Vaughan ◽  
...  

The context-dependence of extinction learning has been well studied and requires the hippocampus. However, the underlying neural mechanisms are still poorly understood. Using memory-driven reinforcement learning and deep neural networks, we developed a model that learns to navigate autonomously in biologically realistic virtual reality environments based on raw camera inputs alone. Neither is context represented explicitly in our model, nor is context change signaled. We find that memory-intact agents learn distinct context representations, and develop ABA renewal, whereas memory-impaired agents do not. These findings reproduce the behavior of control and hippocampal animals, respectively. We therefore propose that the role of the hippocampus in the context-dependence of extinction learning might stem from its function in episodic-like memory and not in context-representation per se. We conclude that context-dependence can emerge from raw visual inputs.
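The ABA renewal pattern the agents reproduce can be made concrete with a classic Rescorla-Wagner toy model. This is emphatically not the authors' memory-driven deep RL model; it only illustrates the behavioral paradigm: conditioning in context A, extinction in context B, then renewed responding when the cue is tested back in A.

```python
alpha = 0.2  # learning rate

def train(w, features, reward, n_trials):
    """Rescorla-Wagner: a shared prediction error updates all active features."""
    for _ in range(n_trials):
        v = sum(w[f] for f in features)   # summed prediction for the compound
        for f in features:
            w[f] += alpha * (reward - v)
    return w

w = {"cue": 0.0, "ctxA": 0.0, "ctxB": 0.0}
train(w, ["cue", "ctxA"], reward=1.0, n_trials=50)   # acquisition in context A
train(w, ["cue", "ctxB"], reward=0.0, n_trials=50)   # extinction in context B

resp_in_B = w["cue"] + w["ctxB"]   # extinction context: responding stays low
resp_in_A = w["cue"] + w["ctxA"]   # original context: responding "renews"
```

Because extinction drives the context-B weight negative while only partially unlearning the cue, returning to context A uncovers the residual cue association, which is the renewal effect. The paper's contribution is that an analogous pattern emerges from raw visual inputs without any such explicit context features.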


2019 ◽  
Vol 5 (4) ◽  
pp. eaar7633 ◽  
Author(s):  
Edgar E. Galindo-Leon ◽  
Iain Stitt ◽  
Florian Pieper ◽  
Thomas Stieglitz ◽  
Gerhard Engler ◽  
...  

Intrinsically generated patterns of coupled neuronal activity are associated with the dynamics of specific brain states. Sensory inputs are extrinsic factors that can perturb these intrinsic coupling modes, creating a complex scenario in which forthcoming stimuli are processed. Studying this intrinsic-extrinsic interplay is necessary to better understand perceptual integration and selection. Here, we show that this interplay leads to a reconfiguration of functional cortical connectivity that acts as a mechanism to facilitate stimulus processing. Using audiovisual stimulation in anesthetized ferrets, we found that this reconfiguration of coupling modes is context specific, depending on long-term modulation by repetitive sensory inputs. These reconfigured coupling modes lead to changes in latencies and power of local field potential responses that support multisensory integration. Our study demonstrates that this interplay extends across multiple time scales and involves different types of intrinsic coupling. These results suggest a previously unknown large-scale mechanism that facilitates multisensory integration.
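One simple way to quantify the kind of coupling-mode reconfiguration described above is to estimate inter-areal coupling from trial-wise LFP segments before versus after repetitive stimulation. The sketch below uses the phase-locking value (PLV) at a fixed frequency on simulated data; the PLV is a common coupling measure, not necessarily the one the authors used, and all signals and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0, n_trials, n_samp = 500, 10.0, 200, 500  # Hz, Hz, trials, samples (1 s)
t = np.arange(n_samp) / fs

def simulate_trials(phase_jitter):
    """Two channels sharing a 10 Hz rhythm; smaller jitter = tighter coupling."""
    x, y = [], []
    for _ in range(n_trials):
        phi = rng.uniform(0, 2 * np.pi)          # random phase per trial
        dphi = rng.normal(0, phase_jitter)       # inter-channel phase offset
        x.append(np.cos(2 * np.pi * f0 * t + phi))
        y.append(np.cos(2 * np.pi * f0 * t + phi + dphi))
    return np.array(x), np.array(y)

def plv(x, y):
    """Phase-locking value at f0 from each trial's Fourier coefficient."""
    k = np.exp(-2j * np.pi * f0 * t)
    px = np.angle(x @ k)                          # per-trial phase, channel 1
    py = np.angle(y @ k)                          # per-trial phase, channel 2
    return np.abs(np.mean(np.exp(1j * (px - py))))

pre = plv(*simulate_trials(phase_jitter=1.5))     # weak coupling before
post = plv(*simulate_trials(phase_jitter=0.2))    # tighter coupling after
```

In the simulation, "reconfiguration" is reduced to a single jitter parameter; the study's analysis instead tracked multiple intrinsic coupling types across cortical areas and time scales.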


Author(s):  
Zhaoyang Pang ◽  
Andrea Alamia ◽  
Rufin VanRullen

Traveling waves have been studied to characterize the complex spatiotemporal dynamics of the brain. Several studies have suggested that the propagation direction of alpha traveling waves can be task-dependent. For example, a recent EEG study from our group found that forward waves (i.e., occipital to frontal, FW waves) were observed during visual processing, whereas backward waves (i.e., frontal to occipital, BW waves) mostly occurred in the absence of sensory input. These EEG recordings, however, were obtained from different experimental sessions and different groups of subjects. To further examine how wave direction changes between task conditions, 13 participants were tested on a target detection task while EEG was recorded. We alternated visual stimulation (5 s display of visual luminance sequences) and resting state (5 s of black screen) within each trial, allowing us to monitor the moment-to-moment progression of traveling waves. As expected, the direction of alpha waves was closely linked to task conditions. First, FW waves from occipital to frontal regions, absent during rest, emerged as a result of visual processing, while BW waves in the opposite direction dominated in the absence of visual inputs and were reduced (but not eliminated) by external visual inputs. Second, during visual stimulation (but not rest), both waves coexisted on average but were negatively correlated. In summary, we conclude that the functional role of alpha traveling waves is closely related to their propagation direction, with stimulus-evoked FW waves supporting visual processing and spontaneous BW waves more involved in top-down control.
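A generic way to assign a direction to an alpha traveling wave is via the phase gradient along an occipital-to-frontal electrode array: a forward wave shows phase increasingly lagging toward frontal sites, a backward wave the reverse. The sketch below applies this idea to synthetic signals; the array geometry, lag, and gradient method are assumptions, not necessarily the authors' pipeline.

```python
import numpy as np

fs, f0 = 250, 10.0                  # sampling rate and alpha frequency, Hz
t = np.arange(0, 2, 1 / fs)         # 2 s of signal
n_elec = 8                          # electrodes: occipital (0) ... frontal (7)
lag_step = 0.01                     # 10 ms propagation lag per electrode step

def make_wave(direction):
    """direction=+1: forward (occipital leads); -1: backward (frontal leads)."""
    lags = direction * lag_step * np.arange(n_elec)
    return np.array([np.sin(2 * np.pi * f0 * (t - lag)) for lag in lags])

def phase_slope(x):
    """Slope of the unwrapped alpha phase along the array (radians/electrode)."""
    coef = x @ np.exp(-2j * np.pi * f0 * t)   # Fourier coefficient per electrode
    phases = np.unwrap(np.angle(coef))
    return np.polyfit(np.arange(n_elec), phases, 1)[0]

# A forward wave yields a negative phase slope (frontal sites lag behind),
# a backward wave a positive one; the sign classifies the direction.
fw_slope = phase_slope(make_wave(+1))
bw_slope = phase_slope(make_wave(-1))
```

On real EEG the same gradient would be estimated per time window (e.g., from Hilbert or wavelet phases), which is what allows a moment-to-moment readout of FW versus BW dominance within a trial.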


