Eye-movement reinstatement and neural reactivation during mental imagery

2017
Author(s): Michael B. Bone, Marie St-Laurent, Christa Dang, Douglas A. McQuiggan, Jennifer D. Ryan, et al.

Abstract: Half a century ago, Donald Hebb posited that mental imagery is a constructive process that emulates perception. Specifically, Hebb claimed that visual imagery results from the reactivation of neural activity associated with viewing images. He also argued that neural reactivation and imagery benefit from the re-enactment of eye movement patterns that first occurred at viewing (fixation reinstatement). To investigate these claims, we applied multivariate pattern analyses to functional MRI (fMRI) and eye-tracking data collected while healthy human participants repeatedly viewed and visualized complex images. We observed that the specificity of neural reactivation correlated positively with vivid imagery and with memory for stimulus image details. Moreover, neural reactivation correlated positively with fixation reinstatement, meaning that image-specific eye movements accompanied image-specific patterns of brain activity during visualization. These findings support the conception of mental imagery as a simulation of perception, and provide evidence of the supportive role of eye movements in neural reactivation.
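The core quantity here, the specificity of neural reactivation, can be illustrated with a minimal numpy sketch: imagery patterns are correlated with perception patterns, and specificity is the within-image correlation minus the between-image correlation. The simulated arrays and noise level below are assumptions for illustration, not the study's data or pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

n_images, n_voxels = 4, 200

# Simulated voxel patterns evoked by viewing each image (assumption).
perception = rng.standard_normal((n_images, n_voxels))
# Imagery patterns modeled as noisy copies of the viewed-image patterns.
imagery = perception + 1.5 * rng.standard_normal((n_images, n_voxels))

def reactivation_specificity(percept, imag):
    """Within-image minus between-image pattern correlation."""
    # Correlate every imagery pattern with every perception pattern.
    r = np.corrcoef(imag, percept)[:len(imag), len(imag):]
    within = np.diag(r).mean()
    between = r[~np.eye(len(imag), dtype=bool)].mean()
    return within - between

score = reactivation_specificity(perception, imagery)
# A positive score means imagery reactivates image-specific patterns.
```

The same within-minus-between logic applies to fixation reinstatement, with gaze density maps in place of voxel patterns.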

2020
Vol 32 (3)
pp. 527-545
Author(s): Peter Kok, Lindsay I. Rait, Nicholas B. Turk-Browne

Recent work suggests that a key function of the hippocampus is to predict the future. This is thought to depend on its ability to bind inputs over time and space and to retrieve upcoming or missing inputs based on partial cues. In line with this, previous research has revealed prediction-related signals in the hippocampus for complex visual objects, such as fractals and abstract shapes. Implicit in such accounts is that these computations in the hippocampus reflect domain-general processes that apply across different types and modalities of stimuli. An alternative is that the hippocampus plays a more domain-specific role in predictive processing, with the type of stimuli being predicted determining its involvement. To investigate this, we compared hippocampal responses to auditory cues predicting abstract shapes (Experiment 1) versus oriented gratings (Experiment 2). We measured brain activity in male and female human participants using high-resolution fMRI, in combination with inverted encoding models to reconstruct shape and orientation information. Our results revealed that expectations about shape and orientation evoked distinct representations in the hippocampus. For complex shapes, the hippocampus represented which shape was expected, potentially serving as a source of top–down predictions. In contrast, for simple gratings, the hippocampus represented only unexpected orientations, more reminiscent of a prediction error. We discuss several potential explanations for this content-based dissociation in hippocampal function, concluding that the computational role of the hippocampus in predictive processing may depend on the nature and complexity of stimuli.
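Inverted encoding models of the kind used here are fit in two stages: a set of orientation-tuned "channels" is regressed onto voxel responses, and the estimated weights are then inverted to reconstruct channel responses for new data. A minimal numpy sketch on simulated voxels follows; the basis shape, noise level, and weights are assumptions, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(1)

n_channels, n_voxels = 6, 50
centers = np.arange(0, 180, 180 / n_channels)   # channel centers: 0..150 deg

def channel_responses(theta_deg):
    """Half-wave-rectified, narrowly tuned basis over the 0-180 deg space."""
    d = np.deg2rad(theta_deg - centers)
    return np.clip(np.cos(2 * d), 0, None) ** 5

# Simulated training set: voxel responses = channel responses times random
# channel-to-voxel weights (assumption standing in for real fMRI data).
W_true = rng.random((n_voxels, n_channels))
train_oris = np.arange(0, 180, 15)
C_train = np.stack([channel_responses(o) for o in train_oris])
B_train = C_train @ W_true.T + 0.05 * rng.standard_normal((len(train_oris), n_voxels))

# Step 1: estimate channel-to-voxel weights from training data (B = C W').
W_hat, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)   # channels x voxels

# Step 2: invert the model on a held-out trial to reconstruct the channel
# response profile, then read out the orientation at its peak.
theta_test = 60
b_test = channel_responses(theta_test) @ W_true.T
c_hat, *_ = np.linalg.lstsq(W_hat.T, b_test, rcond=None)
decoded = centers[np.argmax(c_hat)]
```

Applied to hippocampal voxels at the time of the auditory cue, a reconstruction like `c_hat` indexes what the region represents about the expected stimulus.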


Science
2012
Vol 337 (6090)
pp. 109-111
Author(s): R. McKell Carter, Daniel L. Bowling, Crystal Reeck, Scott A. Huettel

To make adaptive decisions in a social context, humans must identify relevant agents in the environment, infer their underlying strategies and motivations, and predict their upcoming actions. We used functional magnetic resonance imaging, in conjunction with combinatorial multivariate pattern analysis, to predict human participants’ subsequent decisions in an incentive-compatible poker game. We found that signals from the temporal-parietal junction provided unique information about the nature of the upcoming decision, and that information was specific to decisions against agents who were both social and relevant for future behavior.


2019
Author(s): Sophia M. Shatek, Tijl Grootswagers, Amanda K. Robinson, Thomas A. Carlson

Abstract: Mental imagery is the ability to generate images in the mind in the absence of sensory input. Both perceptual visual processing and internally generated imagery engage large, overlapping networks of brain regions. However, it is unclear whether they are characterized by similar temporal dynamics. Recent magnetoencephalography work has shown that object category information was decodable from brain activity during mental imagery, but the timing was delayed relative to perception. The current study builds on these findings, using electroencephalography to investigate the dynamics of mental imagery. Sixteen participants viewed two images of the Sydney Harbour Bridge and two images of Santa Claus. On each trial, they viewed a sequence of the four images and were asked to imagine one of them, which was cued retroactively by its temporal location in the sequence. Time-resolved multivariate pattern analysis was used to decode the viewed and imagined stimuli. Our results indicate that the dynamics of imagery processes are more variable across, and within, participants compared to perception of physical stimuli. Although category and exemplar information was decodable for viewed stimuli, there were no informative patterns of activity during mental imagery. The current findings suggest stimulus complexity, task design and individual differences may influence the ability to successfully decode imagined images. We discuss the implications of these results for our understanding of the neural processes underlying mental imagery.
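Time-resolved decoding of this kind trains and tests a classifier independently at every timepoint of the epoch, yielding a decoding-accuracy time course. A minimal numpy sketch with a leave-one-out nearest-centroid classifier on simulated epochs (the data, effect onset, and classifier choice are assumptions, not the study's pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)

n_trials, n_channels, n_times = 40, 32, 50
labels = np.repeat([0, 1], n_trials // 2)

# Simulated EEG epochs (trials x channels x time); a class-specific signal
# appears only from sample 20 onward (assumption).
X = rng.standard_normal((n_trials, n_channels, n_times))
effect = rng.standard_normal(n_channels)
X[labels == 1, :, 20:] += effect[:, None]

def decode_timepoint(Xt, y):
    """Leave-one-out nearest-centroid decoding at a single timepoint."""
    hits = 0
    for i in range(len(y)):
        train = np.ones(len(y), bool)
        train[i] = False
        c0 = Xt[train & (y == 0)].mean(0)
        c1 = Xt[train & (y == 1)].mean(0)
        pred = int(np.linalg.norm(Xt[i] - c1) < np.linalg.norm(Xt[i] - c0))
        hits += pred == y[i]
    return hits / len(y)

accuracy = np.array([decode_timepoint(X[:, :, t], labels) for t in range(n_times)])
# Accuracy hovers near chance (0.5) before sample 20, then rises sharply.
```

The study's null imagery result corresponds to an `accuracy` curve that never departs from chance during the imagery period.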


2015
Vol 27 (7)
pp. 1298-1307
Author(s): Yuranny Cabral-Calderin, Carsten Schmidt-Samoa, Melanie Wilke

When our brain is confronted with ambiguous visual stimuli, perception spontaneously alternates between different possible interpretations although the physical stimulus remains the same. Both alpha (8–12 Hz) and gamma (>30 Hz) oscillations have been reported to correlate with such spontaneous perceptual reversals. However, whether these oscillations play a causal role in triggering perceptual switches remains unknown. To address this question, we applied transcranial alternating current stimulation (tACS) over the posterior cortex of healthy human participants to boost alpha and gamma oscillations. At the same time, participants were reporting their percepts of an ambiguous structure-from-motion stimulus. We found that tACS in the gamma band (60 Hz) increased the number of spontaneous perceptual reversals, whereas no significant effect was found for tACS in alpha (10 Hz) and higher gamma (80 Hz) frequencies. Our results suggest a mechanistic role of gamma but not alpha oscillations in the resolution of perceptual ambiguity.


2019
Author(s): Johanna Bergmann, Andrew T. Morgan, Lars Muckli

Abstract: Visual illusions and visual imagery are conscious sensory events that lack a corresponding physical input. But while everyday mental imagery feels distinct from incoming stimulus input, visual illusions, like hallucinations, are under limited volitional control and appear indistinguishable from physical reality. Illusions are thought to arise from lower-level processes within sensory cortices. In contrast, imagery involves a wide network of brain areas that recruit early visual cortices for the sensory representation of the imagined stimulus. Here, we combine laminar fMRI brain imaging with psychophysical methods and multivariate pattern analysis to investigate in human participants how seemingly ‘real’ and imaginary non-physical experiences are processed in primary visual cortex (V1). We find that the content of mental imagery is only decodable in deep layers, whereas illusory content is only decodable at superficial depths. This suggests that feedback to the different layers may serve distinct functions: low-level feedback to superficial layers might be responsible for shaping perception-like experiences, while deep-layer feedback might serve the formation of a more malleable ‘inner’ world, separate from ongoing perception.


2021
Author(s): Lauryn Burleigh, Xinrui Jiang, Steven G Greening

Many symptoms of anxiety and post-traumatic stress disorder are elicited by mental imagery of a conditioned stimulus (CS). Yet, little is known about how visual imagery of CSs interacts with the acquisition of differential fear conditioning. Across three experiments (n1=33, n2=27, n3=26), we observed that healthy human participants acquired differential fear conditioning to both viewed and imagined percepts serving as the conditioned stimuli, as measured via self-reported fear and the skin conductance response (SCR). Additionally, this differential conditioning generalized across CS percept modalities, such that differential conditioning acquired to visual percepts generalized to the corresponding imagined percepts and vice versa. This is novel evidence that perceived and imagined stimuli engage learning processes in very similar ways, and is consistent with the theory that mental imagery is depictive and recruits neural resources shared with visual perception. Our findings also provide new insight into the mechanisms of anxiety and related disorders.
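The dependent measure, differential conditioning, is simply the mean CS+ response minus the mean CS- response within each percept modality. A toy numpy sketch on simulated skin conductance values (all numbers below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

n_trials = 20  # trials per condition (assumption)

# Simulated skin conductance responses in arbitrary units (assumption).
# CS+ was paired with the aversive outcome; CS- never was.
scr = {
    ("viewed", "CS+"): 0.6 + 0.2 * rng.standard_normal(n_trials),
    ("viewed", "CS-"): 0.3 + 0.2 * rng.standard_normal(n_trials),
    ("imagined", "CS+"): 0.5 + 0.2 * rng.standard_normal(n_trials),
    ("imagined", "CS-"): 0.3 + 0.2 * rng.standard_normal(n_trials),
}

def differential_scr(modality):
    """Differential conditioning: mean CS+ minus mean CS- response."""
    return scr[(modality, "CS+")].mean() - scr[(modality, "CS-")].mean()

diff_viewed = differential_scr("viewed")
diff_imagined = differential_scr("imagined")
# Positive scores in both modalities indicate that conditioning
# generalizes across perception and imagery.
```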


2019
Vol 9 (1)
Author(s): Rosario Tomasello, Cora Kim, Felix R. Dreyer, Luigi Grisoni, Friedemann Pulvermüller

Abstract: During everyday social interaction, gestures are a fundamental part of human communication. The communicative pragmatic role of hand gestures and their interaction with spoken language has been documented at the earliest stage of language development, in which two types of indexical gestures are most prominent: the pointing gesture for directing attention to objects and the give-me gesture for making requests. Here we study, in adult human participants, the neurophysiological signatures of gestural-linguistic acts of communicating the pragmatic intentions of naming and requesting by simultaneously presenting written words and gestures. Already at ~150 ms, brain responses diverged between naming and request actions expressed by word-gesture combinations, whereas the same gestures presented in isolation elicited their earliest neurophysiological dissociations significantly later (at ~210 ms). There was an early enhancement of request-evoked brain activity as compared with naming, which was due to sources in the frontocentral cortex, consistent with access to action knowledge in request understanding. In addition, an enhanced N400-like response indicated late semantic integration of gesture-language interaction. The present study demonstrates that word-gesture combinations used to express communicative pragmatic intentions speed up the brain correlates of comprehension processes – compared with gesture-only understanding – thereby calling into question current serial linguistic models viewing pragmatic function decoding at the end of a language comprehension cascade. Instead, information about the social-interactive role of communicative acts is processed instantaneously.
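Dissociation onsets like the ~150 ms divergence reported here are estimated by finding the earliest sustained difference between condition-averaged waveforms. A toy numpy sketch with a fixed-criterion run-length rule standing in for proper cluster-based statistics (simulated data; the criterion and run length are assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)

times = np.arange(-100, 600, 2)     # ms, 2 ms sampling
n_trials = 50
noise_sd = 0.5

def simulate(onset_ms, amp):
    """Toy evoked responses departing from baseline after onset_ms."""
    sig = np.where(times >= onset_ms, amp, 0.0)
    return sig + noise_sd * rng.standard_normal((n_trials, times.size))

naming = simulate(150, 0.0)
request = simulate(150, 1.0)        # request trials diverge at ~150 ms

# Earliest divergence: first timepoint where the condition averages differ
# by more than ~3 SEM for a sustained run of 10 samples (20 ms).
diff = np.abs(request.mean(0) - naming.mean(0))
crit = 3 * noise_sd * np.sqrt(2 / n_trials)
above = (diff > crit).astype(float)
runs = np.convolve(above, np.ones(10), "valid") == 10
onset = times[np.argmax(runs)] if runs.any() else None
```

Real analyses replace the fixed criterion with permutation or cluster-based inference, but the onset logic is the same.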


2022
Vol 119 (1)
e2116616119
Author(s): Moritz M. Nickel, Laura Tiemann, Vanessa D. Hohn, Elisabeth S. May, Cristina Gil Ávila, et al.

The perception of pain is shaped by somatosensory information about threat. However, pain is also influenced by an individual’s expectations. Such expectations can result in clinically relevant modulations and abnormalities of pain. In the brain, sensory information, expectations (predictions), and discrepancies thereof (prediction errors) are signaled by an extended network of brain areas which generate evoked potentials and oscillatory responses at different latencies and frequencies. However, a comprehensive picture of how evoked and oscillatory brain responses signal sensory information, predictions, and prediction errors in the processing of pain is lacking so far. Here, we therefore applied brief painful stimuli to 48 healthy human participants and independently modulated sensory information (stimulus intensity) and expectations of pain intensity while measuring brain activity using electroencephalography (EEG). Pain ratings confirmed that pain intensity was shaped by both sensory information and expectations. In contrast, Bayesian analyses revealed that stimulus-induced EEG responses at different latencies (the N1, N2, and P2 components) and frequencies (alpha, beta, and gamma oscillations) were shaped by sensory information but not by expectations. Expectations, however, shaped alpha and beta oscillations before the painful stimuli. These findings indicate that commonly analyzed EEG responses to painful stimuli are more involved in signaling sensory information than in signaling expectations or mismatches of sensory information and expectations. Moreover, they indicate that the effects of expectations on pain are served by brain mechanisms which differ from those conveying effects of sensory information on pain.
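One common way to quantify evidence of absence for an expectation effect is a BIC approximation to the Bayes factor between regression models with and without the expectation regressor. A minimal numpy sketch on simulated trials (this approximation and all numbers are assumptions, not necessarily the authors' Bayesian analysis):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 192  # trials pooled for illustration (assumption)
intensity = rng.choice([0.0, 1.0], n)    # stimulus intensity (low/high)
expectation = rng.choice([0.0, 1.0], n)  # cued expectation (low/high)

# Simulated EEG amplitude driven by intensity only, mirroring the paper's
# conclusion for stimulus-induced responses (assumption).
amplitude = 2.0 * intensity + rng.standard_normal(n)

def bic(y, X):
    """BIC of an ordinary-least-squares fit with Gaussian errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n_obs, k = X.shape
    return n_obs * np.log(resid @ resid / n_obs) + k * np.log(n_obs)

ones = np.ones(n)
X_base = ones[:, None]
X_int = np.column_stack([ones, intensity])
X_full = np.column_stack([ones, intensity, expectation])

# BIC approximation to the Bayes factor: BF ~ exp((BIC_reduced - BIC_full)/2).
bf_intensity = np.exp((bic(amplitude, X_base) - bic(amplitude, X_int)) / 2)
bf_expectation = np.exp((bic(amplitude, X_int) - bic(amplitude, X_full)) / 2)
# bf_intensity >> 1: strong evidence for an intensity effect.
# bf_expectation < 1: evidence against an expectation effect.
```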


2021
Author(s): Isaac David, Fernando A Barrios

It is now common to approach questions about information representation in the brain using multivariate statistics and machine learning methods. What is less recognized is that, in the process, the capacity for data-driven discovery and functional localization has diminished. This is because multivariate pattern analysis (MVPA) studies tend to restrict themselves to regions of interest and severely filtered data, and sound parameter-mapping inference is lacking. Here, reproducible evidence is presented that a high-dimensional, brain-wide multivariate linear method can better detect and characterize the occurrence of visual and socio-affective states in a task-oriented functional magnetic resonance imaging (fMRI) experiment, in comparison to the classical localizationist correlation analysis. Classification models for a group of human participants and existing rigorous cluster inference methods are used to construct group anatomical-statistical parametric maps, which correspond to the most likely neural correlates of each psychological state. This led to the discovery of a multidimensional pattern of brain activity which reliably encodes the perception of happiness in the visual cortex, cerebellum and some limbic areas. We failed to find similar evidence for sadness and anger. The anatomical consistency of discriminating features across subjects and contrasts despite the high number of dimensions, as well as agreement with the wider literature, suggests that MVPA is a viable tool for full-brain functional neuroanatomical mapping and not just prediction of psychological states. The present work paves the way for future functional brain imaging studies to provide a complementary picture of brain functions (such as emotion), according to their macroscale dynamics.
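The mapping step, turning a brain-wide linear classifier back into an anatomical map, can be sketched as follows: fit one weight per voxel across the whole volume, then reshape the weight vector into brain space and inspect where the discriminative signal concentrates. A toy numpy sketch with a nearest-centroid (difference-of-class-means) classifier on a simulated volume (all shapes and effect sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

vol_shape = (8, 8, 8)                # toy whole-brain voxel grid
n_voxels = int(np.prod(vol_shape))
n_trials = 60
y = np.repeat([0, 1], n_trials // 2)

# Simulated whole-brain data with one informative cluster (assumption).
X = rng.standard_normal((n_trials, n_voxels))
cluster = np.zeros(vol_shape)
cluster[2:4, 2:4, 2:4] = 2.0         # signal carried by a small region
X += np.outer(y, cluster.ravel())

# Nearest-centroid linear classifier: the weight vector is the difference
# of class-mean patterns, one weight per voxel, with no region-of-interest
# restriction or feature filtering.
w = X[y == 1].mean(0) - X[y == 0].mean(0)

# Map the weights back into the volume: an anatomical map of which voxels
# drive the classification, ready for cluster-level inference.
weight_map = w.reshape(vol_shape)
peak = np.unravel_index(np.abs(weight_map).argmax(), vol_shape)
```

In a real analysis, `weight_map` values would be aggregated across subjects and submitted to cluster inference before any anatomical claim is made.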

