Neural tracking of speech mental imagery during rhythmic inner counting

eLife ◽  
2019 ◽  
Vol 8 ◽  
Author(s):  
Lingxi Lu ◽  
Qian Wang ◽  
Jingwei Sheng ◽  
Zhaowei Liu ◽  
Lang Qin ◽  
...  

The subjective inner experience of mental imagery is among the most ubiquitous human experiences in daily life. Elucidating the neural implementation underpinning the dynamic construction of mental imagery is critical to understanding high-order cognitive function in the human brain. Here, we applied a frequency-tagging method to isolate the top-down process of speech mental imagery from bottom-up sensory-driven activities and concurrently tracked the neural processing time scales corresponding to the two processes in human subjects. Notably, by estimating the source of the magnetoencephalography (MEG) signals, we identified isolated brain networks activated at the imagery-rate frequency. In contrast, more extensive brain regions in the auditory temporal cortex were activated at the stimulus-rate frequency. Furthermore, intracranial stereotactic electroencephalogram (sEEG) evidence confirmed the participation of the inferior frontal gyrus in generating speech mental imagery. Our results indicate that a dissociated neural network underlies the dynamic construction of speech mental imagery independent of auditory perception.
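The frequency-tagging logic described above can be illustrated with a minimal spectral read-out on simulated data. This is a sketch under assumed parameters (the sampling rate, a 2 Hz stimulus rate, and a 0.5 Hz imagery rate are hypothetical choices), not the authors' analysis pipeline:

```python
import numpy as np

def frequency_tagged_power(x, fs, freqs_of_interest):
    """Return spectral power at each tagged frequency via the FFT.

    x: 1-D array holding a single MEG/EEG channel.
    fs: sampling rate in Hz.
    freqs_of_interest: frequencies (Hz) to read out, e.g. the
    stimulus rate and the slower imagery rate.
    """
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n
    fft_freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Read power at the bin closest to each tagged frequency.
    return {f: spectrum[np.argmin(np.abs(fft_freqs - f))]
            for f in freqs_of_interest}

# Simulated channel: a 2 Hz "stimulus-rate" rhythm plus a weaker
# 0.5 Hz "imagery-rate" rhythm and broadband noise.
fs, dur = 250, 40
t = np.arange(0, dur, 1.0 / fs)
rng = np.random.default_rng(0)
chan = (np.sin(2 * np.pi * 2.0 * t)
        + 0.5 * np.sin(2 * np.pi * 0.5 * t)
        + 0.2 * rng.standard_normal(t.size))
power = frequency_tagged_power(chan, fs, [0.5, 2.0])
```

Because the tagged rhythms are strictly periodic, their power concentrates in single frequency bins, which is what makes a slow top-down (imagery-rate) response separable from the bottom-up (stimulus-rate) response in the same recording.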

2009 ◽  
Vol 21 (6) ◽  
pp. 1135-1145 ◽  
Author(s):  
Tali Bitan ◽  
Jimmy Cheon ◽  
Dong Lu ◽  
Douglas D. Burman ◽  
James R. Booth

We examined age-related changes in the interactions among brain regions in children performing rhyming judgments on visually presented words. The difficulty of the task was manipulated by including a conflict between task-relevant (phonological) information and task-irrelevant (orthographic) information. The conflicting conditions included pairs of words that rhyme despite having different spelling patterns (jazz–has), or words that do not rhyme despite having similar spelling patterns (pint–mint). These were contrasted with nonconflicting pairs that have similar orthography and phonology (dime–lime) or different orthography and phonology (press–list). Using fMRI, we examined effective connectivity among five left hemisphere regions of interest: fusiform gyrus (FG), inferior frontal gyrus (IFG), intraparietal sulcus (IPS), lateral temporal cortex (LTC), and medial frontal gyrus (MeFG). Age-related increases were observed in the influence of the IFG and FG on the LTC, but only in conflicting conditions. These results reflect a developmental increase in the convergence of bottom–up and top–down information on the LTC. In older children, top–down control processes may selectively enhance the sensitivity of the LTC to bottom–up information from the FG. This may be evident especially in situations that require selective enhancement of task-relevant versus task-irrelevant information. Altogether these results provide direct evidence for a developmental increase in top–down control processes in language processing. The developmental increase in bottom–up processing may be secondary to the enhancement of top–down processes.


2019 ◽  
Vol 30 (3) ◽  
pp. 875-887
Author(s):  
Kai Hwang ◽  
James M Shine ◽  
Dillan Cellier ◽  
Mark D’Esposito

Abstract
Past studies have demonstrated that flexible interactions between brain regions support a wide range of goal-directed behaviors. However, the neural mechanisms that underlie adaptive communication between brain regions are not well understood. In this study, we combined theta-burst transcranial magnetic stimulation (TMS) and functional magnetic resonance imaging to investigate the sources of top-down biasing signals that influence task-evoked functional connectivity. Subjects viewed sequences of images of faces and buildings and were required to detect repetitions (2-back vs. 1-back) of the attended stimuli category (faces or buildings). We found that functional connectivity between ventral temporal cortex and the primary visual cortex (VC) increased during processing of task-relevant stimuli, especially during higher memory loads. Furthermore, the strength of functional connectivity was greater for correct trials. Increases in task-evoked functional connectivity strength were correlated with increases in activity in multiple frontal, parietal, and subcortical (caudate and thalamus) regions. Finally, we found that TMS to superior intraparietal sulcus (IPS), but not to primary somatosensory cortex, decreased task-specific modulation in connectivity patterns between the primary VC and the parahippocampal place area. These findings demonstrate that the human IPS is a source of top-down biasing signals that modulate task-evoked functional connectivity among task-relevant cortical regions.
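Task-evoked functional connectivity of the kind measured here is often quantified as a correlation between regional time series, compared across task conditions. A minimal sketch with simulated data (the region labels, coupling strengths, and series lengths are hypothetical; the study's actual estimator may differ):

```python
import numpy as np

def task_evoked_connectivity(roi_a, roi_b):
    """Functional connectivity as the Pearson correlation between
    two regions' task-period time series."""
    return np.corrcoef(roi_a, roi_b)[0, 1]

rng = np.random.default_rng(3)
n = 1000
shared = rng.standard_normal(n)            # common task-driven signal
# Hypothetical time series: coupling with the shared signal is
# stronger under the high-load (2-back) condition.
v1_low  = 0.4 * shared + rng.standard_normal(n)
vtc_low = 0.4 * shared + rng.standard_normal(n)
v1_hi   = 0.9 * shared + rng.standard_normal(n)
vtc_hi  = 0.9 * shared + rng.standard_normal(n)
fc_low = task_evoked_connectivity(v1_low, vtc_low)
fc_hi  = task_evoked_connectivity(v1_hi, vtc_hi)
```

The comparison `fc_hi > fc_low` mirrors the reported load-dependent increase in VC–ventral temporal coupling; a causal manipulation such as TMS then tests which region drives that modulation.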


2021 ◽  
Author(s):  
Sophie M Hardy ◽  
Ole Jensen ◽  
Linda Wheeldon ◽  
Ali Mazaheri ◽  
Katrien Segaert

Successful sentence comprehension requires the binding, or composition, of multiple words into larger structures to establish meaning. Using magnetoencephalography (MEG), we investigated the neural mechanisms involved in binding of language at the level of syntax, in a task in which contributions from semantics were minimized. Participants were auditorily presented with minimal sentences that required binding (pronoun and pseudo-verb with the corresponding morphological inflection; "she grushes") and wordlists that did not require binding (two pseudo-verbs; "cugged grushes"). Relative to the no binding wordlist condition, we found that syntactic binding in a minimal sentence structure was associated with a modulation in alpha band (8-12 Hz) activity in left-lateralized brain regions. First, in the sentence condition, we observed a significantly smaller increase in alpha power around the presentation of the target word ("grushes") that required binding (-0.05s to 0.1s), which we suggest reflects an expectation of binding to occur. Second, following the presentation of the target word (around 0.15s to 0.25s), during syntactic binding we observed significantly decreased alpha phase-locking between the left inferior frontal gyrus and the left middle/inferior temporal cortex. We suggest that this results from alpha-driven cortical disinhibition serving to increase information transfer between these two brain regions and strengthen the syntax composition neural network. Together, our findings highlight that successful syntax composition is underpinned by the rapid spatiotemporal activation and coordination of language-relevant brain regions, and that alpha band oscillations are critically important in controlling the allocation and transfer of the brain's resources during syntax composition.
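The two alpha-band measures reported here, power and inter-regional phase-locking, can be sketched with standard band-pass filtering and the Hilbert transform. This is an illustrative reimplementation on synthetic signals, not the authors' MEG pipeline (the channel labels, lag, and noise levels are assumptions):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_band_metrics(x1, x2, fs, band=(8.0, 12.0)):
    """Alpha-band power of x1 and phase-locking value (PLV)
    between x1 and x2, via band-pass filtering + Hilbert transform."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    a1 = hilbert(filtfilt(b, a, x1))
    a2 = hilbert(filtfilt(b, a, x2))
    power = np.mean(np.abs(a1) ** 2)        # mean alpha power of x1
    # PLV: consistency of the phase difference across time (0..1).
    plv = np.abs(np.mean(np.exp(1j * (np.angle(a1) - np.angle(a2)))))
    return power, plv

fs = 250
t = np.arange(0, 10, 1.0 / fs)
rng = np.random.default_rng(1)
alpha = np.sin(2 * np.pi * 10 * t)                    # shared 10 Hz rhythm
x1 = alpha + 0.3 * rng.standard_normal(t.size)        # stand-in "IFG" channel
x2 = np.roll(alpha, 5) + 0.3 * rng.standard_normal(t.size)  # "MTG", fixed lag
power, plv = alpha_band_metrics(x1, x2, fs)
```

A consistent phase lag between the two channels yields a PLV near 1; the study's finding is a condition difference in this quantity (lower phase-locking during binding), not its absolute value.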


eLife ◽  
2017 ◽  
Vol 6 ◽  
Author(s):  
Kendrick N Kay ◽  
Jason D Yeatman

The ability to read a page of text or recognize a person's face depends on category-selective visual regions in ventral temporal cortex (VTC). To understand how these regions mediate word and face recognition, it is necessary to characterize how stimuli are represented and how this representation is used in the execution of a cognitive task. Here, we show that the response of a category-selective region in VTC can be computed as the degree to which the low-level properties of the stimulus match a category template. Moreover, we show that during execution of a task, the bottom-up representation is scaled by the intraparietal sulcus (IPS), and that the level of IPS engagement reflects the cognitive demands of the task. These results provide an account of neural processing in VTC in the form of a model that addresses both bottom-up and top-down effects and quantitatively predicts VTC responses.
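The template-matching-plus-gain account described above can be sketched as a cosine match between low-level stimulus features and a category template, multiplicatively scaled by a top-down gain term standing in for IPS engagement. All names and values below are hypothetical illustrations, not the fitted model from the paper:

```python
import numpy as np

def vtc_response(stimulus_features, category_template, ips_gain):
    """Response of a category-selective VTC region, modeled as the
    match between stimulus features and a category template,
    scaled by a top-down gain term."""
    match = np.dot(stimulus_features, category_template) / (
        np.linalg.norm(stimulus_features)
        * np.linalg.norm(category_template))
    return ips_gain * max(match, 0.0)

rng = np.random.default_rng(2)
template = rng.random(64)                       # hypothetical "word" template
word_like = template + 0.1 * rng.standard_normal(64)  # close to the template
face_like = rng.random(64)                      # unrelated feature vector
# Bottom-up: the template match is higher for the matching category.
r_word = vtc_response(word_like, template, ips_gain=1.0)
r_face = vtc_response(face_like, template, ips_gain=1.0)
# Top-down: greater task demands scale the same representation up.
r_word_task = vtc_response(word_like, template, ips_gain=2.0)
```

The two comparisons (`r_word > r_face` and `r_word_task > r_word`) correspond to the bottom-up and top-down effects the model is meant to capture.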


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Seonggyun Han ◽  
Jaehang Shin ◽  
Hyeim Jung ◽  
Jane Ryu ◽  
Habtamu Minassie ◽  
...  

Abstract
Alzheimer’s disease (AD) is a neurodegenerative disorder characterized by complicated biological mechanisms and the complexity of brain tissue. Our understanding of the complicated molecular architecture that contributes to AD progression benefits from comprehensive and systematic investigations with multi-layered molecular and biological data from different brain regions. Since several independent studies have recently generated various omics data from different brain regions of AD patients, multi-omics data integration can be a useful resource for a better comprehensive understanding of AD. Here we present a web platform, ADAS-viewer, that provides researchers with the ability to comprehensively investigate and visualize multi-omics data from multiple brain regions of AD patients. ADAS-viewer offers means to identify functional changes in transcript and exon expression (i.e., alternative splicing) along with associated genetic or epigenetic regulatory effects. Specifically, it integrates genomic, transcriptomic, methylation, and miRNA data collected from seven different brain regions (cerebellum, temporal cortex, dorsolateral prefrontal cortex, frontal pole, inferior frontal gyrus, parahippocampal gyrus, and superior temporal gyrus) across three independent cohort datasets. ADAS-viewer is particularly useful as a web-based application for analyzing and visualizing multi-omics data across multiple brain regions at both transcript and exon level, allowing the identification of candidate biomarkers of Alzheimer’s disease.


Author(s):  
Angela D. Friederici ◽  
Noam Chomsky

This chapter reviews the neural underpinnings of normal language acquisition and asks not only at which age certain milestones in language acquisition are achieved, but also to what extent this achievement depends on the maturation of particular brain structures. In our recent model, the neural basis of the developing language system is described as reflecting two major phases. The available data provide consistent evidence that very early on an infant is able to extract language-relevant information from the acoustic input. The first phase covers the first three years of life, when language processing is largely input-driven and supported by the temporal cortex and the ventral part of the network. A second phase extends beyond age 3, when top-down processes come into play and the left inferior frontal cortex and the dorsal part of the language network are recruited to a larger extent. Development towards full language performance beyond age 3 depends on maturational changes in the gray and white matter. Increased language ability is correlated with an increase in structural and functional connectivity between language-related brain regions in the left hemisphere, namely the inferior frontal gyrus and the posterior superior temporal gyrus/superior temporal sulcus.


2016 ◽  
Author(s):  
Kendrick N. Kay ◽  
Jason D. Yeatman

Summary
The ability to read a page of text or recognize a person’s face depends on category-selective visual regions in ventral temporal cortex (VTC). To understand how these regions mediate word and face recognition, it is necessary to characterize how stimuli are represented and how this representation is used in the execution of a cognitive task. Here, we show that the response of a category-selective region in VTC can be computed as the degree to which the low-level properties of the stimulus match a category template. Moreover, we show that during execution of a task, the bottom-up representation is scaled by the intraparietal sulcus (IPS), and that the level of IPS engagement reflects the cognitive demands of the task. These results provide a unifying account of neural processing in VTC in the form of a model that addresses both bottom-up and top-down effects and quantitatively predicts VTC responses.


2018 ◽  
Vol 29 (8) ◽  
pp. 3232-3240 ◽  
Author(s):  
Jingwei Sheng ◽  
Li Zheng ◽  
Bingjiang Lyu ◽  
Zhehang Cen ◽  
Lang Qin ◽  
...  

Abstract
The hierarchical nature of language requires the human brain to internally parse connected speech and incrementally construct abstract linguistic structures. Recent research revealed multiple neural processing timescales underlying grammar-based configuration of linguistic hierarchies. However, little is known about where in the whole cerebral cortex such temporally scaled neural processes occur. This study used novel magnetoencephalography source imaging techniques combined with a unique language stimulation paradigm to segregate cortical maps synchronized to 3 levels of linguistic units (i.e., words, phrases, and sentences). Notably, distinct ensembles of cortical loci were identified that feature structures at different levels. The superior temporal gyrus was found to be involved in processing all 3 linguistic levels, while distinct ensembles of other brain regions were recruited to encode each linguistic level. Neural activities in the right motor cortex only followed the rhythm of monosyllabic words, which have clear acoustic boundaries, whereas the left anterior temporal lobe and the left inferior frontal gyrus were selectively recruited in processing phrases or sentences. Our results ground the multi-timescale hierarchical neural processing of speech in neuroanatomical reality, with specific sets of cortices responsible for different levels of linguistic units.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Ben Somers ◽  
Christopher J. Long ◽  
Tom Francart

Abstract
The cochlear implant is one of the most successful medical prostheses, allowing deaf and severely hearing-impaired persons to hear again by electrically stimulating the auditory nerve. A trained audiologist adjusts the stimulation settings for good speech understanding, known as “fitting” the implant. This process is based on subjective feedback from the user, making it time-consuming and challenging, especially in paediatric or communication-impaired populations. Furthermore, fittings only happen during infrequent sessions at a clinic, and therefore cannot take into account variable factors that affect the user’s hearing, such as physiological changes and different listening environments. Objective audiometry, in which brain responses evoked by auditory stimulation are collected and analysed, removes the need for active patient participation. However, recording of brain responses still requires expensive equipment that is cumbersome to use. An elegant solution is to record the neural signals using the implant itself. We demonstrate for the first time the recording of continuous electroencephalographic (EEG) signals from the implanted intracochlear electrode array in human subjects, using auditory evoked potentials originating from different brain regions. This was done using a temporary recording set-up with a percutaneous connector used for research purposes. Furthermore, we show that the response morphologies and amplitudes depend crucially on the recording electrode configuration. The integration of an EEG system into cochlear implants paves the way towards chronic neuro-monitoring of hearing-impaired patients in their everyday environment, and neuro-steered hearing prostheses, which can autonomously adjust their output based on neural feedback.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Ryo Kitada ◽  
Jinhwan Kwon ◽  
Ryuichi Doizaki ◽  
Eri Nakagawa ◽  
Tsubasa Tanigawa ◽  
...  

Abstract
Contrary to the assumption of modern linguistics, there is a non-arbitrary association between sound and meaning in sound symbolic words. Neuroimaging studies have suggested a unique contribution of the superior temporal sulcus to the processing of sound symbolism. However, because these findings are limited to the mapping between sound symbolism and visually presented objects, the processing of sound symbolic information may also involve sensory-modality-dependent mechanisms. Here, we conducted a functional magnetic resonance imaging experiment to test whether the brain regions engaged in the tactile processing of object properties are also involved in mapping sound symbolic information onto tactually perceived object properties. Thirty-two healthy subjects performed a matching task in which they judged the congruency between softness perceived by touch and softness associated with sound symbolic words. A congruency effect was observed in the orbitofrontal cortex, inferior frontal gyrus, insula, medial superior frontal gyrus, cingulate gyrus, and cerebellum. This effect in the insula and medial superior frontal gyrus overlapped with softness-related activity measured separately in the same subjects in a tactile experiment. These results indicate that the insula and medial superior frontal gyrus play a role in processing sound symbolic information and relating it to tactile softness information.

