Transcranial magnetic stimulation during British Sign Language production reveals monitoring of discrete linguistic units in left superior parietal lobule

2019 ◽  
Author(s):  
David Vinson ◽  
Neil Fox ◽  
Joseph T. Devlin ◽  
Karen Emmorey ◽  
Gabriella Vigliocco

Abstract Successful human hand and arm movements are typically carried out by combining visual, motoric, and proprioceptive information in planning, initiation, prediction, and control. The superior parietal lobule (SPL) has been argued to play a key role in integrating visual and motoric information, particularly during grasping of objects and other such tasks which prioritise visual information. However, sign language production also engages SPL even though fluent signers do not visually track their hands or fixate on target locations. Does sign language production simply rely on the motoric/proprioceptive processes engaged in visually guided action, or do the unique characteristics of signed languages change these processes? Fifteen fluent British Sign Language users named pictures while we administered transcranial magnetic stimulation (TMS) to left SPL, a control site, or no TMS. TMS to SPL had very specific effects: an increased rate of (sign-based) phonological substitution errors for complex two-handed signs (those requiring hand contact), but TMS did not slow or otherwise impair performance. Thus, TMS decreased the likelihood of detecting or correcting phonological errors during otherwise successful bimanual coordination, but it did not noticeably alter fine movement control. These findings confirm that for fluent signers SPL has adapted to monitor motor plans for discrete hand configurations retrieved from memory as well as more fine-grained aspects of visually guided actions.

2008 ◽  
Author(s):  
David P. Vinson ◽  
Kearsy Cormier ◽  
Tanya Denmark ◽  
Adam Schembri ◽  
Gabriella Vigliocco

2021 ◽  
Vol 11 (2) ◽  
pp. 252
Author(s):  
Fabiano Botta ◽  
Juan Lupiáñez ◽  
Valerio Santangelo ◽  
Elisa Martín-Arévalo

Several studies have shown enhanced performance in change detection tasks when spatial cues indicating the probe’s location are presented after the memory array has disappeared (i.e., retro-cues) compared with spatial cues that are presented simultaneously with the test array (i.e., post-cues). This retro-cue benefit led some authors to propose the existence of two different stores of visual short-term memory: a weak but high-capacity store (fragile memory (FM)) linked to the effect of retro-cues and a robust but low-capacity store (working memory (WM)) linked to the effect of post-cues. The former is thought to be an attention-free system, whereas the latter would strictly depend on selective attention. Nonetheless, this dissociation is under debate, and several authors do not consider retro-cues a proxy for the existence of an independent memory system (e.g., FM). We approached this controversial issue by altering attention-related functions in the right superior parietal lobule (SPL) with transcranial magnetic stimulation (TMS), whose effects were mediated by the integrity of the right superior longitudinal fasciculus (SLF). Specifically, we asked whether TMS on the SPL affected performance with retro-cues vs. post-cues to a similar extent. The results showed that TMS on the SPL, mediated by right SLF-III integrity, modulated the retro-cue benefit, namely a memory-capacity decrease with post-cues but not with retro-cues. These findings have strong implications for the debate on the existence of independent stages of visual short-term memory and for the growing literature showing a key role of the SLF in explaining the variability of TMS effects across participants.


2020 ◽  
Vol 37 (4) ◽  
pp. 571-608
Author(s):  
Diane Brentari ◽  
Laura Horton ◽  
Susan Goldin-Meadow

Abstract Two differences between signed and spoken languages that have been widely discussed in the literature are: the degree to which morphology is expressed simultaneously (rather than sequentially), and the degree to which iconicity is used, particularly in predicates of motion and location, often referred to as classifier predicates. In this paper we analyze a set of properties marking agency and number in four sign languages for their crosslinguistic similarities and differences regarding simultaneity and iconicity. Data from American Sign Language (ASL), Italian Sign Language (LIS), British Sign Language (BSL), and Hong Kong Sign Language (HKSL) are analyzed. We find that iconic, cognitive, phonological, and morphological factors contribute to the distribution of these properties. We conduct two analyses—one of verbs and one of verb phrases. The analysis of classifier verbs shows that, as expected, all four languages exhibit many common formal and iconic properties in the expression of agency and number. The analysis of classifier verb phrases (VPs)—particularly, multiple-verb predicates—reveals (a) that it is grammatical in all four languages to express agency and number within a single verb, but also (b) that there is crosslinguistic variation in expressing agency and number across the four languages. We argue that this variation is motivated by how each language prioritizes, or ranks, several constraints. The rankings can be captured in Optimality Theory. Some constraints in this account, such as a constraint to be redundant, are found in all information systems and might be considered non-linguistic; however, the variation in constraint ranking in verb phrases reveals the grammatical and arbitrary nature of linguistic systems.


Cortex ◽  
2021 ◽  
Vol 135 ◽  
pp. 240-254
Author(s):  
A. Banaszkiewicz ◽  
Ł. Bola ◽  
J. Matuszewski ◽  
M. Szczepanik ◽  
B. Kossowski ◽  
...  

1982 ◽  
Vol 1031 (1) ◽  
pp. 155-178
Author(s):  
James G. Kyle ◽  
Bencie Woll ◽  
Peter Llewellyn-Jones

2016 ◽  
Vol 28 (1) ◽  
pp. 20-40 ◽  
Author(s):  
Velia Cardin ◽  
Eleni Orfanidou ◽  
Lena Kästner ◽  
Jerker Rönnberg ◽  
Bencie Woll ◽  
...  

The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.

