Unconscious integration of multisensory bodily inputs in the peripersonal space shapes bodily self-consciousness

2016 ◽  
Author(s):  
Roy Salomon ◽  
Jean-Paul Noel ◽  
Marta Łukowska ◽  
Nathan Faivre ◽  
Thomas Metzinger ◽  
...  

Abstract Recent studies have highlighted multisensory integration as a key mechanism of self-consciousness. In particular, the integration of bodily signals within the peripersonal space (PPS) underlies the experience of the self in a body we own (self-identification) and that is experienced as occupying a specific location in space (self-location), two main components of bodily self-consciousness (BSC). Experiments investigating the effects of multisensory integration on BSC have typically employed supra-threshold sensory stimuli, neglecting the role of unconscious sensory signals in BSC, as tested in other domains of consciousness research. Here, we used psychophysical techniques to test whether the multisensory integration of bodily stimuli underlying BSC may also occur for multisensory inputs presented below the threshold of conscious perception. Our results indicate that visual stimuli rendered invisible (through continuous flash suppression) boost the processing of tactile stimuli on the body (Exp. 1) and enhance the perception of near-threshold tactile stimuli (Exp. 2), but only once they enter peripersonal space. We then employed unconscious multisensory mechanisms to manipulate BSC. Participants were presented with tactile stimulation on their body and with visual stimuli on a virtual body, seen at a distance, which were either visible or rendered invisible. We report that if visuo-tactile stimulation was synchronized, participants self-identified with the virtual body (Exp. 3) and shifted their self-location toward the virtual body (Exp. 4), even when the visual stimuli were fully invisible. Our results indicate that multisensory inputs, even outside of awareness, are integrated and affect the phenomenological content of self-consciousness, grounding BSC firmly in the field of psychophysical consciousness studies.

2019 ◽  
Vol 31 (8) ◽  
pp. 1155-1172 ◽  
Author(s):  
Jean-Paul Noel ◽  
Andrea Serino ◽  
Mark T. Wallace

The actionable space surrounding the body, referred to as peripersonal space (PPS), has been the subject of significant interest of late within the broader framework of embodied cognition. Neurophysiological and neuroimaging studies have shown the representation of PPS to be built from visuotactile and audiotactile neurons within a frontoparietal network whose activity is modulated by the presence of stimuli in proximity to the body. In contrast to single-unit and fMRI studies, an area of inquiry that has received little attention is the EEG characterization associated with PPS processing. Furthermore, although PPS is encoded by multisensory neurons, to date there has been no EEG study systematically examining neural responses to unisensory and multisensory stimuli as these are presented outside, near, and within the boundary of PPS. Similarly, it remains poorly understood whether multisensory integration is generally more likely at certain spatial locations (e.g., near the body) or whether the cross-modal tactile facilitation that occurs within PPS is simply due to a reduction in the distance between sensory stimuli when close to the body, in line with the spatial principle of multisensory integration. In the current study, to examine the neural dynamics of multisensory processing within and beyond the PPS boundary, we presented auditory, visual, and audiovisual stimuli at various distances relative to participants' reaching limit—an approximation of PPS—while recording continuous high-density EEG. We asked whether multisensory (vs. unisensory) processing varies as a function of stimulus–observer distance. Results demonstrate a significant increase of global field power (i.e., overall strength of response across the entire electrode montage) for stimuli presented at the PPS boundary—an increase that is largest under multisensory (i.e., audiovisual) conditions.
Source localization of the major contributors to this global field power difference suggests neural generators in the intraparietal sulcus and insular cortex, hubs for visuotactile and audiotactile PPS processing. Furthermore, when neural dynamics are examined in more detail, changes in the reliability of evoked potentials at centroparietal electrodes are predictive, on a subject-by-subject basis, of the later changes in estimated current strength at the intraparietal sulcus linked to stimulus proximity to the PPS boundary. Together, these results provide a previously unrealized view into the neural dynamics and temporal code associated with the encoding of nontactile multisensory stimuli around the PPS boundary.
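Global field power, as used above, is conventionally computed (following the standard Lehmann–Skrandies definition) as the spatial standard deviation across all electrodes at each time point. A minimal sketch of that computation; the function name and array shapes are illustrative, not taken from the study:

```python
import numpy as np

def global_field_power(eeg):
    """Global field power (GFP) per time point.

    eeg: array of shape (n_electrodes, n_timepoints) holding evoked
    potentials. Returns a 1-D array of length n_timepoints: the
    standard deviation across the electrode montage at each sample.
    """
    # Re-reference to the mean across electrodes, then take the
    # root-mean-square deviation over the montage.
    v = eeg - eeg.mean(axis=0, keepdims=True)
    return np.sqrt((v ** 2).mean(axis=0))

# A spatially uniform map carries no field structure, so its GFP is zero.
flat = np.ones((64, 100))
assert np.allclose(global_field_power(flat), 0.0)
```

Peaks in this trace mark moments of strong, spatially differentiated scalp fields, which is why it serves as a reference-free index of overall response strength across the montage.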


2018 ◽  
Author(s):  
Fosco Bernasconi ◽  
Jean-Paul Noel ◽  
Hyeong Dong Park ◽  
Nathan Faivre ◽  
Margitta Seeck ◽  
...  

Abstract Interactions with the environment happen through the medium of the body within one's peripersonal space (PPS), the space surrounding the body. Studies in monkeys and humans have highlighted a distributed multisensory cortical network representing the PPS. However, electrophysiological evidence for a multisensory encoding of PPS in humans is lacking. Here, we recorded for the first time intracranial electroencephalography (iEEG) in humans while administering tactile stimulation (T) on the trunk, approaching auditory stimuli (A), and the combination of the two (AT). To map PPS, in AT trials tactile stimulation was delivered when the sound was far from, at an intermediate distance from, or close to the body. We first identified electrodes showing AT multisensory integration (i.e., AT vs. A+T): 19% of the recording electrodes. Among these, we identified those showing a PPS effect (30% of the AT electrodes), i.e., a modulation of the evoked response to AT stimulation as a function of the distance between the sound and the body. For most sites, AT multisensory integration and PPS effects had similar spatiotemporal characteristics, with an early response (~50 ms) in the insular cortex and later responses (~200 ms) in the pre- and post-central gyri. The superior temporal cortex showed a different response pattern, with AT multisensory integration at ~100 ms but no PPS effect. These results, representing the first iEEG delineation of PPS processing in humans, show that PPS processing occurs at neural sites where multisensory integration also occurs, and at similar latencies, suggesting that PPS representation (around the trunk) is based on a spatial modulation of multisensory integration.


2002 ◽  
Vol 13 (4) ◽  
pp. 350-355 ◽  
Author(s):  
Angelo Maravita ◽  
Charles Spence ◽  
Claire Sergent ◽  
Jon Driver

In mirror reflections, visual stimuli in near peripersonal space (e.g., an object in the hand) can project the retinal image of far, extrapersonal stimuli “beyond” the mirror. We studied the interaction of such visual reflections with tactile stimuli in a cross-modal congruency task. We found that visual distractors produce stronger interference on tactile judgments when placed close to the stimulated hand, but observed indirectly as distant mirror reflections, than when directly observed in equivalently distant far space, even when in contact with a dummy hand or someone else's hand in the far location. The stronger visual-tactile interference for the mirror condition implies that near stimuli seen as distant reflections in a mirror view of one's own hands can activate neural networks coding peripersonal space, because these visual stimuli are coded as having a true source near to the body.


1977 ◽  
Vol 66 (1) ◽  
pp. 203-219
Author(s):  
W. J. Heitler ◽  
M. Burrows

A motor programme is described for defensive kicking in the locust which is probably also the programme for jumping. The method of analysis has been to make intracellular recordings from the somata of identified motorneurones which control the metathoracic tibiae while defensive kicks are made in response to tactile stimuli. Three stages are recognized in the programme. (1) Initial flexion of the tibiae results from the low spike threshold of tibial flexor motorneurones to tactile stimulation of the body. (2) Co-contraction of flexor and extensor muscles follows, in which flexor and extensor excitor motorneurones spike at high frequency for 300-600 ms with the tibia flexed, while the extensor muscle develops tension isometrically to the level required for a kick or jump. (3) Trigger activity terminates the co-contraction by inhibiting the flexor excitor motorneurones and simultaneously exciting the flexor inhibitors. This causes relaxation of the flexor muscle and allows the tibiae to extend. If the trigger activity does not occur, the jump or kick is aborted and the tibiae remain flexed.


Author(s):  
Viacheslav Stepanenko ◽  

The article defines the main components of forming the ability to combine various technological methods of laboratory research, a special competence to be mastered by students of the speciality "Technologies of Medical Diagnostics and Treatment" in the course of their training. Combination is presented as the implementation of an action or series of actions aimed at transforming an existing set of objects into a system that meets the requirements of the task. Its scientific-theoretical and practical blocks are described: the scientific-theoretical block consists of scientific approaches and principles and complex combined research methods, while the practical block comprises various methods, techniques and laboratory research technologies. Attention is drawn to the fact that, when training future laboratory assistants, it is important to form their understanding that in diagnosing certain diseases of the body one cannot rely on a single method but must combine various methods and research techniques. The role of synchronous and asynchronous forms of organizing training in forming this ability is indicated, and examples of combining various technological methods of laboratory research are given.


2018 ◽  
Author(s):  
Laura Crucianelli ◽  
Yannis Paloyelis ◽  
Lucia Ricciardi ◽  
Paul M Jenkinson ◽  
Aikaterini Fotopoulou

Abstract Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory bodily ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision or salience of sensorimotor representations is determined by modulators of synaptic gain, such as dopamine, acetylcholine and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present double-blind, placebo-controlled, crossover study (N = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, but only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased the embodied version of the SWI (quantified as weight estimation error). These findings suggest that oxytocin might modulate processes of visuo-tactile multisensory integration by increasing the precision of top-down signals against bottom-up sensory input.
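The Bayesian account invoked above is commonly formalized as precision-weighted cue combination: under Gaussian noise, the optimal fused estimate weights each modality by its inverse variance, so attenuating somatosensory (tactile) precision pulls the percept toward vision, as in the RHI. A generic illustration of that formula; the function name and numbers are made up for the sketch and are not from the study:

```python
def fuse_cues(s_vis, var_vis, s_tac, var_tac):
    """Precision-weighted fusion of a visual and a tactile estimate.

    Each cue is a Gaussian estimate (mean, variance). The maximum-
    likelihood combined estimate weights each mean by its precision
    (1/variance); the fused variance is smaller than either input's.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_tac)
    s_hat = w_vis * s_vis + (1.0 - w_vis) * s_tac
    var_hat = 1.0 / (1.0 / var_vis + 1.0 / var_tac)
    return s_hat, var_hat

# With equal precision the fused estimate sits at the midpoint:
print(fuse_cues(0.0, 1.0, 10.0, 1.0))  # -> (5.0, 0.5)
# Attenuating tactile precision (raising its variance) pulls the
# estimate toward the visual cue -- the direction of the RHI:
print(fuse_cues(0.0, 1.0, 10.0, 4.0))  # ~ (2.0, 0.8)
```

On this reading, a neuromodulator that lowers somatosensory gain effectively raises `var_tac`, shifting the fused position estimate toward the seen (rubber) hand.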


2020 ◽  
Vol 14 ◽  
Author(s):  
Daniela Rabellino ◽  
Paul A. Frewen ◽  
Margaret C. McKinnon ◽  
Ruth A. Lanius

Peripersonal space (PPS) is defined as the space surrounding the body where we can reach or be reached by external entities, including objects or other individuals. PPS is an essential component of bodily self-consciousness that allows us to perform actions in the world (e.g., grasping and manipulating objects) and to protect our body while interacting with the surrounding environment. Multisensory processing plays a critical role in PPS representation, helping us not only to situate ourselves in space but also to localize external entities at a close distance from our bodies. Such abilities appear especially crucial when an external entity (a sound, an object, or a person) is approaching us, allowing us to assess the salience of a potential incoming threat. Accordingly, PPS represents a key aspect of the social cognitive processes operating when we interact with other people (for example, in a dynamic dyad). The underpinnings of PPS have been investigated largely in humans and in animal models and include the operation of dedicated multimodal neurons (neurons that respond specifically to co-occurring stimuli from different perceptual modalities, e.g., auditory and tactile stimuli) within brain regions involved in sensorimotor processing (ventral intraparietal sulcus, ventral premotor cortex), interoception (insula), and visual recognition (lateral occipital cortex). Although the defensive role of the PPS has been observed in psychopathology (e.g., in phobias), the relation between PPS and altered states of bodily consciousness remains largely unexplored. Specifically, PPS representation in trauma-related disorders, where altered states of consciousness can involve dissociation from the body and its surroundings, has not been investigated. Accordingly, we here: (1) review the behavioral and neurobiological literature on trauma-related disorders and its relevance to PPS; and (2) outline future research directions aimed at examining altered states of bodily self-consciousness in trauma-related disorders.


2020 ◽  
Vol 238 (12) ◽  
pp. 2865-2875
Author(s):  
Fabrizio Leo ◽  
Sara Nataletti ◽  
Luca Brayda

Abstract Vision of the body has been reported to improve tactile acuity even when vision is not informative about the actual tactile stimulation. However, it is currently unclear whether this effect is limited to body parts such as the hand, forearm or foot that can normally be viewed, or whether it also generalizes to body locations, such as the shoulder, that are rarely before our own eyes. In this study, subjects consecutively performed a detection threshold task and a numerosity judgment task with tactile stimuli on the shoulder. Meanwhile, they watched either a real-time video showing their shoulder or, as a control condition, simply a fixation cross. We show that non-informative vision improves tactile numerosity judgment, which might involve tactile acuity, but not tactile sensitivity. Furthermore, the vision-modulated improvement in tactile accuracy seems to be due to an enhanced ability to discriminate the number of adjacent active electrodes. These results are consistent with the view that bimodal visuotactile neurons sharpen tactile receptive fields in an early somatosensory map, probably via top-down modulation of lateral inhibition.
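The distinction drawn above between accuracy and tactile sensitivity is usually made with signal detection theory, where the sensitivity index d′ separates detectability from response bias. A generic sketch of that computation (not the study's own analysis code; the log-linear correction is one common convention):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' from a yes/no detection task.

    d' = z(hit rate) - z(false-alarm rate). The log-linear correction
    (add 0.5 to each cell) keeps the z-transform finite when a rate
    would otherwise be exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Chance performance (hit rate equals false-alarm rate) gives d' = 0;
# a visual manipulation that changed only bias would leave d' unchanged.
assert d_prime(25, 25, 25, 25) == 0.0
```

Because d′ is bias-free, a null effect of vision on this measure, alongside improved numerosity judgments, points to a change in spatial discrimination rather than in raw detectability.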


2019 ◽  
Author(s):  
Klaudia Grechuta ◽  
Javier De La Torre ◽  
Belén Rubio Ballester ◽  
Paul F.M.J. Verschure

Abstract The unique ability to identify one's own body and experience it as one's own is fundamental to goal-oriented behavior and survival. However, the mechanisms underlying body ownership are not yet fully understood. The plasticity of body ownership has been studied using two experimental methods or their variations: the Rubber Hand Illusion (RHI), in which tactile stimuli are externally generated, and the moving RHI, which involves self-initiated movements. Grounded in these paradigms, evidence has demonstrated that body ownership is a product of bottom-up reception of self- and externally-generated multisensory information and top-down comparison between the predicted and the actual sensory stimuli. Crucially, given the design of the current paradigms, where one of the manipulated cues always involves a proximal modality sensing the body or its surface (e.g., touch), the contribution of sensory signals pertaining to the environment remains elusive. Here we propose that, like any robust percept, body ownership depends on the integration and prediction of all available sensory stimuli, and will therefore depend on the consistency of purely distal sensory signals pertaining to the environment. To test this hypothesis, we created an embodied goal-oriented task and manipulated the predictability of the surrounding environment by changing the congruency of purely distal multisensory cues while keeping bodily and action-driven signals entirely predictable. Our results empirically reveal that the way we represent our body is contingent upon all sensory stimuli, including purely distal and action-independent signals pertaining to the environment.


2019 ◽  
Vol 30 (10) ◽  
pp. 1522-1532 ◽  
Author(s):  
Tomohiro Amemiya ◽  
Yasushi Ikei ◽  
Michiteru Kitazaki

The limited space immediately surrounding our body, known as peripersonal space (PPS), has been investigated by focusing on changes in the multisensory processing of audio-tactile stimuli occurring within or outside the PPS. Some studies have reported that the PPS representation is extended by body actions such as walking. However, it is unclear whether the PPS changes when a walking-like sensation is induced but the body neither moves nor is forced to move. Here, we show that a rhythmic pattern consisting of walking-sound vibrations applied to the soles of the feet, but not the forearms, boosted tactile processing when looming sounds were located near the body. The findings suggest that an extension of the PPS representation can be triggered by stimulating the soles in the absence of body action, which may automatically drive a motor program for walking, leading to a change in spatial cognition around the body.

