Embodied precision: Intranasal oxytocin modulates multisensory integration

2019 ◽  
Vol 31 (4) ◽  
pp. 592-606 ◽  
Author(s):  
Laura Crucianelli ◽  
Yannis Paloyelis ◽  
Lucia Ricciardi ◽  
Paul M. Jenkinson ◽  
Aikaterini Fotopoulou

Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size–weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present double-blind, placebo-controlled, crossover study (n = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, but only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin might modulate processes of visuotactile multisensory integration by increasing the precision of top-down signals against bottom-up sensory input.
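As a rough illustration of the precision-weighting account invoked here, the sketch below fuses a top-down (prior) estimate with bottom-up somatosensory evidence as two Gaussians; raising the top-down precision pulls the felt hand position toward the rubber hand, mirroring the proposed IN-OT effect. All values are illustrative and not drawn from the study.

```python
# Minimal sketch: precision-weighted fusion of a top-down prior with
# bottom-up somatosensory evidence, as in Bayesian accounts of the RHI.
# All numbers are illustrative, not taken from the study.

def posterior_mean(mu_prior, prec_prior, mu_sense, prec_sense):
    """Posterior mean for two Gaussian sources (conjugate update)."""
    return (prec_prior * mu_prior + prec_sense * mu_sense) / (prec_prior + prec_sense)

mu_visual, mu_proprio = 0.0, 10.0  # rubber hand at 0 cm, real hand at 10 cm
prec_sense = 1.0                   # bottom-up somatosensory precision

for prec_prior in (0.5, 1.0, 2.0):  # hypothetical IN-OT-driven increase
    mu = posterior_mean(mu_visual, prec_prior, mu_proprio, prec_sense)
    print(f"top-down precision {prec_prior:.1f} -> felt hand position {mu:.1f} cm")
# Higher top-down precision pulls the felt position toward the rubber hand.
```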


2018 ◽  
Author(s):  
Piotr Litwin

Human body sense is surprisingly flexible – precisely administered multisensory stimulation may result in the illusion that an external object is part of one's body. There seems to be a general consensus that there are certain top-down constraints on which objects may be incorporated: in particular, to-be-embodied objects should be structurally similar to a visual representation stored in an internal body model for a shift in one's body image to occur. However, empirical evidence contradicts the body model hypothesis: the sense of ownership may be spread over objects strikingly distinct in morphology and structure (e.g., robotic arms or empty space), and direct empirical support for the theory is currently lacking. As an alternative, based on the example of the rubber hand illusion (RHI), I propose a multisensory integration account of how the sense of ownership is induced. In this account, the perception of one's own body is a regular type of multisensory perception, and multisensory integration processes are not only necessary but also sufficient for embodiment. In this paper, I propose how the RHI can be modeled using Maximum Likelihood Estimation and natural correlation rules. I also discuss how Bayesian Coupling Priors and idiosyncrasies in sensory processing render prior distributions interindividually variable, accounting for large interindividual differences in susceptibility to the RHI. Taken together, the proposed model accounts for the exceptional malleability of human body perception, fortifies existing bottom-up multisensory integration theories with top-down models of the relatedness of sensory cues, and generates testable and disambiguating predictions.
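The following toy sketch illustrates the two ingredients the paper names, under assumed Gaussian cues: maximum-likelihood (reliability-weighted) fusion of visual and proprioceptive estimates, plus a scalar coupling parameter loosely standing in for a Bayesian coupling prior that scales how strongly the cues are bound. It is a stand-in for the full model, not Litwin's implementation.

```python
# Toy stand-in: MLE (reliability-weighted) cue fusion plus a scalar
# "coupling" parameter playing the role of a Bayesian coupling prior.
# Gaussian cue assumptions; all values are invented.

def mle_fuse(x_v, var_v, x_p, var_p):
    """Maximum-likelihood combination of visual and proprioceptive cues."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_p)
    return w_v * x_v + (1 - w_v) * x_p

def coupled_estimate(x_v, var_v, x_p, var_p, coupling):
    """coupling in [0, 1]: 0 = cues kept separate, 1 = full fusion.
    Interindividual differences in the coupling prior would act here."""
    fused = mle_fuse(x_v, var_v, x_p, var_p)
    return coupling * fused + (1 - coupling) * x_p  # proprioceptive report

x_vision, x_proprio = 0.0, 12.0  # cm; visuo-proprioceptive conflict as in the RHI
for c in (0.2, 0.6, 1.0):        # weak to strong coupling
    est = coupled_estimate(x_vision, 4.0, x_proprio, 9.0, c)
    print(f"coupling {c:.1f} -> reported hand position {est:.1f} cm")
```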


Author(s):  
Roland Pfister ◽  
Annika L. Klaffehn ◽  
Andreas Kalckert ◽  
Wilfried Kunde ◽  
David Dignath

Body representations are readily expanded based on sensorimotor experience. A dynamic view of body representations, however, holds that these representations can not only be expanded but also narrowed down by disembodying elements of the body representation that are no longer warranted. Here we induced illusory ownership using a moving rubber hand illusion and studied the maintenance of this illusion across different conditions. We observed ownership experience to decrease gradually unless participants continued to receive confirmatory multisensory input. Moreover, a single instance of multisensory mismatch – a hammer striking the rubber hand but not the real hand – triggered substantial and immediate disembodiment. Together, these findings support and extend previous theoretical efforts to model body representations through basic mechanisms of multisensory integration. They further support an updating model suggesting that embodied entities fade from the body representation if they are not refreshed continuously.
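A toy version of the updating model described here might look as follows: ownership decays each step unless refreshed by confirmatory input, and a single visuo-tactile mismatch collapses it. The decay, gain, and drop rates are invented for illustration.

```python
# Toy updating model matching the description above: ownership decays unless
# refreshed by confirmatory input; one visuo-tactile mismatch collapses it.
# Rates are invented for illustration.

def update_ownership(level, event, decay=0.95, gain=0.3, mismatch_drop=0.8):
    if event == "mismatch":          # hammer strikes the rubber hand only
        return level * (1 - mismatch_drop)
    level *= decay                   # gradual fading without input
    if event == "confirm":           # synchronous visuo-tactile stroke
        level += gain * (1 - level)  # refresh toward full embodiment
    return level

level = 1.0
for t, event in enumerate(["confirm", None, None, None, "mismatch", "confirm"]):
    level = update_ownership(level, event)
    print(f"t={t} event={event} ownership={level:.2f}")
```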


2015 ◽  
Vol 27 (3) ◽  
pp. 573-582 ◽  
Author(s):  
Daniel Zeller ◽  
Vladimir Litvak ◽  
Karl J. Friston ◽  
Joseph Classen

The rubber hand illusion (RHI) paradigm—in which illusory bodily ownership is induced by synchronous tactile stimulation of a participant's (hidden) hand and a (visible) surrogate—allows one to investigate how the brain resolves conflicting multisensory evidence during perceptual inference. To identify the functional anatomy of the RHI, we used multichannel EEG, acquired under three conditions of tactile stimulation. Evoked potentials were averaged from EEG signals registered to the timing of brushstrokes to the participant's hand. The participant's hand was stroked either in the absence of an artificial hand (REAL) or synchronously with an artificial hand, which lay in either an anatomically plausible (CONGRUENT) or impossible (INCONGRUENT) position. The illusion was reliably elicited in the CONGRUENT condition. For right-hand stimulation, significant differences between conditions emerged at the sensor level around 55 msec after the brushstroke at left frontal and right parietal electrodes. Response amplitudes were smaller for the illusory (CONGRUENT) compared with the nonillusory (INCONGRUENT and REAL) conditions in the contralateral perirolandic region (pre- and postcentral gyri) and the superior and inferior parietal lobules, whereas veridical perception of the artificial hand (INCONGRUENT) amplified responses at a scalp region overlying the contralateral postcentral gyrus and inferior parietal lobule compared with the remaining two conditions. Left-hand stimulation produced similar contralateral patterns. These results are consistent with predictive coding models of multisensory integration and may reflect the attenuation of somatosensory precision that is required to resolve perceptual hypotheses about conflicting multisensory input.
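On the predictive-coding reading offered here, the reduced evoked amplitudes can be pictured as a precision-weighted prediction error that shrinks when somatosensory precision is attenuated. The sketch below is purely illustrative, with made-up values rather than the EEG data.

```python
# Illustrative reading of the amplitude result: treat the evoked response as
# scaling with precision-weighted prediction error; attenuating somatosensory
# precision in the illusion condition shrinks it. Values are made up.

def weighted_prediction_error(observed, predicted, precision):
    return precision * (observed - predicted)

touch_felt, touch_predicted = 1.0, 0.6  # arbitrary units
for label, prec in [("REAL / INCONGRUENT", 1.0), ("CONGRUENT (illusion)", 0.4)]:
    pe = weighted_prediction_error(touch_felt, touch_predicted, prec)
    print(f"{label}: weighted prediction error {pe:.2f}")
```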


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
M. Hide ◽  
Y. Ito ◽  
N. Kuroda ◽  
M. Kanda ◽  
W. Teramoto

This study investigates how multisensory integration in body perception changes with increasing age, and whether it is associated with older adults' risk of falling. For this, the rubber hand illusion (RHI) and rubber foot illusion (RFI) were used. Twenty-eight community-dwelling older adults and 25 university students were recruited. They viewed a rubber hand or foot that was stimulated in synchrony or asynchrony with their own hidden hand or foot. The illusion was assessed using a questionnaire and by measuring proprioceptive drift and latency. The Timed Up and Go Test was used to classify the older adults into lower and higher fall-risk groups. No difference was observed in the RHI between the younger and older adults. However, several differences were observed in the RFI. Specifically, the older adults with a lower fall risk hardly experienced the illusion, whereas those with a higher fall risk experienced it with a shorter latency and no less strongly than the younger adults. These results suggest that in older adults, the mechanism of multisensory integration for constructing body perception can change depending on the stimulated body part, and that the risk of falling is associated with multisensory integration.
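For reference, proprioceptive drift in such paradigms is typically computed as the post-stimulation minus pre-stimulation pointing error; the numbers below are invented to show the calculation only, not the study's measurements.

```python
# Sketch of the standard proprioceptive-drift measure: post- minus
# pre-stimulation pointing error, positive = shift toward the fake limb.
# The numbers are invented to show the calculation only.
import numpy as np

pre  = np.array([0.4, 0.1, -0.2, 0.3])  # cm, baseline pointing errors
post = np.array([1.6, 1.2,  0.8, 1.4])  # cm, after synchronous stroking

drift = post - pre
print(f"mean proprioceptive drift: {drift.mean():.2f} cm toward the fake limb")
```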


2016 ◽  
Author(s):  
Zane Z. Zheng ◽  
Kevin G. Munhall ◽  
Ingrid S. Johnsrude

Body schema, or the multimodal representation of one's own body attributes, has been demonstrated previously to be malleable. In the rubber hand illusion (Botvinick & Cohen, 1998), synchronous visual and tactile stimulation causes a fake hand to be perceived as one's own. Similarly, if a stranger's voice is heard synchronously with one's own vocal production, that voice comes to be attributed to oneself (Zheng et al., 2011). Multimodal illusions like these involve distorting the body schema based on correlated input, yet the degree to which different instances of distortion are perceived within the same individuals has never been examined. Here we show that participants took ownership of a fake hand and a stranger's voice to a similar degree, controlling both for individual suggestibility and for general susceptibility to illusions of body schema. Our findings suggest that the perceptual inference that leads to the distortion of body schema is a stable trait.
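The analysis logic, correlating susceptibility to the two illusions across participants while controlling for trait suggestibility, can be sketched as a partial correlation via residualization; the data below are random stand-ins, not the study's measurements.

```python
# Sketch of the analysis logic: correlate hand- and voice-ownership scores
# across participants after regressing out trait suggestibility (a partial
# correlation via residualization). Data are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 40
suggestibility = rng.normal(size=n)
hand_score  = 0.6 * suggestibility + rng.normal(size=n)
voice_score = 0.6 * suggestibility + rng.normal(size=n)

def residualize(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

r = np.corrcoef(residualize(hand_score, suggestibility),
                residualize(voice_score, suggestibility))[0, 1]
print(f"partial correlation, suggestibility controlled: r = {r:.2f}")
```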


2016 ◽  
Vol 33 (1) ◽  
pp. 180 ◽
Author(s):  
Thiago Gomes de Castro ◽  
Marcelle Matiazo Pinhatti ◽  
Clarissa Pinto Pizarro de Freitas ◽  
William Barbosa Gomes

Research has emphasized that the body's position in space and patterns of visual search for stimuli are crucial variables in explaining the ability to estimate distances numerically. In this paper, we tested the hypothesis that proprioceptive recalibration interferes with the ability to numerically estimate fixed peripersonal space. The Rubber Hand Illusion (RHI) experimental paradigm was applied as a tool to temporarily manipulate the sense of proprioception in the participant's right hand. Seventeen college students were asked to estimate fixed horizontal spatial cues before and after two conditions of tactile stimulation within the RHI (synchronous versus asynchronous stroking). Results showed that proprioceptive recalibration of the hand was temporarily induced by both stroking patterns. However, effects on numerical estimates of fixed horizontal cues toward the body midline were consistently observed only in the synchronous stroking condition. These findings suggest that numerical estimates of fixed peripersonal cues are strongly associated with proprioceptive recalibration, corroborating the literature on multisensory integration in perception.
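The pre/post, synchronous-versus-asynchronous design can be sketched as a paired comparison per condition; all numbers below are invented for illustration, not the study's data.

```python
# Illustrative sketch of the pre/post design: distance estimates before and
# after stroking, compared per condition with a paired t-test. All numbers
# are invented; they are not the study's data.
import numpy as np
from scipy import stats

estimates = {
    "synchronous":  {"pre": np.array([30.1, 29.5, 31.0, 30.4]),
                     "post": np.array([27.8, 27.2, 28.5, 28.0])},
    "asynchronous": {"pre": np.array([30.0, 29.8, 30.6, 30.2]),
                     "post": np.array([29.7, 29.9, 30.1, 30.0])},
}
for cond, d in estimates.items():
    t_stat, p = stats.ttest_rel(d["pre"], d["post"])
    shift = np.mean(d["post"] - d["pre"])
    print(f"{cond}: mean shift {shift:+.2f} cm, p = {p:.3f}")
```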


2016 ◽  
Vol 2016 ◽  
pp. 1-9 ◽  
Author(s):  
Woong Choi ◽  
Liang Li ◽  
Satoru Satoh ◽  
Kozaburo Hachimura

Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Studies of the Rubber Hand and Virtual Hand Illusions have shown that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. We constructed a virtual xylophone-playing system that can interactively provide synchronous visual, tactile, and auditory stimulation. We conducted two experiments examining different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality.
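The synchrony manipulation central to such a system can be sketched as a collision handler that dispatches visual, auditory, and tactile events together, or with an added delay for an asynchronous control. The device functions below are hypothetical stubs, not the authors' actual system.

```python
# Placeholder sketch of the synchrony requirement: on a mallet-key collision,
# visual, auditory, and tactile events fire together (or with an added delay
# for an asynchronous control). The device functions are hypothetical stubs.
import time

def show_key_flash(key):  # stub: would render visual feedback
    pass

def play_note(key):       # stub: would trigger audio output
    pass

def fire_vibration():     # stub: would drive a tactile actuator
    pass

def on_collision(key, delay_s=0.0):
    t0 = time.monotonic()
    show_key_flash(key)
    if delay_s:
        time.sleep(delay_s)  # asynchronous control condition
    play_note(key)
    fire_vibration()
    print(f"key {key}: stimuli dispatched within {time.monotonic() - t0:.3f} s")

on_collision("C4")               # synchronous condition
on_collision("C4", delay_s=0.3)  # asynchronous condition
```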


2018 ◽  
Author(s):  
Fosco Bernasconi ◽  
Jean-Paul Noel ◽  
Hyeong Dong Park ◽  
Nathan Faivre ◽  
Margitta Seeck ◽  
...  

Interactions with the environment are mediated by the body within one's peripersonal space (PPS), the space surrounding the body. Studies in monkeys and humans have highlighted a distributed multisensory cortical network representing the PPS. However, electrophysiological evidence for a multisensory encoding of PPS in humans is lacking. Here, for the first time, we recorded intracranial electroencephalography (iEEG) in humans while administering tactile stimulation (T) on the trunk, approaching auditory stimuli (A), and the combination of the two (AT). To map PPS, in AT trials, tactile stimulation was delivered when the sound was far from, at an intermediate location relative to, or close to the body. We first identified electrodes showing AT multisensory integration (i.e., AT vs. A+T): 19% of the recording electrodes. Among those electrodes, we identified those showing a PPS effect (30% of the AT electrodes), i.e., a modulation of the evoked response to AT stimulation as a function of the distance between the sound and the body. For most sites, AT multisensory integration and PPS effects had similar spatiotemporal characteristics, with an early response (~50 ms) in the insular cortex and later responses (~200 ms) in the pre- and postcentral gyri. The superior temporal cortex showed a different response pattern, with AT multisensory integration at ~100 ms but no PPS effect. These results, representing the first iEEG delineation of PPS processing in humans, show that PPS processing occurs at neural sites where multisensory integration also occurs, and at similar time periods, suggesting that PPS representation (around the trunk) is based on a spatial modulation of multisensory integration.
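The two selection criteria described, a multisensory-integration test (AT versus A+T) and a PPS effect (modulation of the AT response by sound distance), can be sketched as follows; the amplitudes and the 0.2 threshold are made up, standing in for the statistical tests.

```python
# Sketch of the two selection criteria described: (1) multisensory
# integration, AT differing from the sum of unisensory responses (A + T);
# (2) a PPS effect, AT varying with sound distance. Amplitudes and the 0.2
# threshold are invented stand-ins for the statistical tests.

resp_A, resp_T = 0.8, 1.0                         # unisensory amplitudes
resp_AT = {"far": 1.5, "mid": 2.0, "close": 2.6}  # AT amplitude by distance

additive = resp_A + resp_T
integration = any(abs(v - additive) > 0.2 for v in resp_AT.values())
pps_effect = (max(resp_AT.values()) - min(resp_AT.values())) > 0.2

print(f"AT vs A+T integration: {integration}; PPS distance effect: {pps_effect}")
```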

