Vestibular modulation of multisensory integration during actual and vicarious tactile stimulation

2019 ◽  
Author(s):  
Sonia Ponzo ◽  
Louise P. Kirsch ◽  
Aikaterini Fotopoulou ◽  
Paul M. Jenkinson

Abstract
Background: The vestibular system has been shown to contribute to multisensory integration by balancing conflicting sensory information. It remains unclear whether such modulation of exteroceptive (e.g. vision), proprioceptive and interoceptive (e.g. affective touch) sensory sources is influenced by epistemically different aspects of tactile stimulation (i.e. touch felt from within vs seen, vicarious touch).
Objective: We aimed to (i) replicate previous findings regarding the effects of galvanic stimulation of the right vestibular network (i.e. LGVS) on multisensory integration and (ii) examine vestibular contributions to multisensory integration when touch is felt but not seen (and vice versa).
Method: During artificial vestibular stimulation (LGVS, RGVS and Sham), healthy participants (N = 36, Experiment 1; N = 37, Experiment 2) looked at a rubber hand while either their own unseen hand or the rubber hand was touched with affective or neutral touch.
Results: We found that (i) LGVS led to enhancement of vision over proprioception during vision-only conditions (replicating our previous findings), and (ii) LGVS (vs Sham) favoured proprioception over vision when touch was felt (Experiment 1), with the opposite result when touch was vicariously perceived via vision (Experiment 2), and with no difference between affective and neutral touch.
Conclusions: We showed how vestibular signals modulate the weight of each sensory modality according to the context in which they are perceived, and that such modulation extends to different aspects of tactile stimulation: felt and seen touch are differentially balanced in multisensory integration according to their epistemic relevance.
Highlights
LGVS increased proprioceptive drift during vision of a rubber hand.
Touch on the participant's hand decreased proprioceptive drift during LGVS.
Vicarious touch on the rubber hand increased proprioceptive drift during LGVS.
Vestibular signals differently balance sensory sources in multisensory integration.

2018 ◽  
Author(s):  
Charlotte Rae ◽  
Dennis Larsson ◽  
Jessica Eccles ◽  
Jamie Ward ◽  
Hugo Critchley

The rubber hand illusion describes a sense of embodiment over a fake hand induced by synchronous visuo-tactile stimulation. In Tourette Syndrome (TS), the expression of involuntary tics and preceding premonitory sensations is associated with the perturbation of subjective feelings of self-control and agency. We compared responses to induction of the rubber hand illusion in 23 adults with TS and 22 matched controls. Both TS and control participants reported equivalent subjective embodiment of the artificial hand: feelings of ownership, location, and agency were greater during synchronous visuo-tactile stimulation than during asynchronous stimulation. However, individuals with TS did not manifest greater proprioceptive drift during synchronous relative to asynchronous stimulation, an objective marker of embodiment observed in controls. We computed an 'embodiment prediction error' index from the difference between subjective embodiment and objective proprioceptive drift. This embodiment prediction error correlated with the severity of premonitory sensations according to the Premonitory Urge for Tics Scale (PUTS). Feelings of ownership over the artificial hand also correlated with premonitory sensation severity, and feelings of agency with tic severity (Yale Global Tic Severity Scale, YGTSS). Together, our findings suggest that the subjective strength of bodily ownership, as measured by the rubber hand illusion, contributes to susceptibility to the premonitory sensations that are a precipitating factor in tics. These results also suggest that somatosensory neural pathways underpinning visuo-tactile integration are likely altered in TS and may interact with other sensory and motor systems to engender premonitory sensations and tics.
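The 'embodiment prediction error' index described above could, in principle, be computed as a standardized difference between the two measures. The sketch below (Python) is only an illustration of that idea; the z-scoring, sign convention, and all numbers are assumptions, not the authors' exact procedure.

# Illustrative sketch (assumed computation): an "embodiment prediction error"
# as the gap between subjective embodiment and objective proprioceptive drift,
# with both measures standardized so they can be compared on a common scale.
import numpy as np

def embodiment_prediction_error(subjective_embodiment, proprioceptive_drift_cm):
    # Per-participant index: standardized subjective report minus standardized
    # drift. Positive values mean reported embodiment exceeds the objective
    # drift marker. The exact formula in the study may differ.
    s = np.asarray(subjective_embodiment, dtype=float)
    d = np.asarray(proprioceptive_drift_cm, dtype=float)
    z = lambda x: (x - x.mean()) / x.std(ddof=1)
    return z(s) - z(d)

# Hypothetical example: questionnaire ratings and drift (cm) for 5 participants
ratings = [2.1, 1.4, 0.3, 2.8, 1.0]
drift   = [1.2, 0.8, 1.5, 0.4, 1.1]
print(embodiment_prediction_error(ratings, drift))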


2018 ◽  
Author(s):  
Laura Crucianelli ◽  
Yannis Paloyelis ◽  
Lucia Ricciardi ◽  
Paul M Jenkinson ◽  
Aikaterini Fotopoulou

Abstract
Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory bodily ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision or salience of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present double-blind, placebo-controlled, crossover study (N = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, but only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased the embodied version of the SWI (quantified as weight estimation error). These findings suggest that oxytocin might modulate processes of visuo-tactile multisensory integration by increasing the precision of top-down signals against bottom-up sensory input.
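The precision-weighting idea invoked here is the standard reliability-weighted cue-combination rule from Bayesian models of multisensory integration. The sketch below illustrates that general principle only, not the authors' analysis; all values are hypothetical.

# Generic precision-weighted (reliability-weighted) fusion of two cues,
# e.g. vision and proprioception about hand position. Attenuating
# somatosensory precision (larger sigma_prop) shifts the combined estimate
# toward the visual cue, the mechanism the abstract alludes to.
def fuse(mu_vis, sigma_vis, mu_prop, sigma_prop):
    w_vis = (1 / sigma_vis**2) / (1 / sigma_vis**2 + 1 / sigma_prop**2)
    mu_combined = w_vis * mu_vis + (1 - w_vis) * mu_prop
    sigma_combined = (1 / (1 / sigma_vis**2 + 1 / sigma_prop**2)) ** 0.5
    return mu_combined, sigma_combined

# Hypothetical numbers: rubber hand seen at 0 cm, own hand felt at 15 cm.
print(fuse(0.0, 1.0, 15.0, 3.0))  # noisy proprioception -> estimate near the seen hand
print(fuse(0.0, 1.0, 15.0, 1.0))  # equal precision -> midpoint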


2020 ◽  
Vol 82 (8) ◽  
pp. 4084-4095
Author(s):  
Roberto Erro ◽  
Angela Marotta ◽  
Mirta Fiorio

Abstract
In the rubber hand illusion (RHI), simultaneous brush stroking of a subject's hidden hand and a visible rubber hand induces a transient illusion that the rubber hand "feels like it's my hand", together with a proprioceptive drift of the hidden hand toward the rubber hand. Recent accounts of the RHI suggest that the illusion occurs only if the weighting of conflicting sensory information and its subsequent integration yield a statistically plausible compromise. In three experiments, we investigated the role of the distance between the two hands, as well as their proximity to the body's midline, in influencing the occurrence of the illusion. Overall, the results suggest that the illusion is abolished when the two hands are placed far apart, thereby increasing the mismatch between the visual and proprioceptive modalities, whereas the proximity of the two hands to the body's midline plays only a minor role in the subjective report of the illusion. This might be driven by the response properties of visuotactile bimodal cells encoding the peripersonal space around the hand.
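The "statistically plausible compromise" can be read in Bayesian causal-inference terms: the larger the discrepancy between the seen and felt hand positions, the less plausible a single common cause, and the weaker the illusion. The toy sketch below illustrates this qualitative point only; the prior, noise, and workspace values are arbitrary assumptions, not parameters estimated in the study.

# Toy Bayesian causal-inference check: how plausible is a single common cause
# for a seen (rubber) hand and the felt (real) hand, as a function of their
# separation? All parameter values are illustrative assumptions.
import math

def p_common_cause(separation_cm, sensory_sd_cm=5.0, prior_common=0.5,
                   spatial_range_cm=30.0):
    # Likelihood of the observed discrepancy under one cause (noise only)
    # versus two independent causes (anywhere within the workspace).
    like_common = math.exp(-0.5 * (separation_cm / sensory_sd_cm) ** 2) \
                  / (sensory_sd_cm * math.sqrt(2 * math.pi))
    like_separate = 1.0 / spatial_range_cm
    num = prior_common * like_common
    return num / (num + (1 - prior_common) * like_separate)

for d in (10, 20, 40):
    print(d, round(p_common_cause(d), 3))  # plausibility drops as the hands move apart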


2019 ◽  
Vol 31 (4) ◽  
pp. 592-606 ◽  
Author(s):  
Laura Crucianelli ◽  
Yannis Paloyelis ◽  
Lucia Ricciardi ◽  
Paul M. Jenkinson ◽  
Aikaterini Fotopoulou

Multisensory integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size–weight illusion (SWI), allow us to investigate how the brain resolves conflicting multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied multisensory integration. The present, double-blind, placebo-controlled, crossover study (n = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on multisensory integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin might modulate processes of visuotactile multisensory integration by increasing the precision of top–down signals against bottom–up sensory input.


2017 ◽  
Vol 30 (7-8) ◽  
pp. 615-637 ◽  
Author(s):  
Olga Perepelkina ◽  
Maria Boboleva ◽  
Galina Arina ◽  
Valentina Nikolaeva

The aim of the study was to investigate how emotion information processing factors, such as alexithymia and emotional intelligence, modulate body ownership and influence multisensory integration during the 'rubber hand illusion' (RHI) task. It was previously shown that alexithymia correlates with the RHI, and we proposed that emotional intelligence should also be a top-down factor in body ownership, since this had not been examined in previous experiments. We extended the study of Grynberg and Pollatos [Front. Hum. Neurosci. 9 (2015) 357] with an additional measure of emotional intelligence, and propose an explanation for the interrelation of emotion and body ownership processing. Eighty subjects took part in the RHI experiment and completed the Toronto Alexithymia Scale and the Mayer–Salovey–Caruso Emotional Intelligence Test (MSCEIT). Only the MSCEIT was a significant predictor of the subjective measure of the RHI. There were no significant correlations between alexithymia scores and either the test statements of the RHI or the proprioceptive drift; thus we did not replicate the results of Grynberg and Pollatos. However, alexithymia correlated with the control statements of subjective reports of the illusion, which might be explained as a disruption of the ability to discriminate and describe bodily experience. Therefore, (1) alexithymia seems to be connected with difficulties in conscious or verbal processing of body-related information, and (2) higher emotional intelligence might improve multisensory integration of body-related signals and reflect better predictive models of self-processing.


2016 ◽  
Vol 2016 ◽  
pp. 1-9 ◽  
Author(s):  
Woong Choi ◽  
Liang Li ◽  
Satoru Satoh ◽  
Kozaburo Hachimura

Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be induced over a virtual body in a multisensory virtual reality environment. The Rubber Hand and Virtual Hand Illusions have shown that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. We constructed a virtual xylophone-playing system that can interactively provide synchronous visual, tactile, and auditory stimulation. We conducted two experiments examining different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality.


2021 ◽  
Author(s):  
Peter Lush

Seeing a fake hand brushed in synchrony with brushstrokes to a participant's hand (the rubber hand illusion; RHI) prompts reports of referred touch, illusory ownership, and a sense that the real hand has drifted toward the fake hand (proprioceptive drift). According to one theory, RHI effects are attributable to multisensory integration mechanisms, but they may alternatively (or additionally) reflect the generation of experience to meet expectancies arising from demand characteristics (phenomenological control). Multisensory integration accounts are supported by contrasting synchronous and asynchronous brush-stroking conditions, typically presented in counterbalanced order. This contrast is known to be confounded by demand characteristics, but to date there has been no exploration of the role of demand characteristics relating to condition order. In an exploratory study, existing data from a rubber hand study (n = 124) were analysed to test order effects. Synchronous-condition illusion report, and the difference between synchronous and asynchronous conditions in both report and proprioceptive drift, were greater when the asynchronous condition was performed first (and participants had therefore already been exposed to the questionnaire materials). These order effects have implications for the interpretation of reports of ownership experience: in particular, there was no mean agreement with ownership statements in the synchronous-first group. These data support the theory that reports of ownership of a rubber hand are at least partially attributable to phenomenological control in response to demand characteristics.


Perception ◽  
2021 ◽  
pp. 030100662110588
Author(s):  
Max Teaford ◽  
Jason Gilliland ◽  
Olivia Hodkey ◽  
Tara McVeigh ◽  
Caleb Perry ◽  
...  

The Rubber Foot Illusion (RFI) is an illusion in which one is made to feel that a model foot is one's own through synchronous visuo-tactile stimulation. Previous research suggests that the conditions under which the RFI can be elicited are similar to those of the Rubber Hand Illusion (RHI). However, it was unknown whether the RFI could be elicited by synchronous movement of a participant's foot and a model foot. To examine this, we developed the Moving Rubber Foot Illusion (mRFI) and compared participants' experience of it to the RFI. The results of this study suggest that the RFI can be elicited through synchronous movement, and that the moving variant produces more proprioceptive drift than the static RFI. More work is needed to better understand the mechanisms underlying the mRFI.


2019 ◽  
Author(s):  
Raúl Hernández-Pérez ◽  
Eduardo Rojas-Hortelano ◽  
Victor de Lafuente

Abstract
Our choices are often informed by temporally integrating streams of sensory information. This has been well demonstrated in the visual and auditory domains, but the integration of tactile information over time has been less studied. We designed an active touch task in which subjects explored a spheroid-shaped object to determine its inclination with respect to the horizontal plane (inclined to the left or to the right). In agreement with previous findings, our results show that more errors, and longer decision times, accompany difficult decisions (small inclination angles). To gain insight into the decision-making process, we used a task in which the time available for tactile exploration was varied by the experimenter on a trial-by-trial basis. The behavioral results were fit with a model of bounded accumulation, and also with an independent-sampling model that assumes no sensory accumulation. The model fits favor an accumulation-to-bound mechanism and suggest that participants integrate the first 600 ms of 1800 ms-long stimuli. This means that the somatosensory system benefits from longer streams of information, although it does not make use of all available evidence.
Highlights
The somatosensory system integrates information streams through time.
Somatosensory discrimination thresholds decrease with longer stimuli.
A bounded accumulation model is favored over independent sampling.
Humans accumulate up to 600 ms of 1800 ms-long stimuli.
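The accumulation-to-bound mechanism favoured by the model fits can be illustrated with a toy simulation in which evidence samples are summed until a decision bound is reached, after which further stimulation is ignored. The drift, noise, and bound values below are assumptions chosen for illustration, not the paper's fitted parameters.

# Toy accumulation-to-bound simulation: tactile evidence about the object's
# inclination is summed in 10 ms steps until a decision bound is reached or
# the stimulus ends; evidence arriving after the bound is ignored.
import random

def simulate_trial(drift=0.06, noise=0.3, bound=3.6, stim_ms=1800, step_ms=10):
    evidence, t = 0.0, 0
    while t < stim_ms:
        evidence += drift + random.gauss(0.0, noise)
        t += step_ms
        if abs(evidence) >= bound:
            break  # decision committed; the remaining stimulus is not used
    return evidence > 0, t  # (chose the correct inclination, ms of evidence used)

random.seed(1)
trials = [simulate_trial() for _ in range(2000)]
accuracy = sum(c for c, _ in trials) / len(trials)
mean_ms_used = sum(t for _, t in trials) / len(trials)
print(f"accuracy={accuracy:.3f}, mean evidence used={mean_ms_used:.0f} ms")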


2019 ◽  
Author(s):  
Meike Scheller ◽  
Michael J. Proulx ◽  
Michelle de Haan ◽  
Annegret Dahlmann-Noor ◽  
Karin Petrini

Abstract
Integrating different senses to reduce sensory uncertainty and increase perceptual precision can have an important compensatory function for individuals with visual impairment and blindness. However, how visual impairment and blindness impact the development of optimal multisensory integration in the remaining senses is currently unknown. Here we first examined how audio-haptic integration develops and changes across the life span in 92 sighted (blindfolded) individuals between 7 and 70 years of age, using a child-friendly size discrimination task. We assessed whether audio-haptic performance resulted in a reduction of perceptual uncertainty compared to auditory-only and haptic-only performance, as predicted by the maximum-likelihood estimation model. We then tested how this ability develops in 28 children and adults with different levels of visual experience, focussing on low-vision individuals and blind individuals who lost their sight at different ages during development. Our results show that in sighted individuals, adult-like audio-haptic integration develops around 13-15 years of age and remains stable until late adulthood. While early blind individuals, even at the youngest ages, integrate audio-haptic information in an optimal fashion, late blind individuals do not. Optimal integration in low-vision individuals follows a similar developmental trajectory to that of sighted individuals. These findings demonstrate that visual experience is not necessary for optimal audio-haptic integration to emerge, but that consistency of sensory information across development is key for the functional outcome of optimal multisensory integration.
Research Highlights
Audio-haptic integration follows principles of statistical optimality in sighted adults, remaining stable until at least 70 years of age.
Near-optimal audio-haptic integration develops at 13-15 years in sighted adolescents.
Blindness within the first 8 years of life facilitates the development of optimal audio-haptic integration, whereas blindness after 8 years impairs such development.
Sensory consistency in early childhood is crucial for the development of optimal multisensory integration in the remaining senses.
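The maximum-likelihood estimation prediction tested here is the standard one: if auditory and haptic estimates are combined optimally, the variance of the bimodal estimate is lower than that of either sense alone. A minimal sketch of that prediction follows; the unimodal values are hypothetical, not the study's data.

# Standard maximum-likelihood (optimal integration) prediction for combining
# auditory and haptic size estimates: the bimodal discrimination threshold
# (sigma) should fall below both unimodal thresholds.
def predicted_bimodal_sigma(sigma_audio, sigma_haptic):
    return (sigma_audio**2 * sigma_haptic**2 /
            (sigma_audio**2 + sigma_haptic**2)) ** 0.5

sigma_a, sigma_h = 0.9, 0.7   # hypothetical unimodal thresholds (cm)
sigma_ah = predicted_bimodal_sigma(sigma_a, sigma_h)
print(f"predicted audio-haptic sigma = {sigma_ah:.2f} cm "
      f"(vs. {min(sigma_a, sigma_h):.2f} cm for the best single sense)")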

