University of Hertfordshire

Original language: English
Article number: e13430
Journal: Psychophysiology
Journal publication date: 1 Oct 2019
Volume: 56
Issue: 10
Early online date: 8 Jul 2019
Publication status: Published - 1 Oct 2019

Abstract

The vestibular system has been shown to contribute to multisensory integration by balancing conflicting sensory information. It remains unclear whether such modulation of exteroceptive (e.g., vision), proprioceptive, and interoceptive (e.g., affective touch) sensory sources is influenced by epistemically different aspects of tactile stimulation (i.e., touch felt from within vs. seen, vicarious touch). In the current study, we aimed to (a) replicate previous findings regarding the effects of galvanic stimulation of the right vestibular network on multisensory integration, and (b) examine vestibular contributions to multisensory integration when touch is felt but not seen (and vice versa). During left galvanic vestibular stimulation (LGVS; i.e., stimulation of the right vestibular network), right galvanic vestibular stimulation (RGVS; i.e., bilateral stimulation), and sham (i.e., placebo) stimulation, healthy participants (N = 36, Experiment 1; N = 37, Experiment 2) looked at a rubber hand while either their own unseen hand or the rubber hand was touched by affective or neutral touch. We found that (a) LGVS led to enhancement of vision over proprioception during vision-only conditions (replicating our previous findings), and (b) LGVS (versus sham) favored proprioception over vision when touch was felt (Experiment 1), with the opposite results when touch was vicariously perceived via vision (Experiment 2), and with no difference between affective and neutral touch. We showed how vestibular signals modulate the weight of each sensory modality according to the context in which it is perceived, and that such modulation extends to different aspects of tactile stimulation: felt and seen touch are differentially balanced in multisensory integration according to their epistemic relevance.

Notes

© 2019 Society for Psychophysiological Research.
