Understanding the neural correlates of visual-tactile multisensory integration

Background & Questions

Multisensory integration is the brain's ability to combine multiple streams of sensory input into a single percept (e.g., combining auditory and visual input to localize a sound source). Comparatively little is known about visual-tactile multisensory integration, that is, how the brain integrates viewed and felt touch to the body.

With this in mind, we set out to answer two primary questions:

  1. What is the effect of viewed touch on felt touch behaviorally?
  2. What are the neural correlates of visuotactile multisensory integration, as revealed by advanced neuroimaging analyses?

For more details about the background and motivation involved in setting up this study, check out my poster presentation on the topic.

Method

To answer the questions above, we developed a two-part experiment. In Experiment 1, we designed a tactile localization task in which participants simultaneously viewed and felt touch to one of four fingers on their hand. In Experiment 2, we designed a multi-part neuroimaging experiment with a similar task that also included two functional localizer tasks (used to define regions of interest in the brain for subsequent analyses).

Both experiments underwent pilot testing to finalize methodological choices, and we interpreted the findings of the two experiments together.

Important Findings

The results from Experiment 1 were consistent with our predictions: viewed touch had a significant effect on felt touch, with participants localizing touch more accurately when the viewed and felt locations matched. The accompanying histograms display the percentage of responses across all possible trial types.
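
To illustrate how this behavioral effect could be tested, here is a minimal sketch of the mixed-effects modeling and analysis-of-deviance approach listed under Analysis Techniques below, written in R. The file and column names are hypothetical placeholders, and a logistic (binomial) mixed model is assumed because the outcome is trial-level accuracy; this is a sketch of the general technique, not the exact analysis script.

```r
library(lme4)

dat <- read.csv("exp1_responses.csv")   # hypothetical file and column names

# Null model: accuracy varies only across subjects
m0 <- glmer(correct ~ 1 + (1 | subject), data = dat, family = binomial)

# Full model: add congruence of viewed and felt touch as a predictor
m1 <- glmer(correct ~ congruent + (1 | subject), data = dat, family = binomial)

# Analysis of deviance: likelihood-ratio test for the congruence effect
anova(m0, m1)
summary(m1)
```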

Limitations during Experiment 2 resulted in mixed evidence about the neural correlates of visuotactile integration.

Impact

Our research contributed to a growing body of literature on multisensory integration, specifically within the visual-tactile domain of sensory systems.

Future research on visuotactile integration should examine additional variables involved in multisensory integration (e.g., stimulus reliability and different body postures). It should also continue to probe the neural mechanisms of this process, which remain understudied in the field.

My Learnings

  • Running a two-part study takes time! Prepare accordingly and be flexible with your timeline if possible.
  • Document everything you plan to do and everything you actually did in excruciating detail.

Software/Tools: R/RStudio, FMRIB Software Library (FSL), MATLAB, E-Prime, Microsoft Excel
Analysis Techniques: Linear Mixed-Effects Modeling, Analysis of Deviance/Model Fitting, Event-Related Analysis using General Linear Models (GLMs), Multi-Voxel Pattern Analysis (MVPA)
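
As a rough illustration of the event-related GLM technique listed above, the sketch below builds a design matrix by convolving trial onsets with a canonical double-gamma HRF and fits it to a single voxel's time series in R. The TR, onset times, and HRF parameters are illustrative assumptions, not the study's actual design; in practice, this step was handled by FSL across the whole brain.

```r
TR     <- 2                          # repetition time in seconds (assumed)
n_vols <- 200                        # number of volumes (assumed)

# Canonical double-gamma HRF sampled at the TR (SPM-like parameters)
t_hrf <- seq(0, 30, by = TR)
hrf   <- dgamma(t_hrf, shape = 6) - dgamma(t_hrf, shape = 16) / 6

# Stick functions marking trial onsets, in volumes (assumed onsets)
stick <- function(onsets) { s <- numeric(n_vols); s[onsets] <- 1; s }
onsets_con <- c(10, 45, 80, 115, 150)    # congruent trials
onsets_inc <- c(25, 60, 95, 130, 165)    # incongruent trials

# Convolve each condition with the HRF to form the design matrix
reg_con <- convolve(stick(onsets_con), rev(hrf), type = "open")[1:n_vols]
reg_inc <- convolve(stick(onsets_inc), rev(hrf), type = "open")[1:n_vols]

# Fit the GLM to one voxel's (placeholder) time series; the contrast of
# interest is the incongruent-minus-congruent difference in betas
voxel_ts <- rnorm(n_vols)
fit <- lm(voxel_ts ~ reg_con + reg_inc)
summary(fit)
```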


Detailed Summary: Multisensory integration is the brain's ability to combine separate streams of sensory input into a single percept, a cognitive process we use every day to form our experiences. To better understand the behavioral and neural mechanisms of visuotactile integration, we conducted a two-part experiment.

To establish whether viewed touch influences felt touch, we first ran a behavioral experiment in which subjects viewed touches while simultaneously feeling vibrotactile stimuli on their own hand. Touches either occurred on the same finger in both modalities (congruent trials) or on different fingers (incongruent trials). When asked to report on which finger they felt the tactile stimulus, subjects were significantly more accurate at localizing touch on congruent trials, as predicted.

A similar paradigm was conducted in the scanner and analyzed using event-related fMRI and MVPA to examine which brain regions are active during visuotactile integration. No regions were significantly more active on congruent than on incongruent trials; the reverse contrast, however, revealed significant activation in the right dorsolateral prefrontal cortex (DLPFC), right orbitofrontal cortex (OFC), and anterior cingulate cortex (ACC). Significant activation in these areas provides preliminary evidence of higher-order processing during incongruent touch. MVPA using a whole-brain searchlight did not yield significant decoding of the congruence versus incongruence of touch.
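
For the curious, here is a minimal sketch of the decoding step that a searchlight MVPA repeats at every location in the brain: leave-one-run-out cross-validated classification of congruent versus incongruent trials from local voxel patterns. The data layout, trial counts, and use of a linear SVM (via R's e1071 package) are illustrative assumptions rather than the exact analysis pipeline.

```r
library(e1071)

# patterns: trials x voxels matrix from one searchlight sphere
# labels/runs: one entry per trial (placeholder data below)
patterns <- matrix(rnorm(40 * 50), nrow = 40)
labels   <- factor(rep(c("congruent", "incongruent"), times = 20))
runs     <- rep(1:4, each = 10)

# Leave-one-run-out cross-validation with a linear SVM
accuracies <- sapply(unique(runs), function(r) {
  train <- runs != r
  model <- svm(x = patterns[train, ], y = labels[train], kernel = "linear")
  preds <- predict(model, patterns[!train, ])
  mean(preds == labels[!train])
})

mean(accuracies)   # compare against the 50% chance level
```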

For more details about this project, please see my senior thesis.