Crossmodal Correspondence-Based Multisensory Integration: A pilot study showing how HAV cues can modulate reaction time

Abstract

We live in a multisensory world, where all our senses work together to give us a rich experience of the environment we are in, whether physical or mediated by immersive technologies.

To gain more insight into the temporal scale of the integration phenomenon, EEG-based BCIs can reveal the transient changes occurring in the brain.

In this study, we investigated the potential of incorporating haptics into crossmodal-correspondence-based research to induce a multisensory integration (MSI) effect, either through active touch feedback from users or through crossmodal correspondences with the visual and auditory modalities, such as the Kiki-Bouba effect.

We designed two experiments:

  • In the first experiment, visual stimuli were presented on a standard computer monitor and auditory stimuli were delivered through the computer's audio output. Participants responded with their left or right hand by pressing the CapsLock or Enter key, respectively. The visual cue was a red circle displayed randomly on either the left or the right side of the screen. The auditory cue was a brief high tone presented for 500 ms through the left or right headphone channel. Text stimuli on the screen instructed participants to respond with their left or right hand. Each trial was preceded by a central fixation cross displayed for 500 ms. (A minimal trial sketch is given after this list.)

  • The second experiment was inspired by previous studies of the Kiki-Bouba correspondence. Visual stimuli consisted of four shapes (circle, triangle, six-vertex polygon, and star) presented on a computer screen at randomized locations. Auditory stimuli were generated using the Online Tone Generator website ( https://onlinetonegenerator.com/ ). Two sets of sounds were used: the first set included sine, triangle, square, and sawtooth waveforms, each at a frequency of 500 Hz; the second set included sawtooth waveforms at frequencies of 50 Hz, 300 Hz, 600 Hz, and 2000 Hz (summarised in Table 2). (A sketch of how such stimulus sets can be synthesised also follows the list.)

  • Results suggested that it is indeed possible to achieve this type of integration without relying on complex haptic devices. Introducing haptics into BCI technologies, through touch feedback or crossmodal correspondences, holds potential to improve the user experience and the information transfer rate (ITR).

    Participants, as expected, showed the lowest reaction times in the congruent sequential test and the highest in the incongruent HAV-cue test. This indicates a preference for sequential cue presentation over simultaneous presentation. Reaction time was significantly higher for incongruent haptic cues.
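
The trial structure of the first experiment can be illustrated with a short script. The following is a minimal sketch only, not the authors' code: it assumes PsychoPy, uses an illustrative window setup and a 1000 Hz tone as the "high tone", omits the on-screen hand instruction and the left/right panning of the tone, and uses key names ("capslock", "return") that may need adjusting for a given keyboard backend.

```python
# Minimal sketch of one Experiment 1 trial (illustrative, assumes PsychoPy).
import random
from psychopy import core, event, sound, visual

win = visual.Window(size=(1024, 768), color="grey", units="norm")
fixation = visual.TextStim(win, text="+", height=0.1)
circle = visual.Circle(win, radius=0.1, fillColor="red", lineColor="red")
clock = core.Clock()

def run_trial(side):
    """One trial: 500 ms fixation, then a lateral red circle with a 500 ms tone."""
    fixation.draw()
    win.flip()
    core.wait(0.5)                              # 500 ms central fixation cross

    circle.pos = (-0.5, 0) if side == "left" else (0.5, 0)
    tone = sound.Sound(value=1000, secs=0.5)    # brief high tone (frequency assumed)
    circle.draw()
    win.flip()
    tone.play()
    clock.reset()

    # Left hand -> CapsLock, right hand -> Enter; reaction time taken at key press
    key, rt = event.waitKeys(keyList=["capslock", "return"], timeStamped=clock)[0]
    return side, key, rt

results = [run_trial(random.choice(["left", "right"])) for _ in range(10)]
win.close()
core.quit()
```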
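
The two auditory stimulus sets of the second experiment map directly onto standard waveform generators. The sketch below reproduces them with NumPy/SciPy rather than the online tone generator used in the study; the sample rate, duration, and amplitude are assumptions, not values reported in the abstract.

```python
# Sketch of the Experiment 2 stimulus sets (waveform shapes and frequencies
# from the text; sample rate and duration are illustrative assumptions).
import numpy as np
from scipy import signal

FS = 44100          # sample rate in Hz (assumed)
DURATION = 0.5      # tone duration in seconds (assumed)
t = np.linspace(0, DURATION, int(FS * DURATION), endpoint=False)

def waveform(kind, freq):
    """Return a single tone of the requested waveform shape and frequency."""
    phase = 2 * np.pi * freq * t
    if kind == "sine":
        return np.sin(phase)
    if kind == "triangle":
        return signal.sawtooth(phase, width=0.5)   # symmetric sawtooth = triangle
    if kind == "square":
        return signal.square(phase)
    if kind == "sawtooth":
        return signal.sawtooth(phase)
    raise ValueError(f"unknown waveform: {kind}")

# Set 1: four waveform shapes, all at 500 Hz
set_1 = {kind: waveform(kind, 500) for kind in ("sine", "triangle", "square", "sawtooth")}

# Set 2: sawtooth waveforms at four frequencies (Hz)
set_2 = {freq: waveform("sawtooth", freq) for freq in (50, 300, 600, 2000)}
```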
