This project will explore how touch can be integrated with sound and vision in next-generation multisensory human-computer interfaces that combine tactile, auditory, and visual feedback. It aims to characterize how humans integrate these multisensory cues into a unified percept.
Specifically, we will use psychophysical and EEG measurements to examine how the haptic representation of shape or texture (e.g. the perceived shape or texture of a button on a display) is reinforced or disrupted when tactile, auditory, and visual cues are independently modulated.
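As a point of reference for what integration into a unified percept can look like quantitatively, the sketch below implements the standard maximum-likelihood (reliability-weighted) cue-combination model often used as a benchmark in multisensory psychophysics. It is a minimal illustration only: the function name, the example stimulus (a button edge position), and all numerical values are assumptions for demonstration, not project data or a committed analysis method.

```python
import numpy as np

def integrate_cues(estimates, sigmas):
    """Combine unimodal estimates weighted by their reliabilities (1/sigma^2)."""
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    weights = reliabilities / reliabilities.sum()
    combined_estimate = np.dot(weights, estimates)
    # The model predicts that the combined estimate is more precise than any single cue.
    combined_sigma = np.sqrt(1.0 / reliabilities.sum())
    return combined_estimate, combined_sigma

# Hypothetical unimodal judgments of a button's edge position (mm) and their noise levels.
tactile, auditory, visual = 10.0, 11.5, 10.4   # illustrative unimodal estimates
sigma_t, sigma_a, sigma_v = 0.8, 2.0, 0.5      # illustrative unimodal standard deviations

est, sigma = integrate_cues([tactile, auditory, visual], [sigma_t, sigma_a, sigma_v])
print(f"integrated estimate: {est:.2f} mm, predicted sigma: {sigma:.2f} mm")
```

Comparing observed multisensory precision against this reliability-weighted prediction is one common way to test whether cues are integrated optimally or whether one modality dominates when the cues are put in conflict.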