Noccaro A, Pinardi M, Formica D, Di Pino G
Annu Int Conf IEEE Eng Med Biol Soc. 2020 Jul;2020:3244-3247. doi: 10.1109/EMBC44109.2020.9176387.
A unique virtual reality platform for multisensory integration studies is presented. It delivers multimodal sensory stimuli (i.e., auditory, visual, tactile, etc.) while ensuring temporal coherence, a key factor in cross-modal integration. Four infrared cameras track human motion in real time and correspondingly control a virtual avatar. A user-friendly interface allows a great variety of features to be manipulated (i.e., stimulus type, duration, and distance from the participant's body, as well as avatar gender, height, arm pose, perspective, etc.) and provides real-time quantitative measures of all parameters. The platform was validated on two healthy participants with a reaction-time task combining tactile and visual stimuli for the investigation of peripersonal space. Results proved the effectiveness of the proposed platform, showing a significant correlation (p = 0.013) between the distance of the participant's hand from the visual stimulus and the reaction time to the tactile stimulus. More participants will be recruited to further investigate the other measures provided by the platform.
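The validation analysis correlates hand-to-stimulus distance with tactile reaction time. Below is a minimal sketch of such an analysis, assuming a Pearson correlation; the abstract does not specify the correlation method, and all variable names and trial values here are hypothetical illustrations, not the authors' data.

    # Minimal sketch (not the authors' code): correlating the hand's
    # distance from the visual stimulus with the reaction time to the
    # tactile stimulus. Data and the Pearson choice are assumptions.
    from scipy.stats import pearsonr

    # Hypothetical per-trial measurements: hand-stimulus distance (cm)
    # and tactile reaction time (ms).
    distance_cm = [5, 10, 15, 20, 25, 30, 35, 40]
    reaction_time_ms = [312, 318, 325, 331, 340, 343, 351, 358]

    r, p = pearsonr(distance_cm, reaction_time_ms)
    print(f"r = {r:.3f}, p = {p:.3f}")  # the paper reports p = 0.013 on its real data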