Boyer Eric O, Portron Arthur, Bevilacqua Frederic, Lorenceau Jean
STMS Lab, IRCAM - Centre National de la Recherche Scientifique - UPMC, Paris, France.
Laboratoire des Systèmes Perceptifs (LSP), Centre National de la Recherche Scientifique (CNRS), UMR 8248, Département d'Etudes Cognitives, Ecole Normale Supérieure-PSL, Paris, France.
Front Neurosci. 2017 Apr 25;11:197. doi: 10.3389/fnins.2017.00197. eCollection 2017.
As eye movements are mostly automatic and overtly generated to attain visual goals, individuals have poor metacognitive knowledge of their own eye movements. We present an exploratory study on the effects of real-time continuous auditory feedback generated by eye movements. We considered both a tracking task and a production task in which smooth pursuit eye movements (SPEM) can be endogenously generated. In particular, we used a visual paradigm that makes it possible to generate and control SPEM in the absence of a moving visual target. We investigated whether real-time auditory feedback of eye movement dynamics might improve learning in both tasks, through a training protocol spanning 8 days. The results indicate that real-time sonification of eye movements can indeed modify oculomotor behavior and reinforce intrinsic oculomotor perception. Nevertheless, large inter-individual differences were observed, preventing us from drawing a strong conclusion about improvements in sensorimotor learning.
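To make the notion of "sonification of eye movement dynamics" concrete, the sketch below maps gaze speed to the pitch of a continuous tone. This is not the mapping used in the study (the abstract does not specify it); the tracker rate, base frequency, and gain are hypothetical values chosen only for illustration.

```python
# Illustrative sketch: map gaze speed (deg/s) to the pitch of a continuous tone.
# All parameters are hypothetical; this is not the authors' feedback design.
import numpy as np

AUDIO_RATE = 44100   # audio sample rate (Hz)
GAZE_RATE = 1000     # assumed eye-tracker sampling rate (Hz)
BASE_FREQ = 220.0    # pitch at zero gaze speed (Hz), hypothetical
GAIN = 2.0           # pitch increase per deg/s of gaze speed, hypothetical

def sonify_gaze_speed(gaze_speed_deg_s):
    """Turn a stream of gaze-speed samples into a frequency-modulated tone.

    Each gaze sample is held for AUDIO_RATE / GAZE_RATE audio samples; phase is
    accumulated so the tone stays continuous as the pitch changes.
    """
    hold = AUDIO_RATE // GAZE_RATE                       # audio samples per gaze sample
    freqs = BASE_FREQ + GAIN * np.abs(np.repeat(gaze_speed_deg_s, hold))
    phase = 2 * np.pi * np.cumsum(freqs) / AUDIO_RATE    # integrate frequency -> phase
    return 0.2 * np.sin(phase)                           # low-amplitude audio buffer

# Example: one second of simulated smooth pursuit at ~10 deg/s with jitter.
rng = np.random.default_rng(0)
gaze_speed = 10.0 + rng.normal(0.0, 1.0, GAZE_RATE)
audio = sonify_gaze_speed(gaze_speed)                    # e.g., write to a WAV file or stream out
```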