Rantanen Ville, Vanhala Toni, Tuisku Outi, Niemenlehto Pekka-Henrik, Verho Jarmo, Surakka Veikko, Juhola Martti, Lekkala Jukka
Department of Automation Science and Engineering, Tampere University of Technology, Tampere, Finland.
IEEE Trans Inf Technol Biomed. 2011 Sep;15(5):795-801. doi: 10.1109/TITB.2011.2158321. Epub 2011 May 31.
A lightweight, wearable, wireless gaze tracker with an integrated selection command source for human-computer interaction is introduced. The prototype system combines head-mounted, video-based gaze tracking with capacitive facial movement detection, enabling multimodal interaction by pointing with gaze and making selections with facial gestures. The system is targeted mainly at disabled people with limited hand mobility. The hardware was made wireless to remove the need to take off the device when moving away from the computer, and to allow future use in more mobile contexts. The algorithms that determine eye and head orientations in order to map gaze direction to on-screen coordinates are presented, together with the algorithm that detects movements from the measured capacitance signal. Point-and-click experiments were conducted to assess the performance of the multimodal system. The results show good performance in laboratory and office conditions. The overall point-and-click accuracy in the multimodal experiments is comparable to the errors reported in previous research on head-mounted, single-modality gaze tracking that does not compensate for changes in head orientation.
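The abstract describes two processing steps: combining eye and head orientations to map gaze onto screen coordinates, and detecting a selection gesture from the capacitance signal. A minimal illustrative sketch of that pipeline is given below; all names, the pinhole-style angle-to-pixel projection, the fixed viewing geometry, and the threshold rule are assumptions for illustration, not the authors' published algorithms.

```python
# Hypothetical simplification of the pipeline described in the abstract:
# gaze direction (eye-in-head angles) is combined with head orientation
# to produce on-screen coordinates, and a selection "click" is raised
# when the capacitance signal deviates from its resting baseline.
# The geometry constants and threshold rule are illustrative assumptions.

import math

SCREEN_W, SCREEN_H = 1920, 1080   # assumed screen resolution in pixels
SCREEN_DIST_MM = 600.0            # assumed eye-to-screen distance
PX_PER_MM = 3.78                  # assumed pixel density of the display

def gaze_to_screen(eye_yaw, eye_pitch, head_yaw, head_pitch):
    """Map combined eye and head angles (radians) to pixel coordinates.

    In this simplification, compensating for head orientation means
    adding the head angles to the eye-in-head angles before projecting
    the resulting gaze ray onto the screen plane.
    """
    yaw = eye_yaw + head_yaw
    pitch = eye_pitch + head_pitch
    x = SCREEN_W / 2 + math.tan(yaw) * SCREEN_DIST_MM * PX_PER_MM
    y = SCREEN_H / 2 - math.tan(pitch) * SCREEN_DIST_MM * PX_PER_MM
    return x, y

def detect_selection(capacitance, baseline, threshold=0.5):
    """Flag a facial-gesture 'click' when the measured capacitance
    deviates from its resting baseline by more than a threshold."""
    return abs(capacitance - baseline) > threshold

# Looking straight ahead with a level head lands at the screen centre.
print(gaze_to_screen(0.0, 0.0, 0.0, 0.0))  # (960.0, 540.0)
```

A real implementation would calibrate the angle-to-pixel mapping per user and filter the capacitance signal before thresholding; this sketch only shows how the two modalities combine into a point-and-click interaction.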