Mewes A, Saalfeld P, Riabikin O, Skalej M, Hansen C
Computer-Assisted Surgery Group, Faculty of Computer Science, Institute of Simulation and Graphics, University of Magdeburg, PF 4120, 39106, Magdeburg, Germany.
Visualization Group, Faculty of Computer Science, Institute of Simulation and Graphics, University of Magdeburg, PF 4120, 39106, Magdeburg, Germany.
Int J Comput Assist Radiol Surg. 2016 Jan;11(1):157-64. doi: 10.1007/s11548-015-1215-0. Epub 2015 May 10.
The interaction with interventional imaging systems within a sterile environment is a challenging task for physicians. Direct physician-machine interaction during an intervention is rather limited because of sterility and workspace restrictions.
We present a gesture-controlled projection display that enables direct and natural physician-machine interaction during computed tomography (CT)-based interventions. To this end, a graphical user interface is projected onto a radiation shield located in front of the physician. Hand gestures performed in front of this display are captured and classified using a Leap Motion controller. We propose a gesture set for controlling basic functions of intervention software, including gestures for 2D image exploration, 3D object manipulation and selection. Our methods were evaluated in a clinically oriented user study with 12 participants.
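To illustrate the kind of processing such a system performs, the sketch below classifies a short hand-tracking sequence as a swipe (large lateral palm movement, e.g. for 2D image paging) or a pinch (thumb-index distance closing, e.g. for selection). This is a hypothetical minimal example: the data structure, field names, and thresholds are invented for illustration and are not the authors' method or the Leap Motion SDK's API.

```python
from dataclasses import dataclass

@dataclass
class HandSample:
    """One tracking frame: palm position and thumb-index distance, in mm.
    Field names are illustrative, not taken from the Leap Motion SDK."""
    palm_x: float
    palm_y: float
    pinch_dist: float

def classify_gesture(samples, swipe_threshold=80.0, pinch_ratio=0.5):
    """Classify a tracked sequence as 'swipe', 'pinch', or 'none'.

    A swipe is a large net lateral palm displacement; a pinch is a large
    relative drop in thumb-index distance. Thresholds are hypothetical,
    not values reported in the paper.
    """
    if len(samples) < 2:
        return "none"
    # Net lateral palm movement over the whole sequence.
    dx = samples[-1].palm_x - samples[0].palm_x
    if abs(dx) >= swipe_threshold:
        return "swipe"
    # Relative closing of thumb and index finger.
    start, end = samples[0].pinch_dist, samples[-1].pinch_dist
    if start > 0 and end / start <= pinch_ratio:
        return "pinch"
    return "none"

# Palm moves 120 mm to the right -> swipe
swipe_seq = [HandSample(0, 200, 60), HandSample(60, 205, 58), HandSample(120, 210, 59)]
# Thumb-index distance closes from 60 mm to 10 mm -> pinch
pinch_seq = [HandSample(0, 200, 60), HandSample(2, 201, 30), HandSample(3, 200, 10)]
```

In a real system, each classified gesture would then be mapped onto an intervention-software command (page image, rotate 3D model, confirm selection), which is the mapping the user study evaluates.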
The results of the user study confirm that the display and the underlying interaction concept are accepted by clinical users. Gesture recognition is robust, although there is room for improvement. Gesture training times are less than 10 min, but vary considerably among the study participants. The developed gestures are logically connected to the intervention software and intuitive to use.
The proposed gesture-controlled projection display counters current practice by giving the radiologist complete control of the intervention software. It opens new possibilities for direct physician-machine interaction during CT-based interventions and is well suited to become an integral part of future interventional suites.