Ma Meng, Fallavollita Pascal, Habert Séverine, Weidert Simon, Navab Nassir
Fakultät für Informatik, Technische Universität München, I-16, Boltzmannstr. 3, 85748, Garching b. München, Germany.
National University of Defense Technology, Changsha, China.
Int J Comput Assist Radiol Surg. 2016 Jun;11(6):853-61. doi: 10.1007/s11548-016-1375-6. Epub 2016 Mar 16.
In the modern operating room, the surgeon performs surgery with the support of different medical systems that showcase patient information, physiological data, and medical images. It is generally accepted that the surgical team must perform numerous interactions to control the corresponding medical system and retrieve the desired information. Joysticks and physical keys remain common in the operating room owing to the drawbacks of computer mice, and surgeons often relay instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems' software or hardware.
To achieve this, a wearable RGB-D sensor is mounted on the surgeon's head for inside-out tracking of his/her finger relative to any of the medical systems' displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, simulating a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted medical system's display, together with the gesture, is translated into general events and sent to the corresponding Android device. Finally, the application running on the Android device generates the corresponding mouse or keyboard events for the targeted medical system.
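The following Python sketch illustrates the kind of host-side pipeline this paragraph describes: intersecting the surgeon's pointing ray with a display plane, converting the hit point to pixel coordinates, and forwarding a generic pointer event to the Android device paired with that medical system. All names, the message format, and the ray-plane intersection step are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the host-side event pipeline (not the authors' code).
import json
import socket
import numpy as np


def intersect_display_plane(origin, direction, plane_point, plane_normal):
    """Intersect the pointing ray (sensor coords) with a medical display's plane."""
    denom = float(np.dot(direction, plane_normal))
    if abs(denom) < 1e-6:
        return None  # ray is parallel to the display plane
    t = float(np.dot(plane_point - origin, plane_normal)) / denom
    return origin + t * direction if t > 0 else None


def to_pixel(hit, display_origin, x_axis, y_axis,
             width_px, height_px, width_m, height_m):
    """Convert the metric hit point on the display plane to pixel coordinates."""
    local = hit - display_origin
    u = float(np.dot(local, x_axis)) / width_m   # x_axis, y_axis: unit vectors spanning the screen
    v = float(np.dot(local, y_axis)) / height_m
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None  # the surgeon is pointing outside this display
    return int(u * width_px), int(v * height_px)


def send_pointer_event(sock, x, y, gesture):
    """Send a generic event; the Android app replays it as a USB mouse/keyboard event."""
    msg = {"type": "pointer", "x": x, "y": y, "gesture": gesture}  # e.g. "move", "click"
    sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))
```

In such a design, the host side stays agnostic of the individual medical systems: only the Android device attached to each computer knows how to turn a generic event into the USB input that system expects, which is consistent with the paper's claim that no system software or hardware needs to be modified.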
To simulate an operating room setting, seven medical participants tested our user interface, performing several interactions with the visualization of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the system usability scale and NASA-TLX workload index indicated a strong acceptance of our proposed user interface.