College of Artificial Intelligence, Nankai University, Tianjin 300353, China.
Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China.
Sensors (Basel). 2023 Oct 22;23(20):8637. doi: 10.3390/s23208637.
The robotic surgery environment is a typical scenario of human-robot cooperation. In such a scenario, people, robots, and medical devices move relative to one another, producing unforeseen mutual occlusion. Traditional methods use a binocular optical tracking system (OTS) focused on the local surgical site, ignoring the integrity of the scene and restricting the workspace. To address this challenge, we propose the concept of a fully perceived robotic surgery environment and build a global-local joint positioning framework. Furthermore, based on the characteristics of the data, we propose an improved Kalman filter to increase positioning accuracy. Finally, drawing on the view margin model, we design a method to evaluate positioning accuracy in a dynamically occluded environment. Experimental results demonstrate that our method yields better positioning results than classical filtering methods.
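For context, the classical filter the paper improves on follows the standard predict/update cycle. The sketch below is a minimal one-dimensional Kalman filter, not the paper's improved variant; the state model, noise variances, and measurement values are all hypothetical, chosen only to illustrate the mechanism.

```python
class Kalman1D:
    """Minimal 1D Kalman filter with a static-state model (illustrative only)."""

    def __init__(self, x0=0.0, p0=1.0, q=1e-3, r=1e-1):
        self.x = x0   # state estimate, e.g. marker position along one axis
        self.p = p0   # estimate variance
        self.q = q    # process noise variance (hypothetical value)
        self.r = r    # measurement noise variance (hypothetical value)

    def step(self, z):
        # Predict: with a static-state model, only the uncertainty grows.
        self.p += self.q
        # Update: blend the prediction with measurement z via the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x

kf = Kalman1D()
measurements = [1.05, 0.93, 1.10, 0.98, 1.02]  # noisy readings of a true position near 1.0
estimates = [kf.step(z) for z in measurements]
```

Occlusion in an OTS pipeline appears as missing measurements; a common handling strategy is to run only the predict step while no marker is visible, which is one of the situations an improved filter must manage gracefully.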