Zick Lucas Alexandre, Martinelli Dieisson, Schneider de Oliveira André, Cremer Kalempa Vivian
Graduate Program in Electrical and Computer Engineering, Universidade Tecnológica Federal do Paraná (UTFPR), Curitiba, 80230-901, Brazil.
Department of Information Systems, Universidade do Estado de Santa Catarina (UDESC), São Bento do Sul, 89283-081, Brazil.
Sci Rep. 2024 Dec 4;14(1):30230. doi: 10.1038/s41598-024-80898-x.
Robotic teleoperation is essential in hazardous environments where human safety is at risk, yet efficient and intuitive human-machine interaction for multi-robot systems remains challenging. This article presents a robotic teleoperation system, named AutoNav, centered on autonomous navigation and gesture commands interpreted through computer vision. The central focus is recognizing the palm of the hand as a control interface to facilitate human-machine interaction in multi-robot contexts. The MediaPipe framework was integrated to implement gesture recognition from a USB camera. The system was developed with the Robot Operating System, using a simulated environment that includes the Gazebo and RViz applications with multiple TurtleBot 3 robots. The main results show a reduction of approximately 50% in execution time, coupled with an increase in the operator's free time during teleoperation, reaching up to 94% of total execution time, as well as a decrease in collisions. These results demonstrate the effectiveness and practicality of the robotic control algorithm and its promise for managing teleoperation across multiple robots. This study fills a knowledge gap by developing a hand gesture-based control interface for more efficient and safer multi-robot teleoperation, enhancing human-machine interaction in complex robotic operations. A video showing the system working is available at https://youtu.be/94S4nJ3IwUw.
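As a rough illustration of the gesture-to-command mapping the abstract describes, the sketch below turns a normalized palm-center x coordinate (MediaPipe reports hand landmarks in 0..1 image coordinates) into a simple steering command. The function name, thresholds, and command labels are hypothetical, not taken from the paper; the actual AutoNav system feeds such commands into ROS-based autonomous navigation.

```python
# Hypothetical sketch, not the AutoNav implementation: map a normalized
# palm-center x coordinate (0.0 = left edge of frame, 1.0 = right edge,
# as MediaPipe landmark coordinates are expressed) to a steering command.
# The dead zone around the frame center avoids jitter from small hand motions.

def palm_to_command(palm_x: float, dead_zone: float = 0.1) -> str:
    """Return 'left', 'right', or 'forward' based on palm position."""
    center = 0.5
    if palm_x < center - dead_zone:
        return "left"
    if palm_x > center + dead_zone:
        return "right"
    return "forward"

print(palm_to_command(0.2))  # palm on the left side of the frame
print(palm_to_command(0.5))  # palm centered
print(palm_to_command(0.9))  # palm on the right side of the frame
```

In a full pipeline, a command like this would be published as a velocity goal for the selected TurtleBot 3 rather than executed directly, which is what lets the operator stay hands-off while the robots navigate autonomously.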