

Shared Control of Supernumerary Robotic Limbs Using Mixed Reality and Mouth-and-Tongue Interfaces

Authors

Jing Hongwei, Zhao Sikai, Zheng Tianjiao, Li Lele, Zhang Qinghua, Sun Kerui, Zhao Jie, Zhu Yanhe

Affiliation

State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150006, China.

Publication

Biosensors (Basel). 2025 Jan 23;15(2):70. doi: 10.3390/bios15020070.

DOI: 10.3390/bios15020070
PMID: 39996972
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11853150/
Abstract

Supernumerary Robotic Limbs (SRLs) are designed to collaborate with the wearer, enhancing operational capabilities. When human limbs are occupied with primary tasks, controlling SRLs flexibly and naturally becomes a challenge. Existing methods such as electromyography (EMG) control and redundant limb control partially address SRL control issues. However, they still face limitations like restricted degrees of freedom and complex data requirements, which hinder their applicability in real-world scenarios. Additionally, fully autonomous control methods, while efficient, often lack the flexibility needed for complex tasks, as they do not allow for real-time user adjustments. In contrast, shared control combines machine autonomy with human input, enabling finer control and more intuitive task completion. Building on our previous work with the mouth-and-tongue interface, this paper integrates a mixed reality (MR) device to form an interactive system that enables shared control of the SRL. The system allows users to dynamically switch between voluntary and autonomous control, providing both flexibility and efficiency. A random forest model classifies 14 distinct tongue and mouth operations, mapping them to six-degree-of-freedom SRL control. In comparative experiments involving ten healthy subjects performing assembly tasks under three control modes (shared control, autonomous control, and voluntary control), shared control demonstrates a balance between machine autonomy and human input. While autonomous control offers higher task efficiency, shared control achieves greater task success rates and improves user experience by combining the advantages of both autonomous operation and voluntary control. This study validates the feasibility of shared control and highlights its advantages in providing flexible switching between autonomy and user intervention, offering new insights into SRL control.
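The control scheme the abstract describes — 14 classified mouth-and-tongue operations mapped onto six-degree-of-freedom SRL commands, with dynamic switching between voluntary and autonomous modes — can be sketched as follows. This is a minimal illustrative sketch only: the gesture labels, the 12-directional-plus-2-reserved split, and the mapping are assumptions, not the paper's actual gesture vocabulary, and the random forest classification stage is omitted (the sketch starts from a predicted label).

```python
# Illustrative sketch of shared-control arbitration: a classifier output
# (one of 14 hypothetical mouth/tongue operation labels) is mapped either
# to a mode switch or to a signed velocity on one of six DOFs.
from dataclasses import dataclass, field

AXES = ("x", "y", "z", "roll", "pitch", "yaw")

@dataclass
class SRLController:
    mode: str = "voluntary"  # "voluntary" or "autonomous"
    # Hypothetical mapping: 12 directional labels -> (axis, sign),
    # covering +/- motion on each of the six degrees of freedom.
    gesture_to_dof: dict = field(default_factory=lambda: {
        f"gesture_{i}": pair
        for i, pair in enumerate(
            (a, s) for a in AXES for s in (+1.0, -1.0)
        )
    })

    def step(self, gesture: str):
        """Return a 6-DOF command dict, or None when the planner drives."""
        if gesture == "switch_mode":  # reserved label 13: toggle modes
            self.mode = "autonomous" if self.mode == "voluntary" else "voluntary"
            return None
        if gesture == "neutral" or self.mode == "autonomous":
            return None  # reserved label 14, or autonomous planner in charge
        axis, sign = self.gesture_to_dof[gesture]
        cmd = {a: 0.0 for a in AXES}
        cmd[axis] = sign
        return cmd

ctrl = SRLController()
cmd = ctrl.step("gesture_0")   # voluntary mode: a directional command
ctrl.step("switch_mode")       # hand control over to the autonomous planner
print(ctrl.mode)               # autonomous
```

In voluntary mode each recognized operation produces a single-axis command; in autonomous mode user gestures are ignored except the reserved switch, which is one plausible way to realize the "dynamic switching" the abstract credits for combining task success rate with efficiency.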


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/44574fe4c017/biosensors-15-00070-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/c6d890d8f973/biosensors-15-00070-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/f312f9112732/biosensors-15-00070-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/aa5a768958da/biosensors-15-00070-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/91145344cc45/biosensors-15-00070-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/ab89a7a17681/biosensors-15-00070-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/8073f4b7f5b3/biosensors-15-00070-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/30ac83950607/biosensors-15-00070-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fc09/11853150/1255b2caf9d2/biosensors-15-00070-g008.jpg

Similar Articles

1. Shared Control of Supernumerary Robotic Limbs Using Mixed Reality and Mouth-and-Tongue Interfaces.
Biosensors (Basel). 2025 Jan 23;15(2):70. doi: 10.3390/bios15020070.
2. A Mouth and Tongue Interactive Device to Control Wearable Robotic Limbs in Tasks where Human Limbs Are Occupied.
Biosensors (Basel). 2024 Apr 24;14(5):213. doi: 10.3390/bios14050213.
3. EMG-driven shared human-robot compliant control for in-hand object manipulation in hand prostheses.
J Neural Eng. 2022 Dec 2;19(6). doi: 10.1088/1741-2552/aca35f.
4. Human Operation Augmentation through Wearable Robotic Limb Integrated with Mixed Reality Device.
Biomimetics (Basel). 2023 Oct 8;8(6):479. doi: 10.3390/biomimetics8060479.
5. Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze-Brain Machine Interface.
Front Neurorobot. 2020 Jan 24;13:111. doi: 10.3389/fnbot.2019.00111. eCollection 2019.
6. An EMG Interface for the Control of Motion and Compliance of a Supernumerary Robotic Finger.
Front Neurorobot. 2016 Nov 11;10:18. doi: 10.3389/fnbot.2016.00018. eCollection 2016.
7. Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping.
J Neuroeng Rehabil. 2016 Mar 18;13:28. doi: 10.1186/s12984-016-0134-9.
8. Study on human-SRL synchronized walking based on coupled impedance.
Front Neurorobot. 2023 Sep 25;17:1252947. doi: 10.3389/fnbot.2023.1252947. eCollection 2023.
9. A Multimodal Intention Detection Sensor Suite for Shared Autonomy of Upper-Limb Robotic Prostheses.
Sensors (Basel). 2020 Oct 27;20(21):6097. doi: 10.3390/s20216097.
10. Continuous Tongue Robot Mapping for Paralyzed Individuals Improves the Functional Performance of Tongue-Based Robotic Assistance.
IEEE Trans Biomed Eng. 2021 Aug;68(8):2552-2562. doi: 10.1109/TBME.2021.3055250. Epub 2021 Jul 16.

References Cited in This Article

1. A Mouth and Tongue Interactive Device to Control Wearable Robotic Limbs in Tasks where Human Limbs Are Occupied.
Biosensors (Basel). 2024 Apr 24;14(5):213. doi: 10.3390/bios14050213.
2. Human Operation Augmentation through Wearable Robotic Limb Integrated with Mixed Reality Device.
Biomimetics (Basel). 2023 Oct 8;8(6):479. doi: 10.3390/biomimetics8060479.
3. A wearable textile-based pneumatic energy harvesting system for assistive robotics.
Sci Adv. 2022 Aug 26;8(34):eabo2418. doi: 10.1126/sciadv.abo2418. Epub 2022 Aug 24.
4. Wearable Supernumerary Robotic Limb System Using a Hybrid Control Approach Based on Motor Imagery and Object Detection.
IEEE Trans Neural Syst Rehabil Eng. 2022;30:1298-1309. doi: 10.1109/TNSRE.2022.3172974. Epub 2022 May 27.
5. Semi-Automated Control System for Reaching Movements in EMG Shoulder Disarticulation Prosthesis Based on Mixed Reality Device.
IEEE Open J Eng Med Biol. 2021 Feb 9;2:55-64. doi: 10.1109/OJEMB.2021.3058036. eCollection 2021.
6. Principles of human movement augmentation and the challenges in making it a reality.
Nat Commun. 2022 Mar 15;13(1):1345. doi: 10.1038/s41467-022-28725-7.
7. BMI control of a third arm for multitasking.
Sci Robot. 2018 Jul 25;3(20). doi: 10.1126/scirobotics.aat1228.