
User-Independent Intent Recognition for Lower Limb Prostheses Using Depth Sensing

Publication Information

IEEE Trans Biomed Eng. 2018 Aug;65(8):1759-1770. doi: 10.1109/TBME.2017.2776157. Epub 2017 Nov 21.

DOI: 10.1109/TBME.2017.2776157
PMID: 29989950
Abstract

OBJECTIVE

The intent recognizers of advanced lower limb prostheses utilize mechanical sensors on the prosthesis and/or electromyographic measurements from the residual limb. Besides the delay caused by these signals, such systems require user-specific databases to train the recognizers. In this paper, our objective is the development and validation of a user-independent intent recognition framework utilizing depth sensing.

METHODS

We collected a depth image dataset from 12 healthy subjects engaging in a variety of routine activities. After filtering the depth images, we extracted simple features employing a recursive strategy. The feature vectors were classified using a support vector machine. For robust activity mode switching, we implemented a voting filter scheme.
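The voting-filter step described above smooths the per-frame classifier output so that a single misclassified frame does not trigger a spurious activity-mode switch. The following is an illustrative stdlib-Python sketch of a sliding-window majority vote, not the authors' implementation; the window length and integer class labels are assumptions for the example.

```python
from collections import Counter, deque


def vote_filter(stream, window=7):
    """Smooth a stream of per-frame class labels with a sliding majority vote.

    A new mode is reported only once it dominates the last `window` frames,
    which suppresses spurious single-frame switches.
    """
    buf = deque(maxlen=window)
    for label in stream:
        buf.append(label)
        # most_common(1) returns [(label, count)]; the majority label wins
        yield Counter(buf).most_common(1)[0][0]


# Example: raw per-frame predictions with two single-frame glitches
raw = [0, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1, 2, 1, 1]
smoothed = list(vote_filter(raw, window=3))
# -> [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1]
```

The filter trades a few frames of switching latency for robustness: the lone `1` at frame 4 and the lone `2` near the end are rejected, while the sustained transition from mode 0 to mode 1 is still detected.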

RESULTS

Model selection showed that the support vector machine classifier with no dimension reduction had the highest classification accuracy. Specifically, it reached 94.1% accuracy on the testing data from four subjects. We also observed a positive trend in the accuracy of classifiers trained with data from an increasing number of subjects. Activity mode switching using a voting filter detected 732 out of 778 activity mode transitions across the four users, while initiating 70 erroneous transitions during steady-state activities.

CONCLUSION

The intent recognizer trained on multiple subjects can be used for any other subject, providing a promising solution for supervisory control of powered lower limb prostheses.

SIGNIFICANCE

A user-independent intent recognition framework has the potential to reduce or eliminate the extensive data-collection regimens required for intent recognizer training. This could accelerate the introduction of robotic lower limb prostheses to the market.


Similar Articles

1. User-Independent Intent Recognition for Lower Limb Prostheses Using Depth Sensing.
   IEEE Trans Biomed Eng. 2018 Aug;65(8):1759-1770. doi: 10.1109/TBME.2017.2776157. Epub 2017 Nov 21.
2. Analysis of using EMG and mechanical sensors to enhance intent recognition in powered lower limb prostheses.
   J Neural Eng. 2014 Oct;11(5):056021. doi: 10.1088/1741-2560/11/5/056021. Epub 2014 Sep 22.
3. A feasibility study of depth image based intent recognition for lower limb prostheses.
   Annu Int Conf IEEE Eng Med Biol Soc. 2016 Aug;2016:5055-5058. doi: 10.1109/EMBC.2016.7591863.
4. A Classification Method for User-Independent Intent Recognition for Transfemoral Amputees Using Powered Lower Limb Prostheses.
   IEEE Trans Neural Syst Rehabil Eng. 2016 Feb;24(2):217-25. doi: 10.1109/TNSRE.2015.2412461. Epub 2015 Mar 16.
5. Multiclass real-time intent recognition of a powered lower limb prosthesis.
   IEEE Trans Biomed Eng. 2010 Mar;57(3):542-51. doi: 10.1109/TBME.2009.2034734. Epub 2009 Oct 20.
6. Across-user adaptation for a powered lower limb prosthesis.
   IEEE Int Conf Rehabil Robot. 2017 Jul;2017:1580-1583. doi: 10.1109/ICORR.2017.8009473.
7. User intent prediction with a scaled conjugate gradient trained artificial neural network for lower limb amputees using a powered prosthesis.
   Annu Int Conf IEEE Eng Med Biol Soc. 2016 Aug;2016:6405-6408. doi: 10.1109/EMBC.2016.7592194.
8. Gradient-Based Multi-Objective Feature Selection for Gait Mode Recognition of Transfemoral Amputees.
   Sensors (Basel). 2019 Jan 10;19(2):253. doi: 10.3390/s19020253.
9. Swing-phase detection of locomotive mode transitions for smooth multi-functional robotic lower-limb prosthesis control.
   Front Robot AI. 2024 Apr 12;11:1267072. doi: 10.3389/frobt.2024.1267072. eCollection 2024.
10. A locomotion intent prediction system based on multi-sensor fusion.
   Sensors (Basel). 2014 Jul 10;14(7):12349-69. doi: 10.3390/s140712349.

Cited By

1. Ambilateral Activity Recognition and Continuous Adaptation with a Powered Knee-Ankle Prosthesis.
   IEEE Trans Robot. 2025;41:2251-2267. doi: 10.1109/tro.2025.3539206. Epub 2025 Feb 5.
2. Review of Vision-Based Environmental Perception for Lower-Limb Exoskeleton Robots.
   Biomimetics (Basel). 2024 Apr 22;9(4):254. doi: 10.3390/biomimetics9040254.
3. StairNet: visual recognition of stairs for human-robot locomotion.
   Biomed Eng Online. 2024 Feb 15;23(1):20. doi: 10.1186/s12938-024-01216-0.
4. Integrating intention-based systems in human-robot interaction: a scoping review of sensors, algorithms, and trust.
   Front Robot AI. 2023 Oct 9;10:1233328. doi: 10.3389/frobt.2023.1233328. eCollection 2023.
5. Object-of-Interest Perception in a Reconfigurable Rolling-Crawling Robot.
   Sensors (Basel). 2022 Jul 12;22(14):5214. doi: 10.3390/s22145214.
6. Environment Classification for Robotic Leg Prostheses and Exoskeletons Using Deep Convolutional Neural Networks.
   Front Neurorobot. 2022 Feb 4;15:730965. doi: 10.3389/fnbot.2021.730965. eCollection 2021.
7. Review of control strategies for lower-limb exoskeletons to assist gait.
   J Neuroeng Rehabil. 2021 Jul 27;18(1):119. doi: 10.1186/s12984-021-00906-3.
8. ExoNet Database: Wearable Camera Images of Human Locomotion Environments.
   Front Robot AI. 2020 Dec 3;7:562061. doi: 10.3389/frobt.2020.562061. eCollection 2020.
9. Relying on more sense for enhancing lower limb prostheses control: a review.
   J Neuroeng Rehabil. 2020 Jul 17;17(1):99. doi: 10.1186/s12984-020-00726-x.
10. A Survey of Teleceptive Sensing for Wearable Assistive Robotic Devices.
   Sensors (Basel). 2019 Nov 28;19(23):5238. doi: 10.3390/s19235238.