
Review of Vision-Based Environmental Perception for Lower-Limb Exoskeleton Robots.

Authors

Wang Chen, Pei Zhongcai, Fan Yanan, Qiu Shuang, Tang Zhiyong

Affiliations

School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China.

Publication

Biomimetics (Basel). 2024 Apr 22;9(4):254. doi: 10.3390/biomimetics9040254.

Abstract

The exoskeleton robot is a wearable electromechanical device inspired by animal exoskeletons. It combines technologies such as sensing, control, information, and mobile computing, enhancing human physical abilities and assisting in rehabilitation training. In recent years, with the development of visual sensors and deep learning, the environmental perception of exoskeletons has drawn widespread attention in the industry. Environmental perception can provide exoskeletons with a certain level of autonomous perception and decision-making ability, enhance their stability and safety in complex environments, and improve the human-machine-environment interaction loop. This paper provides a review of environmental perception and its related technologies of lower-limb exoskeleton robots. First, we briefly introduce the visual sensors and control system. Second, we analyze and summarize the key technologies of environmental perception, including related datasets, detection of critical terrains, and environment-oriented adaptive gait planning. Finally, we analyze the current factors limiting the development of exoskeleton environmental perception and propose future directions.

Graphical abstract: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c08a/11048416/6a2cd8c1c13a/biomimetics-09-00254-g001.jpg

Similar Articles

1
Review of adaptive control for stroke lower limb exoskeleton rehabilitation robot based on motion intention recognition.
Front Neurorobot. 2023 Jul 3;17:1186175. doi: 10.3389/fnbot.2023.1186175. eCollection 2023.
2
Lower Limb Exoskeleton Sensors: State-of-the-Art.
Sensors (Basel). 2022 Nov 23;22(23):9091. doi: 10.3390/s22239091.
3
Gait Recognition for Lower Limb Exoskeletons Based on Interactive Information Fusion.
Appl Bionics Biomech. 2022 Mar 26;2022:9933018. doi: 10.1155/2022/9933018. eCollection 2022.
4
Exoskeleton robots for lower limb assistance: A review of materials, actuation, and manufacturing methods.
Proc Inst Mech Eng H. 2021 Dec;235(12):1375-1385. doi: 10.1177/09544119211032010. Epub 2021 Jul 13.
5
Systematic review on wearable lower-limb exoskeletons for gait training in neuromuscular impairments.
J Neuroeng Rehabil. 2021 Feb 1;18(1):22. doi: 10.1186/s12984-021-00815-5.
6
A Real-Time Stability Control Method Through sEMG Interface for Lower Extremity Rehabilitation Exoskeletons.
Front Neurosci. 2021 Apr 13;15:645374. doi: 10.3389/fnins.2021.645374. eCollection 2021.
7
Wearable rehabilitation exoskeletons of the lower limb: analysis of versatility and adaptability.
Disabil Rehabil Assist Technol. 2023 May;18(4):392-406. doi: 10.1080/17483107.2020.1858976. Epub 2020 Dec 17.
8
A Wearable Lower Limb Exoskeleton: Reducing the Energy Cost of Human Movement.
Micromachines (Basel). 2022 Jun 6;13(6):900. doi: 10.3390/mi13060900.

