Li Xin, Liu Jinkang, Huang Yijing, Wang Donghao, Miao Yang
School of Mechanical and Materials Engineering, North China University of Technology, Beijing 100144, China.
Faculty of Materials and Manufacturing, Beijing University of Technology, Beijing 100124, China.
Micromachines (Basel). 2022 Jul 29;13(8):1205. doi: 10.3390/mi13081205.
An exoskeleton is an intelligent wearable device that integrates bioelectronics and biomechanics. To assist the human body effectively, an exoskeleton must recognize human movement patterns in real time so that it can make the corresponding movements at the right moment. However, fully identifying human motion patterns remains very difficult for an exoskeleton, mainly because of incomplete acquisition of lower-limb motion information, poor feature-extraction ability, and complicated processing steps. To address these issues, this paper analyzes the motion mechanisms of the human lower limbs and introduces a set of wearable bioelectronic devices based on an electromyography (EMG) sensor and an inertial measurement unit (IMU), which together capture biological and kinematic information of the lower limb. A Dual Stream convolutional neural network (CNN)-ReliefF method is then presented to extract features from the fused sensor data; these features are fed into four different classifiers to obtain the recognition accuracy of human motion patterns. Compared with a single sensor (EMG or IMU), a single-stream CNN, or manually designed feature-extraction methods, the Dual Stream CNN-ReliefF feature extraction performs better in terms of both feature visualization and recognition accuracy. The method was applied to the EMG and IMU data of six subjects, and the motion-pattern recognition accuracy of each subject under the four classifiers exceeded 97%, with the highest average recognition accuracy reaching 99.12%. It can be concluded that the wearable bioelectronic device and the Dual Stream CNN-ReliefF feature-extraction method proposed in this paper enhance an exoskeleton's ability to capture human movement patterns, enabling it to provide optimal assistance at the appropriate time.
This work therefore offers a novel approach to improving the human-machine interaction of exoskeletons.
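The abstract's pipeline extracts deep features from the two sensor streams and then applies ReliefF to weight and select the most discriminative ones before classification. As an illustration of that selection step only, the following is a minimal ReliefF sketch in NumPy; it is not the authors' implementation, and the synthetic two-feature data at the bottom are hypothetical stand-ins for the CNN feature vectors.

```python
import numpy as np

def relieff(X, y, n_neighbors=3, seed=0):
    """Minimal ReliefF feature weighting (illustrative sketch).

    For each sample, features are rewarded when they differ from the
    nearest samples of *other* classes (misses) and penalized when they
    differ from the nearest samples of the *same* class (hits).
    Returns one weight per feature; higher = more discriminative.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Scale each feature to [0, 1] so per-feature diffs are comparable.
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0
    Xs = (X - X.min(axis=0)) / span

    classes, counts = np.unique(y, return_counts=True)
    prior = dict(zip(classes, counts / n))

    weights = np.zeros(d)
    order = rng.permutation(n)  # iterate over samples in random order
    for i in order:
        dist = np.abs(Xs - Xs[i]).sum(axis=1)  # Manhattan distance
        dist[i] = np.inf                       # exclude the sample itself
        for c in classes:
            # k nearest neighbors restricted to class c
            masked = np.where(y == c, dist, np.inf)
            nn = np.argsort(masked)[:n_neighbors]
            diff = np.abs(Xs[nn] - Xs[i]).mean(axis=0)
            if c == y[i]:
                weights -= diff / n            # nearest hits: penalize
            else:                              # nearest misses: reward,
                weights += prior[c] / (1 - prior[y[i]]) * diff / n

    return weights

# Hypothetical demo: feature 0 separates the two classes, feature 1 is noise.
rng = np.random.default_rng(42)
X = np.vstack([
    np.column_stack([rng.normal(0, 1, 60), rng.normal(0, 1, 60)]),
    np.column_stack([rng.normal(4, 1, 60), rng.normal(0, 1, 60)]),
])
y = np.array([0] * 60 + [1] * 60)
w = relieff(X, y)
# The discriminative feature receives the larger ReliefF weight.
```

In the paper's pipeline, the weights would rank the concatenated EMG-stream and IMU-stream CNN features, and the top-ranked subset would be passed to the four classifiers.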