Zhao Zhou, Zheng Dongyuan, Chen Lu
School of Computer Science, Central China Normal University, Wuhan 430079, China.
Hubei Engineering Research Center for Intelligent Detection and Identification of Complex Parts, Wuhan 430079, China.
Sensors (Basel). 2024 Aug 5;24(15):5080. doi: 10.3390/s24155080.
Robots execute diverse load operations, including carrying, lifting, tilting, and moving objects, all of which involve load changes or transfers. These dynamic processes can shift an interactive operation from a stable to an unstable state. In this paper, we address such dynamic changes using tactile images captured by tactile sensors during interaction, study the dynamic stability and instability of these operations, and propose a real-time dynamic state sensing network that integrates convolutional neural networks (CNNs) for spatial feature extraction with long short-term memory (LSTM) networks for capturing temporal information. We collect a dataset covering the entire transition from a stable to an unstable state during interaction. Using a sliding window, we sample consecutive frames from this dataset and feed them into the network to predict the robot's state changes. The network achieves real-time temporal sequence prediction at 31.84 ms per inference step and an average classification accuracy of 98.90%. Our experiments demonstrate the network's robustness, maintaining high accuracy even on previously unseen objects.
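The sliding-window sampling described above can be sketched as follows. This is a minimal illustration only: the window length, stride, and the convention of labeling each window by its last frame's state are assumptions for demonstration, not parameters reported in the paper.

```python
from typing import List, Sequence, Tuple


def sliding_windows(
    frames: Sequence,
    labels: Sequence[int],
    window: int = 8,
    stride: int = 1,
) -> List[Tuple[list, int]]:
    """Sample windows of consecutive tactile frames for the network.

    Each window of `window` frames becomes one input sample; its label is
    taken from the state of the window's last frame (hypothetical
    convention), e.g. 0 = stable, 1 = unstable.
    """
    if window > len(frames):
        return []
    samples = []
    for start in range(0, len(frames) - window + 1, stride):
        end = start + window
        samples.append((list(frames[start:end]), labels[end - 1]))
    return samples
```

Each sampled window would then be passed through the CNN frame-by-frame for spatial features, with the resulting feature sequence fed to the LSTM for the temporal state-change prediction.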