Gutierrez Kenneth, Santos Veronica J
IEEE Trans Haptics. 2020 Oct-Dec;13(4):831-839. doi: 10.1109/TOH.2020.2975555. Epub 2020 Dec 25.
Humans can perceive tactile directionality with angular perception thresholds of 14-40° via fingerpad skin displacement. Using deformable, artificial tactile sensors, the ability to perceive tactile directionality was developed for a robotic system to aid in object manipulation tasks. Two convolutional neural networks (CNNs) were trained on tactile images created from fingerpad deformation measurements during perturbations to a handheld object. A primary CNN regression model provided a point estimate of tactile directionality over a range of grip forces, perturbation angles, and perturbation speeds. A secondary CNN model provided a variance estimate that was used to quantify uncertainty about the point estimate. A 5-fold cross-validation was performed to evaluate model performance. The primary CNN produced tactile directionality point estimates with an error rate of 4.3% at a 20° angular resolution and was benchmarked against an open-source force estimation network. The model was run in real time during interactions with an external agent and the environment, using objects of different shapes and widths. The perception of tactile directionality could be used to enhance the situational awareness of human operators of telerobotic systems and to develop decision-making algorithms for context-appropriate responses by semi-autonomous robots.
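The abstract describes a two-network design: a primary CNN that regresses the directionality angle from a tactile image, and a secondary CNN whose variance output serves as an uncertainty estimate for that point estimate. The sketch below illustrates that structure only in outline; it is not the authors' implementation. The PyTorch framework, the 32x32 tactile-image size, the layer sizes, and names such as `TactileCNN`, `angle_net`, and `logvar_net` are assumptions for illustration.

```python
# Minimal sketch (assumed, not the paper's code): two small CNNs on tactile
# images, one regressing the perturbation angle (point estimate), one
# predicting a log-variance used as an uncertainty measure.
import torch
import torch.nn as nn

class TactileCNN(nn.Module):
    """Small CNN mapping a single-channel tactile image to one scalar output."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # 32x32 input -> two 2x2 poolings -> 32 channels of 8x8 feature maps
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 8 * 8, 64),
                                  nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x):
        return self.head(self.features(x)).squeeze(-1)

# Primary network: point estimate of tactile directionality (degrees).
angle_net = TactileCNN()
# Secondary network: log-variance of the point estimate (uncertainty).
logvar_net = TactileCNN()

# Example: a batch of 4 hypothetical 32x32 tactile images.
images = torch.randn(4, 1, 32, 32)
angle_pred = angle_net(images)               # predicted perturbation angles
sigma = torch.exp(0.5 * logvar_net(images))  # standard deviation as uncertainty
print(angle_pred, sigma)
```

In a setup like this, a large predicted sigma would flag tactile images whose directionality estimate should be treated with caution, which matches the paper's use of the secondary CNN to gauge uncertainty about the primary network's point estimate.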