Sun Huanbo, Martius Georg
Autonomous Learning Group, Max Planck Institute for Intelligent Systems, Tübingen, Germany.
Front Neurorobot. 2019 Jul 10;13:51. doi: 10.3389/fnbot.2019.00051. eCollection 2019.
Robust haptic sensing systems are essential for building dexterous robots. Solutions currently exist for small surface areas, such as fingers, but affordable and robust techniques for covering large areas of an arbitrary 3D surface are still missing. Here, we introduce a general machine learning framework that infers multi-contact haptic forces on the surface of a 3D robot limb from internal deformations measured by only a few physical sensors. The general idea of this framework is to first predict the whole surface deformation pattern from the sparsely placed sensors and then to infer the number, locations, and force magnitudes of the unknown contact points. Using a modified limb of the Poppy robot as an example, we show via transfer learning how this can be done even when training data are available only for single contact points. With only 10 strain-gauge sensors, we also obtain high accuracy for multiple contact points. The method can be applied to arbitrarily shaped surfaces and physical sensor types, as long as training data can be obtained.
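The two-stage idea from the abstract can be sketched in code. The snippet below is a minimal illustrative toy, not the authors' actual models: it assumes a 1D surface of 200 nodes, simulates single-contact Gaussian deformation bumps and 10 linear strain-gauge-like sensor readings (all synthetic stand-ins), fits a least-squares map from sensor readings back to the dense deformation field (stage 1), and then picks local peaks of the predicted field as contact locations and force estimates (stage 2). The function names (`press`, `infer_contacts`) and all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_nodes = 10, 200  # 10 sparse sensors, 200 surface nodes (assumed sizes)

def press(node, force):
    """Synthetic single-contact deformation: a Gaussian bump centered at `node`."""
    idx = np.arange(n_nodes)
    return force * np.exp(-0.5 * ((idx - node) / 3.0) ** 2)

# Training set of single-contact presses only (as in the paper, multi-contact
# cases are never seen during training).
n_train = 500
deform_train = np.stack([press(rng.integers(n_nodes), rng.uniform(1.0, 5.0))
                         for _ in range(n_train)])

# Stand-in sensor model: each sensor reads a random linear combination of the
# surface deformation field.
S = rng.normal(size=(n_nodes, n_sensors)) / np.sqrt(n_nodes)
sensors_train = deform_train @ S

# Stage 1: regress from sparse sensor readings to the full deformation pattern.
W, *_ = np.linalg.lstsq(sensors_train, deform_train, rcond=None)

# Stage 2: infer number, locations, and force magnitudes of contacts by
# simple peak picking on the predicted deformation field.
def infer_contacts(sensor_reading, threshold=0.5):
    field = sensor_reading @ W
    return [(i, field[i]) for i in range(1, n_nodes - 1)
            if field[i] > field[i - 1] and field[i] > field[i + 1]
            and field[i] > threshold]
```

Because both the toy sensor model and the stage-1 regressor are linear, the sensor reading of two simultaneous presses is the sum of the single-press readings, so a model trained only on single contacts can still respond sensibly to multi-contact inputs; the real paper achieves this generalization with learned nonlinear models and transfer learning.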