

Towards Haptic-Based Dual-Arm Manipulation.

Affiliations

School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore 639798, Singapore.

Publication Information

Sensors (Basel). 2022 Dec 29;23(1):376. doi: 10.3390/s23010376.

Abstract

Vision is the primary sensing modality used by current robotic systems for manipulating objects. However, relying solely on vision for hand-object pose tracking faces challenges such as occlusions and objects moving out of view during manipulation. In this work, we show that object kinematics can be inferred from local haptic feedback at the robot-object contact points, combined with robot kinematics information, given an initial vision-based estimate of the object pose. A planar, dual-arm, teleoperated robotic setup was built to manipulate an object with hands shaped like circular discs. The robot hands were clad in rubber to allow rolling contact without slipping. During stable grasping by the dual-arm robot, under quasi-static conditions, the surfaces of the robot hand and object at the contact interface are defined by local geometric constraints. This allows one to define a relation between object orientation and robot hand orientation. With rolling contact, the displacements of the contact point on the object surface and on the hand surface must be equal and opposite. This information, coupled with robot kinematics, allows one to compute the displacement of the object from its initial location. The mathematical formulation of the geometric constraints between robot hand and object is detailed, followed by the methodology for acquiring experimental data to compute object kinematics. The sensors used in the experiments, along with calibration procedures, are presented before the object kinematics are computed from recorded haptic feedback. Results comparing object kinematics obtained purely from vision and from haptics are presented to validate our method, along with future directions for perception via haptic manipulation.
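The rolling-without-slipping constraint described above can be sketched for a single disc-shaped hand on a flat object face. This is a minimal illustration with hypothetical variable names, not the paper's full dual-arm formulation, which additionally couples both arms' kinematics with the vision-based initial pose estimate:

```python
def rolling_contact_update(hand_angle_prev, hand_angle_now, hand_radius,
                           contact_arc_obj_prev):
    """Planar rolling contact of a disc-shaped hand on a flat object face.

    Rolling without slipping means the arc length traversed by the
    contact point on the hand rim and the distance it travels along the
    object face are equal in magnitude and opposite in sign.
    """
    d_phi = hand_angle_now - hand_angle_prev   # hand rotation (rad)
    ds_hand = hand_radius * d_phi              # arc length rolled on the hand rim
    ds_obj = -ds_hand                          # equal and opposite on the object
    return contact_arc_obj_prev + ds_obj       # new contact coordinate on object

# Example: a 0.05 m disc rotates by 0.2 rad; the contact point shifts
# by -0.01 m along the object face.
new_s = rolling_contact_update(0.0, 0.2, 0.05, 0.0)
print(new_s)
```

Tracking this contact coordinate over time, together with the robot's forward kinematics giving the contact point's world position, is what allows the object's displacement from its initial (vision-estimated) pose to be recovered.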


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1240/9823935/8a0215370575/sensors-23-00376-g0A1.jpg
