Institute of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications, Beijing 100876, China.
Key Laboratory of the Ministry of Education for Optoelectronic Measurement Technology and Instrument, Beijing Information Science and Technology University, Beijing 100192, China.
Sensors (Basel). 2018 Nov 15;18(11):3949. doi: 10.3390/s18113949.
An extended robot–world and hand–eye calibration method is proposed in this paper to estimate the transformation relationship between the camera and the robot device. This approach can be applied in mobile or medical robotics applications, where precise, expensive, or unsterilized calibration objects, or sufficient movement space, cannot be made available at the work site. Firstly, a mathematical model is established to formulate the robot-gripper-to-camera and robot-base-to-world rigid transformations using the Kronecker product. Subsequently, a sparse bundle adjustment is introduced to optimize the robot–world and hand–eye calibration together with the reconstruction results. Finally, a validation experiment on two kinds of real data sets is designed to demonstrate the effectiveness and accuracy of the proposed approach. The relative translation error of the rigid transformation is less than 8/10,000 for a Denso robot over a movement range of 1.3 m × 1.3 m × 1.2 m, and the mean distance-measurement error after three-dimensional reconstruction is 0.13 mm.
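The Kronecker-product formulation mentioned in the abstract refers to linearizing the robot–world/hand–eye equation A·X = Z·B, where A is the robot-base-to-gripper pose, B the world-to-camera pose, X the gripper-to-camera transform, and Z the base-to-world transform. The sketch below is an illustrative, generic closed-form solver on synthetic data (rotations via a Kronecker-product null-space problem, translations via least squares); it is not the paper's implementation, which additionally refines the result with sparse bundle adjustment.

```python
import numpy as np

def random_rotation(rng):
    # Random rotation from the QR decomposition of a Gaussian matrix.
    q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    return q if np.linalg.det(q) > 0 else -q  # det(-q) = +1 in 3-D

def project_to_so3(m):
    # Nearest rotation matrix (Frobenius norm) via SVD.
    u, _, vt = np.linalg.svd(m)
    r = u @ vt
    if np.linalg.det(r) < 0:
        r = u @ np.diag([1.0, 1.0, -1.0]) @ vt
    return r

def solve_ax_zb(As, Bs):
    """Solve A_i X = Z B_i for 4x4 homogeneous poses X and Z.

    Rotations: R_A R_X = R_Z R_B, i.e. with column-major vec(),
    (I (x) R_A) vec(R_X) - (R_B^T (x) I) vec(R_Z) = 0,
    stacked over all pose pairs and solved as an SVD null-space problem.
    Translations: R_Ai t_X - t_Z = R_Z t_Bi - t_Ai, solved by least squares.
    """
    I3 = np.eye(3)
    K = np.vstack([
        np.hstack([np.kron(I3, A[:3, :3]), -np.kron(B[:3, :3].T, I3)])
        for A, B in zip(As, Bs)
    ])
    _, _, vt = np.linalg.svd(K)
    v = vt[-1]                                   # null vector, up to scale
    MX = v[:9].reshape(3, 3, order="F")
    MZ = v[9:].reshape(3, 3, order="F")
    alpha = np.cbrt(np.linalg.det(MX))           # removes global scale/sign
    RX = project_to_so3(MX / alpha)
    RZ = project_to_so3(MZ / alpha)

    C = np.vstack([np.hstack([A[:3, :3], -I3]) for A in As])
    d = np.concatenate([RZ @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t, *_ = np.linalg.lstsq(C, d, rcond=None)

    X, Z = np.eye(4), np.eye(4)
    X[:3, :3], X[:3, 3] = RX, t[:3]
    Z[:3, :3], Z[:3, 3] = RZ, t[3:]
    return X, Z

# Synthetic check: generate ground-truth X, Z, random robot poses A_i,
# derive consistent B_i = Z^{-1} A_i X, then recover X and Z.
rng = np.random.default_rng(7)

def rand_pose(rng):
    T = np.eye(4)
    T[:3, :3] = random_rotation(rng)
    T[:3, 3] = rng.uniform(-1.0, 1.0, 3)
    return T

X_true, Z_true = rand_pose(rng), rand_pose(rng)
As = [rand_pose(rng) for _ in range(6)]
Bs = [np.linalg.inv(Z_true) @ A @ X_true for A in As]
X_est, Z_est = solve_ax_zb(As, Bs)
err_X = np.abs(X_est - X_true).max()
err_Z = np.abs(Z_est - Z_true).max()
print(err_X, err_Z)
```

With noise-free synthetic poses the recovery is exact to machine precision; on real data, a solution like this typically serves as the initial guess that a subsequent nonlinear refinement (such as the sparse bundle adjustment described in the abstract) improves.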