Fredrik Rydén, Howard Jay Chizeck
IEEE Trans Haptics. 2013 Jul-Sep;6(3):257-67. doi: 10.1109/TOH.2013.20.
This paper presents a new haptic rendering method for streaming point cloud data. It provides haptic rendering of moving physical objects using data obtained from RGB-D cameras. Thus, real-time haptic interaction with moving objects can be achieved using noncontact sensors. This method extends "virtual coupling"-based proxy methods in a way that does not require preprocessing of points and allows for spatial point cloud discontinuities. The key ideas of the algorithm are iterative motion of the proxy with respect to the points, and the use of a variable proxy step size that results in better accuracy for short proxy movements and faster convergence for longer movements. This method provides highly accurate haptic interaction for geometries in which the proxy can physically fit. Another advantage is a significant reduction in the risk of "pop through" during haptic interaction with dynamic point clouds, even in the presence of noise. This haptic rendering method is computationally efficient; it can run in real time on available personal computers without the need for downsampling of point clouds from commercially available depth cameras.
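The full algorithm is not reproduced in this record, but the key idea described in the abstract, iteratively moving a proxy with a variable step size that is large far from the cloud (fast convergence) and shrunk near it (accuracy, no pop-through), can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; all function names, parameters, and the spherical-proxy clearance rule are assumptions for illustration.

```python
import math

def dist(a, b):
    # Euclidean distance between two 3-D points.
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def move_proxy(proxy, hip, points, proxy_radius=0.05,
               max_step=0.02, max_iters=200):
    """Iteratively step a spherical proxy toward the haptic interaction
    point (HIP). The step size is variable: capped at max_step far from
    the point cloud, and shrunk to the remaining clearance near it, so
    the proxy center never comes within proxy_radius of any point.
    (Hypothetical sketch; not the paper's actual algorithm.)"""
    proxy = list(proxy)
    for _ in range(max_iters):
        d_hip = dist(proxy, hip)
        if d_hip < 1e-9:
            break  # proxy has reached the HIP (free-space motion)
        # Largest step that provably cannot penetrate the cloud.
        clearance = min(dist(proxy, p) for p in points) - proxy_radius
        step = min(max_step, d_hip, max(clearance, 0.0))
        if step < 1e-9:
            break  # proxy is resting on the point cloud surface
        proxy = [pi + (hi - pi) / d_hip * step
                 for pi, hi in zip(proxy, hip)]
    return proxy

# Usage: a planar "wall" of points at x = 1; the proxy, driven toward a
# HIP behind the wall, stops one proxy radius in front of it.
wall = [(1.0, y / 10, z / 10) for y in range(-10, 11) for z in range(-10, 11)]
resting = move_proxy([0.0, 0.0, 0.0], [2.0, 0.0, 0.0], wall)
```

Because each step is bounded by the current clearance to the nearest point, the proxy cannot pass between samples it could not physically fit through, which is the mechanism the abstract credits for reducing the risk of pop-through with noisy, dynamic point clouds.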