IEEE Trans Haptics. 2021 Jan-Mar;14(1):32-43. doi: 10.1109/TOH.2020.3004637. Epub 2021 Mar 24.
Encountered-type haptic rendering provides realistic, free-to-touch, and move-and-collide haptic sensations to a user. However, inducing haptic texture sensations without complicated tactile actuators remains challenging for encountered-type haptic rendering. In this article, we propose a novel texture synthesis method for an encountered-type haptic display that uses spatial and temporal encoding of roughness and provides both active and passive touch sensations without requiring complicated tactile actuation. Focusing on macro-scale roughness perception, we geometrically model the textured surface as a grid of hemiellipsoidal bumps, which can produce a range of perceived roughness levels as the user explores the surface with a bare hand. Our texture synthesis method rests on two hypotheses. First, we assume that perceptual roughness can be spatially encoded along the radial direction of a textured surface with hemiellipsoidal bumps. Second, perceptual roughness varies temporally with the relative velocity of the scanning hand with respect to the surface. To validate the hypotheses underlying our spatiotemporal encoding method, we implemented an encountered-type haptic texture rendering system using an off-the-shelf collaborative robot that also tracks the user's hand with IR sensors. We performed psychophysical user tests with 25 participants and verified the main effects of the spatiotemporal encoding of a textured model on the user's roughness perception. Our empirical experiments indicate that users perceive a rougher texture as the surface orientation or the relative hand speed increases. Based on these findings, we show that our visuo-haptic system can synthesize a level of roughness appropriate to diverse visual textures by suitably choosing the encoding values.
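As a rough illustration of the geometric texture model described in the abstract, the following Python sketch builds a height field for a regular grid of hemiellipsoidal bumps. The grid spacing, semi-axis values, and resolution used here are illustrative assumptions, not parameters reported by the authors.

```python
import numpy as np

def hemiellipsoid_bump_heightfield(width, depth, spacing, a, b, c, resolution=200):
    """Height field for a regular grid of hemiellipsoidal bumps with
    semi-axes (a, b, c), spaced `spacing` apart, over a width x depth patch.
    Hypothetical reconstruction of the kind of bump-grid surface model the
    abstract describes, not the authors' exact parameterization."""
    xs = np.linspace(0.0, width, resolution)
    ys = np.linspace(0.0, depth, resolution)
    X, Y = np.meshgrid(xs, ys)
    Z = np.zeros_like(X)
    # Place bump centers on a regular grid.
    for cx in np.arange(spacing / 2, width, spacing):
        for cy in np.arange(spacing / 2, depth, spacing):
            # Upper half of an ellipsoid: z = c * sqrt(1 - (dx/a)^2 - (dy/b)^2), clipped at 0.
            arg = 1.0 - ((X - cx) / a) ** 2 - ((Y - cy) / b) ** 2
            Z = np.maximum(Z, c * np.sqrt(np.clip(arg, 0.0, None)))
    return X, Y, Z

# Example: a 60 x 60 mm patch with bumps every 6 mm (illustrative values).
X, Y, Z = hemiellipsoid_bump_heightfield(60.0, 60.0, 6.0, a=2.5, b=1.5, c=1.0)
print(Z.shape, Z.max())
```

Elongating one semi-axis relative to the other changes the bump profile a finger encounters along a given scanning direction, which is one plausible way such a bump grid could spatially modulate perceived roughness.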