Fuchs Stefan, Belardinelli Anna
Honda Research Institute Europe, Offenbach, Germany.
Front Neurorobot. 2021 Apr 16;15:647930. doi: 10.3389/fnbot.2021.647930. eCollection 2021.
Shared autonomy aims at combining robotic and human control in the execution of remote, teleoperated tasks. Such cooperative interaction requires the robot to first recognize the current human intention quickly and reliably, so that a suitable assisting plan can be instantiated and executed without delay. Eye movements have long been known to be highly predictive of the cognitive agenda unfolding during manual tasks and hence constitute the earliest and most reliable behavioral cues for intention estimation. In this study, we present an experiment aimed at analyzing human behavior in simple teleoperated pick-and-place tasks in a simulated scenario and at devising a suitable model for early estimation of the current proximal intention. We show that scan paths are, as expected, heavily shaped by the current intention and that two types of Gaussian Hidden Markov Models, one more scene-specific and one more action-specific, achieve very good prediction performance while also generalizing to new users and spatial arrangements. We finally discuss how the behavioral and model results suggest that eye movements reflect, to some extent, the invariance and generality of higher-level planning across object configurations, which can be leveraged by cooperative robotic systems.
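To make the modeling idea concrete, the following is a minimal, self-contained sketch of intention estimation with per-intention Gaussian HMMs: one small HMM is assumed per candidate target object, a partial gaze trace (normalized 2D fixation coordinates) is scored under each model with the log-domain forward algorithm, and the highest-likelihood model gives the estimated intention. All state counts, transition values, and Gaussian parameters below are hypothetical toy values for illustration, not the parameters learned in the study (which trained its models from data).

```python
import math

def logsumexp(xs):
    # Numerically stable log(sum(exp(x))) for the forward recursion.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def log_gauss(x, mean, var):
    # Log-density of a diagonal Gaussian emission at observation x.
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - mu) ** 2 / v)
               for xi, mu, v in zip(x, mean, var))

def forward_loglik(obs, pi, A, means, variances):
    # Log-domain forward algorithm: total log-likelihood of the
    # observation sequence under a Gaussian-emission HMM.
    n = len(pi)
    alpha = [math.log(pi[i]) + log_gauss(obs[0], means[i], variances[i])
             for i in range(n)]
    for x in obs[1:]:
        alpha = [logsumexp([alpha[j] + math.log(A[j][i]) for j in range(n)])
                 + log_gauss(x, means[i], variances[i])
                 for i in range(n)]
    return logsumexp(alpha)

# Hypothetical 2-state HMM per candidate target:
# state 0 ~ gaze near the manipulator's start region,
# state 1 ~ anticipatory fixation on the target object.
pi = [0.9, 0.1]
A = [[0.6, 0.4], [0.1, 0.9]]
variances = [[0.02, 0.02], [0.02, 0.02]]
models = {
    "left_object":  dict(pi=pi, A=A, variances=variances,
                         means=[[0.5, 0.1], [0.2, 0.8]]),
    "right_object": dict(pi=pi, A=A, variances=variances,
                         means=[[0.5, 0.1], [0.8, 0.8]]),
}

# A partial gaze trace drifting toward the left object.
trace = [[0.5, 0.1], [0.4, 0.4], [0.25, 0.75], [0.2, 0.8]]
scores = {name: forward_loglik(trace, **m) for name, m in models.items()}
estimated_intention = max(scores, key=scores.get)
```

Because the forward likelihood can be evaluated on a growing prefix of fixations, the same comparison supports the early (pre-completion) estimation the abstract describes: the winning model can be re-evaluated after every new fixation.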