Dani Ashwin P, Salehi Iman, Rotithor Ghananeel, Trombetta Daniel, Ravichandar Harish
University of Connecticut, Storrs.
School of Interactive Computing, Georgia Institute of Technology.
IEEE Control Syst. 2020 Dec;40(6):29-56. Epub 2020 Nov 16.
This article provides a perspective on estimation and control problems in cyberphysical human systems (CPHSs), which operate at the intersection of cyberphysical systems and human systems, and discusses solutions to some of these problems. One example of a CPHS is close-proximity human-robot collaboration (HRC) in a manufacturing setting. In the HRC context, questions naturally arise about the efficiency of the joint operation and about human factors such as safety, attention, mental state, and comfort. By accounting for human factors, a robot's actions can be controlled to achieve objectives that include safe operation and human comfort. Conversely, questions arise when robot factors are considered: for example, can humans be given direct inputs and information about the environment and the robots in it so that the objectives of safety, efficiency, and comfort are satisfied given the robots' current capabilities? The article discusses specific problems in HRC related to controlling a robot's motion by incorporating the current actions of the human in the loop into the robot's control system. To this end, two main challenges are addressed: 1) inferring the intention behind human actions by analyzing a person's motion as observed through skeletal tracking and gaze data and 2) designing a controller that constrains the robot's motion to a boundary in 3D space using control barrier functions. The intention inference method fuses skeleton-joint tracking data obtained with the Microsoft Kinect sensor and human gaze data extracted from the Kinect's red-green-blue (RGB) images. The direction of the human's hand-reaching motion and the goal point of the reach are estimated during a joint pick-and-place task, and the hand's trajectory is forecast forward in time from the gaze and hand-motion data at the current instant. A barrier-function method then generates safe robot trajectories that account for the forecast hand movements so that the person and the robot can complete the collaborative displacement of an object. Finally, an adaptive controller tracks the reference trajectories on the Baxter robot, tested in a Gazebo simulation environment.
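As a rough illustration of the hand-trajectory forecasting step described in the abstract, the Python sketch below extrapolates recent Kinect hand positions along a heading fused from the observed hand motion and the current gaze direction. The constant-velocity model, the blending weight alpha, and the function name are assumptions made here for illustration; they are not the article's estimator.

import numpy as np

def forecast_hand_trajectory(hand_positions, gaze_dir, dt=1/30,
                             horizon=30, alpha=0.5):
    """Extrapolate future hand positions from recent skeleton-tracking samples.

    hand_positions : (N, 3) array of recent 3D hand positions (Kinect frame).
    gaze_dir       : unit 3-vector of the current gaze direction.
    alpha          : blending weight between motion heading and gaze heading
                     (hypothetical tuning parameter, not from the article).
    """
    p = np.asarray(hand_positions, dtype=float)
    v = (p[-1] - p[-2]) / dt                      # finite-difference velocity
    speed = np.linalg.norm(v)
    motion_dir = v / speed if speed > 1e-6 else np.asarray(gaze_dir, float)
    # Fuse the motion heading with the gaze heading, then renormalize.
    d = alpha * motion_dir + (1.0 - alpha) * np.asarray(gaze_dir, float)
    d /= np.linalg.norm(d)
    # Constant-speed rollout along the fused direction.
    steps = np.arange(1, horizon + 1)[:, None] * dt
    return p[-1] + speed * steps * d              # (horizon, 3) forecast

A call such as forecast_hand_trajectory(recent_hand_xyz, gaze_vec) would yield a short look-ahead window of predicted hand positions that a downstream safety filter can consume.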
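For the barrier-function step, a minimal control-barrier-function safety filter can be sketched as follows, assuming single-integrator end-effector dynamics and a spherical keep-out region around the predicted hand position. The margin d_safe, the gain gamma, and the closed-form solution of the one-constraint quadratic program are illustrative choices, not the article's exact formulation.

import numpy as np

def cbf_safety_filter(x_robot, u_nom, x_hand_pred, d_safe=0.15, gamma=2.0):
    """Minimally modify a nominal velocity command so the end effector keeps
    a safe distance from the predicted human hand position.

    Uses the barrier h(x) = ||x - x_hand||^2 - d_safe^2 >= 0 with
    single-integrator dynamics x_dot = u, solving the one-constraint CBF
    quadratic program min ||u - u_nom||^2 s.t. grad_h . u >= -gamma * h
    in closed form.
    """
    diff = x_robot - x_hand_pred
    h = diff @ diff - d_safe**2                   # barrier value
    grad_h = 2.0 * diff                           # gradient of h w.r.t. x
    residual = grad_h @ u_nom + gamma * h         # constraint slack at u_nom
    if residual >= 0.0:                           # nominal command already safe
        return u_nom
    # Project onto the constraint boundary (active-constraint QP solution).
    return u_nom - (residual / (grad_h @ grad_h)) * grad_h

The filter leaves the nominal command untouched whenever the barrier constraint is already satisfied and otherwise applies the smallest correction that restores it, which is the "minimally invasive" behavior that makes CBF filters attractive for HRC.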
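The final step, adaptive tracking of the reference trajectories, can be sketched with a standard gradient-based certainty-equivalence adaptive law. The linear-in-parameters uncertainty model, the gains, and the class interface below are assumptions for illustration and are not the specific controller used on the Baxter robot.

import numpy as np

class AdaptiveTracker:
    """Adaptive tracking for x_dot = u + W.T @ phi(x), with W an unknown
    parameter matrix (a generic scheme, not the article's controller)."""

    def __init__(self, n_params, dim=3, k=4.0, gamma=1.0):
        self.W_hat = np.zeros((n_params, dim))    # parameter estimate
        self.k, self.gamma = k, gamma             # feedback and adaptation gains

    def control(self, x, x_ref, x_ref_dot, phi):
        e = x - x_ref                             # tracking error
        # Feedforward + proportional feedback, minus estimated uncertainty.
        return x_ref_dot - self.k * e - self.W_hat.T @ phi(x)

    def adapt(self, x, x_ref, phi, dt):
        e = x - x_ref
        # Gradient adaptation law W_hat_dot = gamma * phi(x) e^T, which
        # cancels the cross term in the standard Lyapunov analysis.
        self.W_hat += self.gamma * np.outer(phi(x), e) * dt

In use, control() would be called each cycle with the CBF-filtered reference, and adapt() would update the parameter estimate at the same rate.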