School of Electronic Science and Engineering, Nanjing University, Nanjing 210046, China.
Sensors (Basel). 2013 Aug 26;13(9):11362-84. doi: 10.3390/s130911362.
Non-contact human body measurement plays an important role in surveillance, physical healthcare, online business and virtual fitting. Current methods for measuring the human body without physical contact usually cannot handle clothed subjects, which limits their applicability in public environments. In this paper, we propose an effective solution that accurately measures the parameters of a human body undergoing large-scale motion from a Kinect sensor, even when the subject is wearing clothes. Because motion causes clothes to alternately hang loosely and press tightly against the body, we adopt a space-time analysis to mine information across posture variations. Using this information, we recover the underlying body regardless of the effect of clothing and measure the body parameters accurately. Experimental results show that our system estimates human body parameters more accurately than state-of-the-art methods.
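The abstract does not specify how the space-time analysis is carried out; the following is only a minimal illustrative sketch of the general idea it describes. It assumes that, as the subject moves, clothing presses tightly against the body in some frames and hangs loosely in others, so a robust lower quantile of per-frame girth estimates approximates the true body measurement. The function name, quantile value, and sample data are hypothetical and not taken from the paper.

```python
import numpy as np

def estimate_body_girth(per_frame_girths_cm, quantile=0.1):
    """Aggregate noisy per-frame girth measurements (cm) captured while the
    subject moves. The lower quantile discounts frames in which loose clothing
    inflated the measurement. The quantile value is a tuning assumption."""
    girths = np.asarray(per_frame_girths_cm, dtype=float)
    return float(np.quantile(girths, quantile))

# Example: waist girths over a motion sequence; frames with loose clothing read high.
waist_over_time = [92.1, 95.4, 90.3, 101.7, 89.8, 96.0, 90.5]
print(estimate_body_girth(waist_over_time))  # close to the tighter readings (~90 cm)
```

In practice such per-frame measurements would come from cross-sections of Kinect depth data registered across postures, and the aggregation would be only one step of a full clothed-body estimation pipeline.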