Jiaxing Nanhu University, Jiaxing, 314001, Zhejiang Province, China.
Hangzhou Dianzi University, Hangzhou, 310018, Zhejiang Province, China.
Sci Rep. 2022 Nov 1;12(1):18356. doi: 10.1038/s41598-022-21734-y.
Virtual fitting can provide a fast and convenient fitting experience. The two core problems of a virtual fitting system are human-computer interaction and clothing simulation, and interaction is an important factor in determining the fitting experience. Previous virtual fitting products usually rely on mouse and keyboard interaction, which gives users little sense of immersion or engagement, while the alternative of capturing user images with multiple cameras from different angles and then performing posture recognition suffers from low recognition accuracy. To address the clothing simulation and human-computer interaction problems of virtual fitting systems and achieve a more immersive customer experience, this paper implements a real-time interactive virtual fitting system based on the Microsoft Kinect motion-sensing device, and proposes a gesture determination algorithm based on finger recognition and an image transfer algorithm based on skeleton information matching. Using the OpenNI development library and multi-threading technology, we developed a motion-sensing capture module and a complete real-time virtual fitting system; system test results show that it offers a good user experience.
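The skeleton-information-matching idea can be illustrated with a small sketch. The C++ fragment below is not the authors' implementation and does not call the OpenNI API; the types and the function `fitClothingToTorso` are hypothetical. It only shows, under simple assumptions, how a clothing image could be scaled and anchored to the shoulder joints of a Kinect skeleton that has already been projected into image coordinates.

```cpp
#include <cstdio>
#include <cmath>

// Hypothetical 2D joint position, already projected into image coordinates.
struct Joint2D {
    float x;
    float y;
};

// Minimal subset of skeleton joints used for a torso overlay.
struct TorsoJoints {
    Joint2D leftShoulder;
    Joint2D rightShoulder;
    Joint2D torsoCenter;
};

// Placement of a clothing image on the user's video frame.
struct OverlayRect {
    float x;       // top-left corner, image coordinates
    float y;
    float width;
    float height;
};

// Scale and position a clothing image so its shoulder line matches the
// user's shoulders; the aspect ratio of the garment image is preserved.
OverlayRect fitClothingToTorso(const TorsoJoints& joints,
                               float clothingAspect,       // image height / width
                               float shoulderMarginRatio)  // extra width, e.g. 0.2
{
    float shoulderWidth = std::fabs(joints.rightShoulder.x - joints.leftShoulder.x);

    OverlayRect rect;
    rect.width  = shoulderWidth * (1.0f + shoulderMarginRatio);
    rect.height = rect.width * clothingAspect;
    // Center the garment horizontally on the torso and hang it from
    // slightly above the shoulder line.
    rect.x = joints.torsoCenter.x - rect.width * 0.5f;
    rect.y = (joints.leftShoulder.y + joints.rightShoulder.y) * 0.5f
             - rect.height * 0.15f;
    return rect;
}

int main() {
    // Example joint positions (pixels) for a single tracked frame.
    TorsoJoints joints = { {220.0f, 180.0f}, {420.0f, 185.0f}, {320.0f, 300.0f} };
    OverlayRect r = fitClothingToTorso(joints, 1.4f, 0.2f);
    std::printf("overlay at (%.1f, %.1f), size %.1f x %.1f\n",
                r.x, r.y, r.width, r.height);
    return 0;
}
```

In a real system this placement would be recomputed for every depth frame delivered by the Kinect pipeline, so the garment follows the user's movement in real time.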