Solah Michael, Huang Haikun, Sheng Jiachuan, Feng Tian, Pomplun Marc, Yu Lap-Fai
IEEE Trans Vis Comput Graph. 2022 May;28(5):2058-2068. doi: 10.1109/TVCG.2022.3150513. Epub 2022 Apr 11.
One of the challenging tasks in virtual scene design for Virtual Reality (VR) is making a scene evoke a particular mood in viewers. The subjective nature of moods makes this goal uncertain. We propose a novel approach that automatically adjusts the texture colors of objects in a virtual indoor scene so that the scene matches a target mood. A dataset of 25,000 images of building and home interiors was used to train a classifier on features extracted via deep learning. This classifier drives an optimization process that automatically colorizes virtual scenes according to the target mood. We tested our approach on four different indoor scenes and conducted a user study whose statistical analysis demonstrates the approach's efficacy, focusing on the impact of scenes experienced through a VR headset.
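The pipeline the abstract describes — a learned mood classifier scoring a scene, inside an optimization loop that adjusts per-object texture colors toward a target mood — can be sketched as follows. This is a minimal illustration only: `mood_score` is a hypothetical stand-in for the paper's deep-feature classifier, and the random-search loop is a simplified placeholder for the actual optimization method, which the abstract does not specify.

```python
import colorsys
import random

def mood_score(colors, target_hue=0.08):
    """Hypothetical stand-in for the learned mood classifier.

    Scores how close a set of object colors is to a target mood,
    here modeled crudely as proximity to a warm hue (~0.08) plus a
    mild reward for saturation. The real system instead extracts
    deep features from rendered scene images."""
    score = 0.0
    for r, g, b in colors:
        h, s, _v = colorsys.rgb_to_hsv(r, g, b)
        hue_dist = min(abs(h - target_hue), 1 - abs(h - target_hue))
        score += (1 - hue_dist) + 0.5 * s
    return score / len(colors)

def optimize_colors(colors, iterations=500, seed=0):
    """Simplified colorization loop: perturb one object's texture hue
    at a time and keep changes that raise the target-mood score."""
    rng = random.Random(seed)
    best = list(colors)
    best_score = mood_score(best)
    for _ in range(iterations):
        cand = list(best)
        i = rng.randrange(len(cand))
        h, s, v = colorsys.rgb_to_hsv(*cand[i])
        h = (h + rng.uniform(-0.1, 0.1)) % 1.0  # nudge the hue
        cand[i] = colorsys.hsv_to_rgb(h, s, v)
        cand_score = mood_score(cand)
        if cand_score > best_score:  # accept only improvements
            best, best_score = cand, cand_score
    return best, best_score

# Example: three object textures with arbitrary starting RGB colors.
scene = [(0.2, 0.4, 0.9), (0.1, 0.8, 0.3), (0.9, 0.9, 0.9)]
optimized, score = optimize_colors(scene)
```

Because the loop only accepts score-improving changes, the optimized palette is guaranteed to score at least as well as the input; the actual system would evaluate renders of the full 3D scene rather than bare color triples.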