IEEE Trans Cybern. 2022 Oct;52(10):10000-10013. doi: 10.1109/TCYB.2021.3062750. Epub 2022 Sep 19.
Thanks to large-scale labeled training data, deep neural networks (DNNs) have achieved remarkable success in many vision and multimedia tasks. However, because of domain shift, the knowledge learned by well-trained DNNs does not generalize well to new domains or datasets with few labels. Unsupervised domain adaptation (UDA) studies the problem of transferring models trained on one labeled source domain to another, unlabeled target domain. In this article, we focus on UDA in visual emotion analysis, covering both emotion distribution learning and dominant emotion classification. Specifically, we design a novel end-to-end cycle-consistent adversarial model, called CycleEmotionGAN++. First, we generate an adapted domain that aligns the source and target domains at the pixel level by improving CycleGAN with a multiscale structured cycle-consistency loss. During the image translation, we propose a dynamic emotional semantic consistency loss to preserve the emotion labels of the source images. Second, we train a transferable task classifier on the adapted domain with feature-level alignment between the adapted and target domains. We conduct extensive UDA experiments on the Flickr-LDL and Twitter-LDL datasets for distribution learning and on the ArtPhoto and Flickr and Instagram (FI) datasets for emotion classification. The results demonstrate the significant improvements yielded by the proposed CycleEmotionGAN++ over state-of-the-art UDA approaches.
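To make the two pixel-level objectives concrete, here is a minimal NumPy sketch of the kind of losses the abstract describes: an L1 cycle-consistency term (a source image should survive the source→target→source round trip) and an emotion-consistency term measured as a KL divergence between the emotion distributions predicted for a source image and for its adapted translation. This is an illustrative sketch only; the paper's actual losses are multiscale and dynamically weighted, and the function names here (`cycle_consistency_loss`, `emotional_semantic_consistency_loss`) are hypothetical, not the authors' implementation.

```python
import numpy as np

def cycle_consistency_loss(x, x_reconstructed):
    # L1 penalty between an original source image and its reconstruction
    # after translating source -> target -> source (a single-scale stand-in
    # for the paper's multiscale structured cycle-consistency loss).
    return np.mean(np.abs(x - x_reconstructed))

def emotional_semantic_consistency_loss(p_source, p_adapted, eps=1e-8):
    # KL divergence between the emotion distribution predicted for the
    # source image (p_source) and for its adapted version (p_adapted),
    # encouraging the translation to preserve the source emotion label.
    p = np.clip(p_source, eps, 1.0)
    q = np.clip(p_adapted, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

# Toy usage: a batch of images and two 8-way emotion distributions.
x = np.random.rand(4, 64, 64, 3)            # pretend source images
x_rec = x + 0.05 * np.random.rand(*x.shape)  # imperfect reconstruction
p_src = np.array([0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03])
p_adp = np.array([0.45, 0.25, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03])
total = cycle_consistency_loss(x, x_rec) \
        + emotional_semantic_consistency_loss(p_src, p_adp)
```

Both terms are zero exactly when the reconstruction is perfect and the two emotion distributions coincide, which is the behavior the consistency constraints are meant to enforce.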