
Uniformity Attentive Learning-Based Siamese Network for Person Re-Identification.

Affiliation

Department of Image, Graduate School of Advanced Imaging Science, Multimedia and Film, Chung-Ang University, Seoul 06974, Korea.

Publication information

Sensors (Basel). 2020 Jun 26;20(12):3603. doi: 10.3390/s20123603.

Abstract

Person re-identification (Re-ID) suffers from problems that make learning difficult, such as misalignment and occlusion. To address these problems, it is important to focus on features that are robust to intra-class variation. Existing attention-based Re-ID methods focus only on common features without considering distinctive features. In this paper, we present a novel attentive learning-based Siamese network for person Re-ID. Unlike existing methods, we design an attention module and an attention loss that exploit the properties of the Siamese network to concentrate attention on both common and distinctive features. The attention module consists of channel attention, which selects important channels, and encoder-decoder attention, which observes the whole body shape. We modify the triplet loss into an attention loss, called the uniformity loss, which generates a unique attention map that focuses on both common and discriminative features. Extensive experiments show that the proposed network compares favorably with state-of-the-art methods on three large-scale benchmarks: Market-1501, CUHK03, and DukeMTMC-ReID.
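To make the described architecture concrete, below is a minimal PyTorch-style sketch of the two attention components and a triplet-style uniformity loss computed on attention maps. This is not the authors' code: the SE-style channel weighting, the one-level encoder-decoder for spatial attention, and the exact loss formulation are assumptions made for illustration only; the paper's implementation may differ.

```python
# Hypothetical sketch (not the authors' implementation): channel attention,
# encoder-decoder spatial attention, and a triplet-style "uniformity" loss
# applied to attention maps from a Siamese network. Module designs and the
# loss formulation are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel weighting (assumed design)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                       # x: (B, C, H, W)
        w = self.fc(x.mean(dim=(2, 3)))         # global average pool -> (B, C)
        return x * w.unsqueeze(-1).unsqueeze(-1)


class EncoderDecoderAttention(nn.Module):
    """Downsample-upsample spatial attention over the whole body (assumed design)."""
    def __init__(self, channels):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, channels // 2, 3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(channels // 2, 1, 3, stride=2,
                               padding=1, output_padding=1),
            nn.Sigmoid(),
        )

    def forward(self, x):                       # returns (B, 1, H, W) attention map
        return self.decoder(self.encoder(x))


def uniformity_loss(att_anchor, att_pos, att_neg, margin=0.3):
    """Triplet-style loss on flattened attention maps: maps of the same identity
    should agree (common features), maps of different identities should diverge
    (distinctive features). The paper's exact formulation may differ."""
    a, p, n = att_anchor.flatten(1), att_pos.flatten(1), att_neg.flatten(1)
    d_ap = F.pairwise_distance(a, p)
    d_an = F.pairwise_distance(a, n)
    return F.relu(d_ap - d_an + margin).mean()


# Usage inside one Siamese branch (weights shared across branches):
#   feat = channel_att(backbone_features)
#   att  = spatial_att(feat)                  # (B, 1, H, W)
#   feat = feat * att                         # attended features for the Re-ID head
# uniformity_loss(att_a, att_p, att_n) would then be added to the usual Re-ID losses.
```

Because the Siamese branches share weights, the attention maps of the anchor, positive, and negative samples are directly comparable, which is what lets a triplet-style constraint push same-identity maps together and different-identity maps apart.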


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07c3/7349100/a819df758fa4/sensors-20-03603-g001.jpg
