Part-Based Deep Hashing for Large-Scale Person Re-Identification.

Publication Info

IEEE Trans Image Process. 2017 Oct;26(10):4806-4817. doi: 10.1109/TIP.2017.2695101. Epub 2017 Apr 18.

DOI: 10.1109/TIP.2017.2695101
PMID: 28436862
Abstract

Large-scale is a trend in person re-identification (re-id). It is important that real-time search be performed in a large gallery. While previous methods mostly focus on discriminative learning, this paper makes an attempt to integrate deep learning and hashing into one framework to evaluate the efficiency and accuracy of large-scale person re-id. We integrate spatial information for discriminative visual representation by partitioning the pedestrian image into horizontal parts. Specifically, Part-based Deep Hashing (PDH) is proposed, in which batches of triplet samples are employed as the input of the deep hashing architecture. Each triplet sample contains two pedestrian images (or parts) with the same identity and one pedestrian image (or part) of a different identity. A triplet loss function is employed with a constraint that the Hamming distance of pedestrian images (or parts) with the same identity is smaller than that of pedestrian images (or parts) with different identities. In the experiment, we show that the proposed PDH method yields very competitive re-id accuracy on the large-scale Market-1501 and Market-1501+500K datasets.
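The triplet constraint described above (same-identity Hamming distance smaller than different-identity Hamming distance) can be sketched as a hinge-style loss over binary codes. This is a minimal illustration, not the paper's implementation: the code values, the margin of 2, and the function names are hypothetical, and the actual PDH method learns the codes end-to-end with a deep network rather than on fixed bits.

```python
import numpy as np

def hamming(a, b):
    # Hamming distance: number of differing bits between two binary codes
    return int(np.sum(a != b))

def triplet_hash_loss(anchor, positive, negative, margin=2):
    # Hinge-style triplet loss over Hamming distances: penalize the triplet
    # when the same-identity distance is not smaller than the
    # different-identity distance by at least `margin`.
    return max(0, margin + hamming(anchor, positive) - hamming(anchor, negative))

# Toy 8-bit codes for three pedestrian parts (hypothetical values):
a = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # anchor
p = np.array([1, 0, 1, 0, 0, 0, 1, 0])  # same identity, distance 1
n = np.array([0, 1, 0, 1, 1, 0, 0, 1])  # different identity, distance 6
print(triplet_hash_loss(a, p, n))  # 0: constraint satisfied with margin
```

In training, the Hamming distance is typically replaced by a differentiable relaxation (e.g. Euclidean distance on tanh-activated outputs) so the loss can be backpropagated; the hard binary codes are recovered at retrieval time by thresholding.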


Similar Articles

1. Part-Based Deep Hashing for Large-Scale Person Re-Identification.
IEEE Trans Image Process. 2017 Oct;26(10):4806-4817. doi: 10.1109/TIP.2017.2695101. Epub 2017 Apr 18.
2. Large-Scale Person Re-Identification Based on Deep Hash Learning.
Entropy (Basel). 2019 Apr 30;21(5):449. doi: 10.3390/e21050449.
3. Fast Open-World Person Re-Identification.
IEEE Trans Image Process. 2018 May;27(5):2286-2300. doi: 10.1109/TIP.2017.2740564. Epub 2017 Aug 16.
4. Bit-Scalable Deep Hashing With Regularized Similarity Learning for Image Retrieval and Person Re-Identification.
IEEE Trans Image Process. 2015 Dec;24(12):4766-79. doi: 10.1109/TIP.2015.2467315. Epub 2015 Aug 11.
5. Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features.
IEEE Trans Image Process. 2018;27(1):106-120. doi: 10.1109/TIP.2017.2755766.
6. Euclidean-Distance-Preserved Feature Reduction for efficient person re-identification.
Neural Netw. 2024 Dec;180:106572. doi: 10.1016/j.neunet.2024.106572. Epub 2024 Aug 8.
7. Deep Discrete Supervised Hashing.
IEEE Trans Image Process. 2018 Dec;27(12):5996-6009. doi: 10.1109/TIP.2018.2864894. Epub 2018 Aug 10.
8. A General Framework for Linear Distance Preserving Hashing.
IEEE Trans Image Process. 2018 Feb;27(2):907-922. doi: 10.1109/TIP.2017.2751150. Epub 2017 Sep 11.
9. Self-Training With Progressive Representation Enhancement for Unsupervised Cross-Domain Person Re-Identification.
IEEE Trans Image Process. 2021;30:5287-5298. doi: 10.1109/TIP.2021.3082298. Epub 2021 Jun 2.
10. A Multi-Attention Approach for Person Re-Identification Using Deep Learning.
Sensors (Basel). 2023 Apr 2;23(7):3678. doi: 10.3390/s23073678.