

Asymmetric Supervised Consistent and Specific Hashing for Cross-Modal Retrieval.

Author Information

Meng Min, Wang Haitao, Yu Jun, Chen Hui, Wu Jigang

Publication Information

IEEE Trans Image Process. 2021;30:986-1000. doi: 10.1109/TIP.2020.3038365. Epub 2020 Dec 9.

DOI: 10.1109/TIP.2020.3038365
PMID: 33232233
Abstract

Hashing-based techniques have provided attractive solutions to cross-modal similarity search when addressing vast quantities of multimedia data. However, existing cross-modal hashing (CMH) methods face two critical limitations: 1) no previous work simultaneously exploits both the consistent and the modality-specific information of multi-modal data; 2) the discriminative capability of pairwise similarity is usually neglected due to its computational cost and storage overhead. Moreover, to tackle the discrete constraints, a relaxation-based strategy is typically adopted to relax the discrete problem into a continuous one, which suffers from large quantization errors and leads to sub-optimal solutions. To overcome these limitations, we present a novel supervised CMH method, namely Asymmetric Supervised Consistent and Specific Hashing (ASCSH). Specifically, we explicitly decompose the mapping matrices into consistent and modality-specific ones to fully exploit the intrinsic correlation between different modalities. Meanwhile, a novel discrete asymmetric framework is proposed to fully explore the supervised information, in which pairwise similarity and semantic labels are jointly formulated to guide the hash-code learning process. Unlike existing asymmetric methods, the proposed discrete asymmetric structure can solve the binary constraint problem discretely and efficiently without any relaxation. To validate the effectiveness of the proposed approach, extensive experiments are conducted on three widely used datasets, and the encouraging results demonstrate the superiority of ASCSH over other state-of-the-art CMH methods.
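As a rough illustration of the cross-modal hashing setting the abstract describes (not the authors' ASCSH algorithm): each modality is projected into a common k-bit binary code, and retrieval compares codes by Hamming distance, which is what makes the search cheap at scale. The dimensions, the random "learned" mappings `W_img`/`W_txt`, and all variable names below are assumptions for demonstration only; ASCSH would learn these mappings, further decomposing each into a consistent plus a modality-specific component and optimizing the codes discretely.

```python
import numpy as np

rng = np.random.default_rng(0)
d_img, d_txt, k = 512, 300, 32           # assumed feature dims and hash-code length
n = 100                                  # assumed database size

# Stand-in linear mappings; a real CMH method learns these from
# supervised information (pairwise similarity, semantic labels).
W_img = rng.standard_normal((d_img, k))
W_txt = rng.standard_normal((d_txt, k))

X_img = rng.standard_normal((n, d_img))  # image features (the database modality)
x_txt = rng.standard_normal(d_txt)       # one text query (the other modality)

# Binarize the projections into {-1, +1} hash codes.
B_img = np.sign(X_img @ W_img)
b_txt = np.sign(x_txt @ W_txt)

# For ±1 codes, Hamming distance = (k - inner product) / 2.
ham = (k - B_img @ b_txt) / 2
top5 = np.argsort(ham)[:5]               # database items nearest the query
print(top5)
```

The sign() binarization is exactly the discrete constraint the abstract discusses: relaxing it to continuous values during learning introduces quantization error, which is why ASCSH optimizes the binary codes directly.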


Similar Articles

1
Asymmetric Supervised Consistent and Specific Hashing for Cross-Modal Retrieval.
IEEE Trans Image Process. 2021;30:986-1000. doi: 10.1109/TIP.2020.3038365. Epub 2020 Dec 9.
2
Discrete Two-Step Cross-Modal Hashing through the Exploitation of Pairwise Relations.
Comput Intell Neurosci. 2021 Sep 27;2021:4846043. doi: 10.1155/2021/4846043. eCollection 2021.
3
Learning Discriminative Binary Codes for Large-scale Cross-modal Retrieval.
IEEE Trans Image Process. 2017 May;26(5):2494-2507. doi: 10.1109/TIP.2017.2676345. Epub 2017 Mar 1.
4
Asymmetric Supervised Fusion-Oriented Hashing for Cross-Modal Retrieval.
IEEE Trans Cybern. 2024 Feb;54(2):851-864. doi: 10.1109/TCYB.2023.3241018. Epub 2024 Jan 17.
5
Efficient Semi-Supervised Multimodal Hashing With Importance Differentiation Regression.
IEEE Trans Image Process. 2022;31:5881-5892. doi: 10.1109/TIP.2022.3203216. Epub 2022 Sep 13.
6
Fast discrete cross-modal hashing with semantic consistency.
Neural Netw. 2020 May;125:142-152. doi: 10.1016/j.neunet.2020.01.035. Epub 2020 Feb 11.
7
Joint Specifics and Consistency Hash Learning for Large-Scale Cross-Modal Retrieval.
IEEE Trans Image Process. 2022;31:5343-5358. doi: 10.1109/TIP.2022.3195059. Epub 2022 Aug 16.
8
Structure-aware contrastive hashing for unsupervised cross-modal retrieval.
Neural Netw. 2024 Jun;174:106211. doi: 10.1016/j.neunet.2024.106211. Epub 2024 Feb 27.
9
Label Consistent Matrix Factorization Hashing for Large-Scale Cross-Modal Similarity Search.
IEEE Trans Pattern Anal Mach Intell. 2019 Oct;41(10):2466-2479. doi: 10.1109/TPAMI.2018.2861000. Epub 2018 Jul 30.
10
Deep Semantic-Preserving Ordinal Hashing for Cross-Modal Similarity Search.
IEEE Trans Neural Netw Learn Syst. 2019 May;30(5):1429-1440. doi: 10.1109/TNNLS.2018.2869601. Epub 2018 Oct 1.

Cited By

1
Semantic embedding based online cross-modal hashing method.
Sci Rep. 2024 Jan 6;14(1):736. doi: 10.1038/s41598-023-50242-w.