

Discrete Two-Step Cross-Modal Hashing through the Exploitation of Pairwise Relations.

Affiliations

College of Electrical Engineering and Automation, Shandong University of Science and Technology, Qingdao, China.

School of Software, Shandong University, Jinan, China.

Publication info

Comput Intell Neurosci. 2021 Sep 27;2021:4846043. doi: 10.1155/2021/4846043. eCollection 2021.

DOI: 10.1155/2021/4846043
PMID: 34616443
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8490049/
Abstract

The cross-modal hashing method can map heterogeneous multimodal data into a compact binary code that preserves semantic similarity, which can significantly enhance the convenience of cross-modal retrieval. However, the currently available supervised cross-modal hashing methods generally only factorize the label matrix and do not fully exploit the supervised information. Furthermore, these methods often only use one-directional mapping, which results in an unstable hash learning process. To address these problems, we propose a new supervised cross-modal hash learning method called Discrete Two-step Cross-modal Hashing (DTCH) through the exploitation of pairwise relations. Specifically, this method fully exploits the pairwise similarity relations contained in the supervision information: for the label matrix, the hash learning process is stabilized by combining matrix factorization and label regression; for the pairwise similarity matrix, a semirelaxed and semidiscrete strategy is adopted to potentially reduce the cumulative quantization errors while improving the retrieval efficiency and accuracy. The approach further combines an exploration of fine-grained features in the objective function with a novel out-of-sample extension strategy to enable the implicit preservation of consistency between the different modal distributions of samples and the pairwise similarity relations. The superiority of our method was verified through extensive experiments using two widely used datasets.
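The core idea the abstract describes — mapping each modality into a shared Hamming space so that cross-modal retrieval reduces to comparing compact binary codes — can be illustrated with a minimal sketch. This is not the authors' DTCH algorithm: DTCH learns its projections jointly from the label matrix and pairwise similarities, whereas the projections, dimensions, and data below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy features: 100 image vectors (512-d) paired with 100 text vectors (300-d).
X_img = rng.standard_normal((100, 512))
X_txt = rng.standard_normal((100, 300))

K = 32  # hash code length in bits

# Hypothetical per-modality projections; a supervised method such as DTCH
# would learn these from supervision, here they are simply random.
W_img = rng.standard_normal((512, K))
W_txt = rng.standard_normal((300, K))

def hash_codes(X, W):
    """Binarize a linear projection: sign(XW) mapped to {0, 1} codes."""
    return (X @ W > 0).astype(np.uint8)

B_img = hash_codes(X_img, W_img)   # (100, K) binary codes
B_txt = hash_codes(X_txt, W_txt)

def hamming_rank(query_code, database_codes):
    """Rank database items by Hamming distance to the query code."""
    dists = np.count_nonzero(database_codes != query_code, axis=1)
    return np.argsort(dists)

# Cross-modal retrieval: use a text query's code to rank the image database.
ranking = hamming_rank(B_txt[0], B_img)
```

Because the codes are binary, each distance computation is a bitwise XOR plus a popcount, which is what makes hashing-based retrieval efficient at the scale the paper targets.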


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4bac/8490049/19d0dbea36a3/CIN2021-4846043.003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4bac/8490049/c9a71a1e8d45/CIN2021-4846043.001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4bac/8490049/3d88247957e6/CIN2021-4846043.002.jpg

Similar articles

1. Asymmetric Supervised Consistent and Specific Hashing for Cross-Modal Retrieval.
IEEE Trans Image Process. 2021;30:986-1000. doi: 10.1109/TIP.2020.3038365. Epub 2020 Dec 9.
2. Semantic embedding based online cross-modal hashing method.
Sci Rep. 2024 Jan 6;14(1):736. doi: 10.1038/s41598-023-50242-w.
3. Fast discrete cross-modal hashing with semantic consistency.
Neural Netw. 2020 May;125:142-152. doi: 10.1016/j.neunet.2020.01.035. Epub 2020 Feb 11.
4. Label Consistent Matrix Factorization Hashing for Large-Scale Cross-Modal Similarity Search.
IEEE Trans Pattern Anal Mach Intell. 2019 Oct;41(10):2466-2479. doi: 10.1109/TPAMI.2018.2861000. Epub 2018 Jul 30.
5. Efficient Semi-Supervised Multimodal Hashing With Importance Differentiation Regression.
IEEE Trans Image Process. 2022;31:5881-5892. doi: 10.1109/TIP.2022.3203216. Epub 2022 Sep 13.
6. Random Online Hashing for Cross-Modal Retrieval.
IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):677-691. doi: 10.1109/TNNLS.2023.3330975. Epub 2025 Jan 7.
7. Fast Cross-Modal Hashing With Global and Local Similarity Embedding.
IEEE Trans Cybern. 2022 Oct;52(10):10064-10077. doi: 10.1109/TCYB.2021.3059886. Epub 2022 Sep 19.
8. Supervised Matrix Factorization Hashing for Cross-Modal Retrieval.
IEEE Trans Image Process. 2016 Jul;25(7):3157-3166. doi: 10.1109/TIP.2016.2564638. Epub 2016 May 6.
9. Three-Stage Semisupervised Cross-Modal Hashing With Pairwise Relations Exploitation.
IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):260-273. doi: 10.1109/TNNLS.2023.3263221. Epub 2025 Jan 7.
