
Image Representation and Learning With Graph-Laplacian Tucker Tensor Decomposition.

Publication Info

IEEE Trans Cybern. 2019 Apr;49(4):1417-1426. doi: 10.1109/TCYB.2018.2802934. Epub 2018 Feb 19.

DOI: 10.1109/TCYB.2018.2802934
PMID: 29994464
Abstract

Tucker tensor decomposition (TD) is widely used for image representation, reconstruction, and learning tasks. Compared to principal component analysis (PCA) models, tensor models retain more of the 2-D structure of images, whereas PCA models linearize images into vectors. However, traditional TD uses attribute information only and does not consider the pairwise similarity information between images. In this paper, we propose a graph-Laplacian Tucker tensor decomposition (GLTD) which exploits both attribute and pairwise similarity information simultaneously. GLTD has three main benefits: 1) GLTD reconstruction is clearly robust against image occlusions/outliers. Our analysis, via an out-of-sample GLTD model, shows that the Laplacian regularization is mainly responsible for this robustness. To the best of our knowledge, this Laplacian-regularization-induced robustness of TD has not been studied or emphasized before; 2) the GLTD representation exhibits more regularity, which improves both unsupervised and supervised learning results; and 3) an effective algorithm is derived to solve the GLTD problem. Although GLTD is a nonconvex problem, the proposed algorithm is shown experimentally to provide a stable/unique solution from different random initializations. Experimental results on image reconstruction, data clustering, and classification tasks demonstrate the benefits of GLTD.
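The abstract describes an objective that couples a standard Tucker reconstruction term with a graph-Laplacian smoothness penalty on the image-mode factor. The following is only a rough sketch of that idea, not the paper's method: the plain HOSVD initialization, the unnormalized Laplacian, the penalty weight `lam`, and all function names are assumptions introduced here for illustration.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` first, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(X, ranks):
    """Plain truncated Tucker decomposition via HOSVD (no Laplacian term)."""
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of each unfolding give the factor.
        U, _, _ = np.linalg.svd(unfold(X, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Project X onto the factors to obtain the core tensor.
    core = X
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core by every factor matrix along its mode."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)
    return T

def graph_laplacian(S):
    """Unnormalized Laplacian L = D - S from a symmetric similarity matrix S."""
    return np.diag(S.sum(axis=1)) - S

def gltd_objective(X, core, factors, L, lam):
    """Reconstruction error plus Laplacian smoothness on the image-mode factor.

    The trace term tr(U0^T L U0) is small when images that are similar
    (large S_ij) receive nearby rows in the image-mode factor U0.
    """
    U0 = factors[0]  # assumes mode 0 indexes the images
    err = np.linalg.norm(X - reconstruct(core, factors)) ** 2
    return err + lam * np.trace(U0.T @ L @ U0)
```

A GLTD-style solver would minimize `gltd_objective` over the core and factors jointly (e.g. by alternating updates); the sketch above only evaluates the objective at an HOSVD starting point to make the role of each term concrete.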


Similar Articles

1. Image Representation and Learning With Graph-Laplacian Tucker Tensor Decomposition.
   IEEE Trans Cybern. 2019 Apr;49(4):1417-1426. doi: 10.1109/TCYB.2018.2802934. Epub 2018 Feb 19.
2. A Generalized Graph Regularized Non-Negative Tucker Decomposition Framework for Tensor Data Representation.
   IEEE Trans Cybern. 2022 Jan;52(1):594-607. doi: 10.1109/TCYB.2020.2979344. Epub 2022 Jan 11.
3. PCA Based on Graph Laplacian Regularization and P-Norm for Gene Selection and Clustering.
   IEEE Trans Nanobioscience. 2017 Jun;16(4):257-265. doi: 10.1109/TNB.2017.2690365. Epub 2017 Mar 31.
4. Semi-supervised bilinear subspace learning.
   IEEE Trans Image Process. 2009 Jul;18(7):1671-6. doi: 10.1109/TIP.2009.2018015. Epub 2009 May 12.
5. L1-norm locally linear representation regularization multi-source adaptation learning.
   Neural Netw. 2015 Sep;69:80-98. doi: 10.1016/j.neunet.2015.01.009. Epub 2015 Feb 25.
6. MR-NTD: Manifold Regularization Nonnegative Tucker Decomposition for Tensor Data Dimension Reduction and Representation.
   IEEE Trans Neural Netw Learn Syst. 2017 Aug;28(8):1787-1800. doi: 10.1109/TNNLS.2016.2545400.
7. Robust Tensor Decomposition for Image Representation Based on Generalized Correntropy.
   IEEE Trans Image Process. 2021;30:150-162. doi: 10.1109/TIP.2020.3033151. Epub 2020 Nov 18.
8. Hypergraph regularized nonnegative triple decomposition for multiway data analysis.
   Sci Rep. 2024 Apr 20;14(1):9098. doi: 10.1038/s41598-024-59300-3.
9. Feature Extraction for Incomplete Data Via Low-Rank Tensor Decomposition With Feature Regularization.
   IEEE Trans Neural Netw Learn Syst. 2019 Jun;30(6):1803-1817. doi: 10.1109/TNNLS.2018.2873655. Epub 2018 Oct 29.
10. Constrained Clustering With Dissimilarity Propagation-Guided Graph-Laplacian PCA.
    IEEE Trans Neural Netw Learn Syst. 2021 Sep;32(9):3985-3997. doi: 10.1109/TNNLS.2020.3016397. Epub 2021 Aug 31.