

Improving the Incoherence of a Learned Dictionary via Rank Shrinkage.

Authors

Ubaru Shashanka, Seghouane Abd-Krim, Saad Yousef

Affiliations

Department of Computer Science and Engineering, University of Minnesota, Twin Cities, MN 55455, U.S.A.

Department of Electrical and Electronic Engineering, University of Melbourne, Melbourne, Victoria 3010, Australia

Publication

Neural Comput. 2017 Jan;29(1):263-285. doi: 10.1162/NECO_a_00907. Epub 2016 Oct 20.

Abstract

This letter considers the problem of learning, for sparse signal representation, dictionaries whose atoms have low mutual coherence. To learn such dictionaries, at each step we first update the dictionary using the method of optimal directions (MOD) and then apply a dictionary rank shrinkage step to decrease its mutual coherence. In the rank shrinkage step, we first compute a rank-1 decomposition of the column-normalized least squares estimate of the dictionary obtained from the MOD step. We then shrink the rank of this learned dictionary by recasting the rank reduction problem as a nonnegative garrotte estimation problem and solving it with a path-wise coordinate descent approach. We establish theoretical results showing that the rank shrinkage step reduces the coherence of the dictionary, and experimental results further validate this. Numerical experiments comparing the performance of the proposed algorithm with that of various other well-known dictionary learning algorithms are also presented.
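For reference, two standard quantities behind the abstract can be written in common notation (the notation here is ours, not reproduced from the letter): the mutual coherence of a dictionary D = [d_1, ..., d_K], and the MOD least-squares update for training signals Y with sparse coefficient matrix X.

```latex
\mu(D) = \max_{1 \le i < j \le K} \frac{|\langle d_i, d_j \rangle|}{\|d_i\|_2 \, \|d_j\|_2},
\qquad
D_{\mathrm{MOD}} = \operatorname*{arg\,min}_{D} \|Y - DX\|_F^2 = Y X^{\top} \left( X X^{\top} \right)^{-1}.
```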

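As a concrete illustration, below is a minimal sketch of one MOD-plus-rank-shrinkage iteration, based on our reading of the abstract rather than the authors' code. Two assumptions to flag: the nonnegative garrotte subproblem is solved here in closed form on the singular values (the orthogonality of the SVD factors decouples it), instead of the path-wise coordinate descent the letter uses, and the regularization parameter `lam` is a hypothetical knob.

```python
import numpy as np

def mutual_coherence(D):
    """Largest |<d_i, d_j>| over distinct unit-normalized atoms."""
    Dn = D / np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)
    return G.max()

def mod_rank_shrinkage_step(Y, X, lam=0.1):
    """One iteration: MOD least-squares update, then SVD-based rank shrinkage."""
    # MOD update: D = Y X^T (X X^T)^{-1}, computed as a least-squares solve.
    D = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T
    # Column-normalize the least-squares estimate.
    D = D / np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
    # Rank-1 decomposition: D = sum_i s_i * u_i v_i^T.
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    # Garrotte-style shrinkage factors c_i = max(0, 1 - lam / s_i^2);
    # small singular values are driven to zero, which shrinks the rank.
    c = np.maximum(0.0, 1.0 - lam / np.maximum(s ** 2, 1e-12))
    D = (U * (c * s)) @ Vt  # equals U @ diag(c * s) @ Vt
    # Renormalize atoms so coherence is measured on unit-norm columns.
    return D / np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
```

Calling `mutual_coherence` on the dictionary before and after the shrinkage step gives a quick empirical check of the letter's claim that the added step lowers coherence.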
