Scalable Online Convolutional Sparse Coding.

Publication information

IEEE Trans Image Process. 2018 Oct;27(10):4850-4859. doi: 10.1109/TIP.2018.2842152.

Abstract

Convolutional sparse coding (CSC) improves sparse coding by learning a shift-invariant dictionary from the data. However, most existing CSC algorithms operate in batch mode and are computationally expensive. In this paper, we alleviate this problem by online learning. The key is a reformulation of the CSC objective so that convolution can be handled easily in the frequency domain and much smaller history matrices are needed. To solve the resultant optimization problem, we use the alternating direction method of multipliers (ADMM), whose subproblems have efficient closed-form solutions. Theoretical analysis shows that the learned dictionary converges to a stationary point of the optimization problem. Extensive experiments are performed on both standard CSC benchmark data sets and much larger data sets such as ImageNet. Results show that the proposed algorithm outperforms state-of-the-art batch and online CSC methods: it is more scalable, converges faster, and achieves better reconstruction performance.
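
The two computational ingredients the abstract points to can be illustrated in isolation: convolution with a dictionary filter becomes elementwise multiplication after a DFT, and the l1 code-update subproblem in ADMM is solved in closed form by soft-thresholding. The sketch below (plain NumPy, 1-D signals of an assumed length N = 64, a single assumed filter) only demonstrates these generic facts; it is not the paper's online algorithm or its history-matrix update.

```python
# Minimal NumPy sketch (an illustration, not the paper's online CSC algorithm).
# It checks two generic facts the abstract relies on:
#   (1) circular convolution becomes elementwise multiplication in the frequency domain;
#   (2) the l1 subproblem in ADMM has a closed-form soft-thresholding solution.
# N, the filter length, and the sparsity level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 64

# One filter, zero-padded to the signal length, and one sparse code.
d = np.zeros(N)
d[:7] = rng.standard_normal(7)
z = np.zeros(N)
z[rng.choice(N, size=5, replace=False)] = rng.standard_normal(5)

# (1) Convolution theorem: circular convolution d (*) z equals IDFT(DFT(d) * DFT(z)).
x_freq = np.real(np.fft.ifft(np.fft.fft(d) * np.fft.fft(z)))
x_direct = np.array([sum(d[n] * z[(k - n) % N] for n in range(N)) for k in range(N)])
assert np.allclose(x_freq, x_direct)

# (2) Soft-thresholding: the closed-form proximal operator of tau * ||.||_1,
# which is the kind of closed-form ADMM subproblem solution the abstract mentions.
def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

print(soft_threshold(np.array([-1.5, 0.2, 0.9]), 0.5))  # [-1.   0.   0.4]
```

The same identity holds for 2-D images via np.fft.fft2; this is the sense in which a frequency-domain reformulation makes the convolution part of the CSC objective easy to handle.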
