

Ada-LISTA: Learned Solvers Adaptive to Varying Models.

Publication

IEEE Trans Pattern Anal Mach Intell. 2022 Dec;44(12):9222-9235. doi: 10.1109/TPAMI.2021.3125041. Epub 2022 Nov 7.

DOI: 10.1109/TPAMI.2021.3125041
PMID: 34735338
Abstract

Neural networks that are based on the unfolding of iterative solvers, such as LISTA (Learned Iterative Soft-Thresholding Algorithm), are widely used due to their accelerated performance. These networks, trained with a fixed dictionary, are inapplicable in varying-model scenarios, as opposed to their flexible non-learned counterparts. We introduce Ada-LISTA, an adaptive learned solver that receives as input both the signal and its corresponding dictionary, and learns a universal architecture to serve them all. This scheme allows solving sparse coding at a linear rate under varying models, including permutations and perturbations of the dictionary. We provide an extensive theoretical and numerical study, demonstrating the adaptation capabilities of our approach and its application to the task of natural image inpainting.

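The abstract contrasts a classic iterative solver (ISTA), which works for any dictionary but converges slowly, with an unrolled network that also takes the dictionary as input. The sketch below illustrates that idea in NumPy: a plain ISTA loop, and an Ada-LISTA-style forward pass in which learned matrices `W1`, `W2` wrap whatever dictionary `D` is supplied. The exact update form, weight placement, and parameter names are illustrative assumptions for this sketch, not the paper's precise architecture.

```python
import numpy as np

def soft_threshold(x, theta):
    # Proximal operator of the l1 norm (soft shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def ista(y, D, lam=0.1, n_iter=100):
    """Classic ISTA for sparse coding: min_x 0.5*||y - D x||^2 + lam*||x||_1.
    Works for any dictionary D, but converges only sublinearly."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the data-fit gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + D.T @ (y - D @ x) / L, lam / L)
    return x

def ada_lista_forward(y, D, W1, W2, thetas, gammas):
    """Ada-LISTA-style unrolled forward pass (simplified sketch):
    the network receives BOTH the signal y and the dictionary D, and
    wraps D with learned weight matrices W1, W2 shared across inputs.
    Per-layer thresholds `thetas` and step sizes `gammas` are learned."""
    x = np.zeros(D.shape[1])
    for theta, gamma in zip(thetas, gammas):
        residual = y - (D @ W2) @ x        # data-fidelity residual through the wrapped dictionary
        x = soft_threshold(x + gamma * (D @ W1).T @ residual, theta)
    return x
```

With `W1 = W2 = I` and constant per-layer parameters, the unrolled pass reduces to plain ISTA; in training, `W1`, `W2`, `thetas`, and `gammas` would instead be fit by backpropagation over many (signal, dictionary) pairs, which is what lets a single learned network serve permuted or perturbed dictionaries.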

Similar Articles

1. Ada-LISTA: Learned Solvers Adaptive to Varying Models.
   IEEE Trans Pattern Anal Mach Intell. 2022 Dec;44(12):9222-9235. doi: 10.1109/TPAMI.2021.3125041. Epub 2022 Nov 7.
2. Deep Residual Autoencoders for Expectation Maximization-Inspired Dictionary Learning.
   IEEE Trans Neural Netw Learn Syst. 2021 Jun;32(6):2415-2429. doi: 10.1109/TNNLS.2020.3005348. Epub 2021 Jun 2.
3. Mixing neural networks, continuation and symbolic computation to solve parametric systems of non linear equations.
   Neural Netw. 2024 Aug;176:106316. doi: 10.1016/j.neunet.2024.106316. Epub 2024 Apr 12.
4. Sparse deep dictionary learning identifies differences of time-varying functional connectivity in brain neuro-developmental study.
   Neural Netw. 2021 Mar;135:91-104. doi: 10.1016/j.neunet.2020.12.007. Epub 2020 Dec 23.
5. Improving the Incoherence of a Learned Dictionary via Rank Shrinkage.
   Neural Comput. 2017 Jan;29(1):263-285. doi: 10.1162/NECO_a_00907. Epub 2016 Oct 20.
6. Fast dictionary learning from incomplete data.
   EURASIP J Adv Signal Process. 2018;2018(1):12. doi: 10.1186/s13634-018-0533-0. Epub 2018 Feb 22.
7. Joint and Direct Optimization for Dictionary Learning in Convolutional Sparse Representation.
   IEEE Trans Neural Netw Learn Syst. 2020 Feb;31(2):559-573. doi: 10.1109/TNNLS.2019.2906074. Epub 2019 Apr 19.
8. Multimodal Task-Driven Dictionary Learning for Image Classification.
   IEEE Trans Image Process. 2016 Jan;25(1):24-38. doi: 10.1109/TIP.2015.2496275. Epub 2015 Oct 30.
9. An end-to-end-trainable iterative network architecture for accelerated radial multi-coil 2D cine MR image reconstruction.
   Med Phys. 2021 May;48(5):2412-2425. doi: 10.1002/mp.14809. Epub 2021 Apr 1.
10. Sparse coding with memristor networks.
   Nat Nanotechnol. 2017 Aug;12(8):784-789. doi: 10.1038/nnano.2017.83. Epub 2017 May 22.

Cited By

1. SALSA-Net: Explainable Deep Unrolling Networks for Compressed Sensing.
   Sensors (Basel). 2023 May 28;23(11):5142. doi: 10.3390/s23115142.