

Sparse signal reconstruction via recurrent neural networks with hyperbolic tangent function.

Affiliations

Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, School of Electronic and Information Engineering, Southwest University, Chongqing 400715, China.

Department of Mathematics, Texas A&M University at Qatar, Doha 23874, Qatar.

Publication information

Neural Netw. 2022 Sep;153:1-12. doi: 10.1016/j.neunet.2022.05.022. Epub 2022 Jun 2.

Abstract

In this paper, several recurrent neural networks (RNNs) for solving the L1-minimization problem are proposed. First, a one-layer RNN based on the hyperbolic tangent function and the projection matrix is designed, and its stability and global convergence are proved by the Lyapunov method. Then, the sliding mode control technique is introduced into this RNN to design a finite-time RNN (FTRNN). Under the condition that the projection matrix satisfies the Restricted Isometry Property (RIP), a suitable Lyapunov function is constructed to prove that the FTRNN is stable in the Lyapunov sense and converges in finite time. Finally, the proposed RNN and FTRNN are compared with existing RNNs through experiments on sparse signal reconstruction and image reconstruction. The results demonstrate the effectiveness and superior performance of the proposed RNN and FTRNN.
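The abstract does not state the network dynamics explicitly. A common way to realize a one-layer "RNN" for L1-minimization is a gradient flow in which the non-smooth L1 penalty is smoothed by a hyperbolic-tangent term, integrated by Euler steps. The sketch below is an illustrative assumption along those lines, not the paper's actual model; the matrix `A`, signal sizes, and all parameter values (`lam`, `beta`, `dt`, `steps`) are invented for the demonstration:

```python
import numpy as np

# Toy compressed-sensing setup (illustrative, not from the paper):
# recover a 3-sparse x_true in R^50 from 20 random linear measurements.
rng = np.random.default_rng(0)
m, n = 20, 50
A = rng.standard_normal((m, n)) / np.sqrt(m)   # projection / measurement matrix
x_true = np.zeros(n)
support = [5, 17, 33]
x_true[support] = [1.0, -1.5, 2.0]
y = A @ x_true

def recover(A, y, lam=0.01, beta=100.0, dt=0.05, steps=20000):
    """Euler integration of a smoothed-L1 gradient flow:
        dx/dt = -A^T (A x - y) - lam * tanh(beta * x).
    The tanh term is the gradient of the smooth surrogate
    (lam / beta) * sum(log cosh(beta * x_i)) for lam * ||x||_1,
    so the flow descends a convex penalized least-squares objective."""
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x += dt * (-(A.T @ (A @ x - y)) - lam * np.tanh(beta * x))
    return x

x_hat = recover(A, y)
```

With a well-conditioned measurement matrix, this flow settles near the sparse solution (with a small bias controlled by `lam`). The paper's FTRNN additionally uses sliding-mode control to obtain finite-time rather than asymptotic convergence, which this plain gradient flow does not reproduce.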

