Neural network learning algorithms for tracking minor subspace in high-dimensional data stream.

Authors

Feng Da-Zheng, Zheng Wei-Xing, Jia Ying

Affiliations

National Laboratory for Radar Signal Processing, Xidian University, 710071 Xi'an, PR China.

Publication

IEEE Trans Neural Netw. 2005 May;16(3):513-21. doi: 10.1109/TNN.2005.844854.

DOI: 10.1109/TNN.2005.844854
PMID: 15940982
Abstract

A novel random-gradient-based algorithm is developed for online tracking of the minor component (MC) associated with the smallest eigenvalue of the autocorrelation matrix of the input vector sequence. The five available learning algorithms for tracking one MC are extended to algorithms for tracking multiple MCs or the minor subspace (MS). To overcome the dynamical divergence of some available random-gradient-based algorithms, we propose a modification of the Oja-type algorithms, called OJAm, which works satisfactorily. The averaging differential equation and the energy function associated with OJAm are given. It is shown that the averaging differential equation globally asymptotically converges to an invariant set. The corresponding energy or Lyapunov function exhibits a unique global minimum, attained if and only if the state matrix spans the MS of the autocorrelation matrix of the vector data stream; all other stationary points are (unstable) saddle points. The global convergence of OJAm is also studied. OJAm provides efficient online learning for tracking the MS: it can track an orthonormal basis of the MS, whereas the other five available algorithms cannot track any orthonormal basis of the MS. The performances of the related algorithms are shown via computer simulations.
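The idea of the abstract can be made concrete with a toy sketch. The exact OJAm update is not reproduced here; instead, this illustrative stand-in applies an anti-Hebbian stochastic-gradient step followed by QR re-orthonormalization (a simple substitute for OJAm's self-stabilizing dynamics) to track an orthonormal basis of the minor subspace of a synthetic data stream. All dimensions, eigenvalues, and the learning rate below are assumed values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stream with a known autocorrelation matrix C = Q diag(eigvals) Q^T.
n, r, steps = 6, 2, 20000
eigvals = np.array([7.0, 5.0, 4.0, 3.0, 0.5, 0.2])  # two small eigenvalues span the minor subspace
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
C = Q @ np.diag(eigvals) @ Q.T
X = rng.multivariate_normal(np.zeros(n), C, size=steps)

W, _ = np.linalg.qr(rng.standard_normal((n, r)))    # orthonormal initial state matrix
eta = 1e-3
for x in X:
    y = W.T @ x                                     # r-dimensional output of the linear network
    W = W - eta * np.outer(x, y)                    # anti-Hebbian step: (I - eta x x^T) W
    W, _ = np.linalg.qr(W)                          # re-orthonormalize to keep the iteration
                                                    # bounded (stand-in for self-stabilization)

# Compare against the true minor subspace (eigenvectors of the r smallest eigenvalues).
true_minor = Q[:, -r:]
overlap = np.linalg.norm(true_minor.T @ W)          # approaches sqrt(r) as the subspaces align
```

In expectation each step multiplies W by (I - eta C), so subspace iteration with QR converges to the dominant subspace of (I - eta C), which is exactly the minor subspace of C; plain anti-Hebbian updates without the orthonormalization can diverge, which is the instability the paper's OJAm modification addresses.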

Similar Articles

1. Neural network learning algorithms for tracking minor subspace in high-dimensional data stream.
IEEE Trans Neural Netw. 2005 May;16(3):513-21. doi: 10.1109/TNN.2005.844854.
2. A unified self-stabilizing neural network algorithm for principal and minor components extraction.
IEEE Trans Neural Netw Learn Syst. 2012 Feb;23(2):185-98. doi: 10.1109/TNNLS.2011.2178564.
3. A self-stabilizing MSA algorithm in high-dimension data stream.
Neural Netw. 2010 Sep;23(7):865-71. doi: 10.1016/j.neunet.2010.04.001. Epub 2010 May 8.
4. A neural network learning for adaptively extracting cross-correlation features between two high-dimensional data streams.
IEEE Trans Neural Netw. 2004 Nov;15(6):1541-54. doi: 10.1109/TNN.2004.838523.
5. A weighted information criterion for multiple minor components and its adaptive extraction algorithms.
Neural Netw. 2017 May;89:1-10. doi: 10.1016/j.neunet.2017.02.006. Epub 2017 Feb 16.
6. Convergence analysis of a simple minor component analysis algorithm.
Neural Netw. 2007 Sep;20(7):842-50. doi: 10.1016/j.neunet.2007.07.001. Epub 2007 Jul 21.
7. Modulated Hebb-Oja learning rule--a method for principal subspace analysis.
IEEE Trans Neural Netw. 2006 Mar;17(2):345-56. doi: 10.1109/TNN.2005.863455.
8. Decision feedback recurrent neural equalization with fast convergence rate.
IEEE Trans Neural Netw. 2005 May;16(3):699-708. doi: 10.1109/TNN.2005.845142.
9. A globally convergent MC algorithm with an adaptive learning rate.
IEEE Trans Neural Netw Learn Syst. 2012 Feb;23(2):359-65. doi: 10.1109/TNNLS.2011.2179310.
10. A new adaptive backpropagation algorithm based on Lyapunov stability theory for neural networks.
IEEE Trans Neural Netw. 2006 Nov;17(6):1580-91. doi: 10.1109/TNN.2006.880360.