Kong Xiangyu, Hu Changhua, Han Chongzhao
Xi'an Research Institute of High Technology, Xi'an, Shaanxi 710025, China.
IEEE Trans Neural Netw. 2010 Jan;21(1):175-81. doi: 10.1109/TNN.2009.2036725. Epub 2009 Dec 4.
Minor component analysis (MCA) deals with the recovery of the eigenvector associated with the smallest eigenvalue of the autocorrelation matrix of the input data, and it is an important tool for signal processing and data analysis. This brief analyzes the convergence and stability of a class of self-stabilizing MCA algorithms via a deterministic discrete-time (DDT) method. Sufficient conditions are obtained that guarantee the convergence of these learning algorithms. Simulations are carried out to further illustrate the theoretical results. It can be concluded that these self-stabilizing algorithms efficiently extract the minor component (MC) and outperform some existing MCA methods.
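To make the task concrete, the sketch below extracts a minor component by a simple projected-gradient descent on the Rayleigh quotient with per-step renormalization. This is an illustrative stand-in, not the specific self-stabilizing algorithms analyzed in the brief; the data model, dimensions, and step size are assumptions chosen for the example.

```python
import numpy as np

# Illustrative MCA sketch (NOT the algorithms from the brief):
# minimize w'Cw on the unit sphere by projected gradient descent,
# renormalizing each step so the weight norm stays stable.

rng = np.random.default_rng(0)

# Synthetic correlated input data and its sample autocorrelation matrix.
n, dim = 5000, 4
A = np.diag([1.0, 1.5, 2.0, 3.0])          # assumed mixing for the example
X = rng.normal(size=(n, dim)) @ A.T        # input samples, one per row
C = X.T @ X / n                            # autocorrelation estimate

# Minor-component iteration.
w = rng.normal(size=dim)
w /= np.linalg.norm(w)
eta = 0.1 / np.linalg.norm(C, 2)           # small step size relative to ||C||
for _ in range(2000):
    w -= eta * (C @ w - (w @ C @ w) * w)   # projected (tangent) gradient step
    w /= np.linalg.norm(w)                 # renormalize: keeps ||w|| = 1

# Check against the eigenvector of the smallest eigenvalue of C.
vals, vecs = np.linalg.eigh(C)             # eigenvalues in ascending order
v_min = vecs[:, 0]                         # true minor component of C
align = abs(w @ v_min)                     # |cosine| similarity, sign-invariant
print(round(float(align), 3))
```

The renormalization step plays the stabilizing role that the self-stabilizing algorithms build into the update itself: without it, plain gradient MCA rules are known to suffer norm divergence.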