IEEE Trans Neural Netw Learn Syst. 2012 Feb;23(2):359-65. doi: 10.1109/TNNLS.2011.2179310.
This brief deals with the problem of minor component analysis (MCA). Artificial neural networks can be exploited to perform MCA. Recent research shows that the convergence of neural-network-based MCA algorithms can be guaranteed if the learning rates are below certain thresholds. However, computing these thresholds requires knowledge of the eigenvalues of the autocorrelation matrix of the data set, which is unavailable when the minor component is extracted online from an input data stream. In this correspondence, we introduce an adaptive learning rate into the OJAn MCA algorithm so that its convergence condition does not depend on any unobtainable information and can be easily satisfied in practical applications.
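To make the setting concrete, below is a minimal sketch of online minor component extraction. It uses a generic Oja-type (anti-Hebbian) update with per-step renormalization and a simple decaying learning rate; the synthetic data, the decay schedule, and the exact update form are illustrative assumptions and are not the OJAn update or the adaptive learning rate proposed in this brief.

```python
import numpy as np

# Minimal sketch (not the brief's algorithm): online minor component
# extraction with a generic Oja-type anti-Hebbian update, per-step
# renormalization, and a simple decaying learning rate.

rng = np.random.default_rng(0)

# Synthetic zero-mean data stream whose autocorrelation matrix has a
# clearly separated smallest-eigenvalue direction (the minor component).
n, dim = 20000, 4
scales = np.array([3.0, 2.0, 1.5, 0.2])       # last axis is the "minor" direction
X = rng.standard_normal((n, dim)) * scales

w = rng.standard_normal(dim)
w /= np.linalg.norm(w)

for k, x in enumerate(X, start=1):
    eta = 1.0 / (100.0 + k)                   # decaying rate (assumed; not the paper's adaptive rule)
    y = w @ x
    # Anti-Hebbian step: stochastic gradient descent on the Rayleigh quotient,
    # driving w toward the eigenvector with the smallest eigenvalue.
    w -= eta * (y * x - (y ** 2) * w)
    w /= np.linalg.norm(w)                    # keep the weight vector on the unit sphere

# Check against the true minor eigenvector of the sample autocorrelation matrix.
R = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(R)
v_min = eigvecs[:, 0]
print("alignment |cos|:", abs(w @ v_min))     # close to 1 if extraction succeeded
```

With the renormalization step, this update behaves as a projected stochastic gradient descent on the Rayleigh quotient w^T R w / w^T w, so the weight vector tends toward the eigenvector associated with the smallest eigenvalue of the autocorrelation matrix, provided the learning rate is small enough; choosing such a rate without knowing the eigenvalues is exactly the difficulty the brief addresses.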