Vía Javier, Santamaría Ignacio, Pérez Jesús
Department of Communications Engineering, University of Cantabria, 39005 Santander, Cantabria, Spain.
Neural Netw. 2007 Jan;20(1):139-52. doi: 10.1016/j.neunet.2006.09.011. Epub 2006 Nov 17.
Canonical correlation analysis (CCA) is a classical tool in statistical analysis for finding the projections that maximize the correlation between two data sets. In this work, we propose a generalization of CCA to several data sets, which is shown to be equivalent to the classical maximum variance (MAXVAR) generalization proposed by Kettenring. The reformulation of this generalization as a set of coupled least squares regression problems is exploited to develop a neural structure for CCA. In particular, the proposed CCA model is a two-layer feedforward neural network with lateral connections in the output layer to achieve the simultaneous extraction of all the CCA eigenvectors through deflation. The CCA neural model is trained using a recursive least squares (RLS) algorithm. Finally, the convergence of the proposed learning rule is proved by means of stochastic approximation techniques, and its performance is analyzed through simulations.
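As a rough illustration of the MAXVAR generalization to several data sets (a batch sketch only, not the paper's neural network or RLS learning rule), the following Python snippet whitens each centered data set, stacks the whitened blocks, and reads the canonical vectors off the leading principal directions of the stacked data. The function name `maxvar_gcca`, the regularization parameter `reg`, and the toy data are illustrative assumptions, not part of the original work.

```python
import numpy as np

def maxvar_gcca(datasets, n_components=1, reg=1e-6):
    """MAXVAR-style generalized CCA sketch (batch, non-adaptive).

    Whitens each centered data set, stacks the whitened blocks, and takes
    the leading principal directions of the stacked data; the per-set
    canonical vectors are recovered block-wise.
    """
    whitened, whiteners = [], []
    for X in datasets:
        Xc = X - X.mean(axis=0)
        C = Xc.T @ Xc / len(Xc) + reg * np.eye(X.shape[1])
        # Inverse square root of the covariance (whitening transform)
        evals, evecs = np.linalg.eigh(C)
        W = evecs @ np.diag(evals ** -0.5) @ evecs.T
        whitened.append(Xc @ W)
        whiteners.append(W)

    Z = np.hstack(whitened)                      # stacked whitened data
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    V = Vt[:n_components].T                      # leading principal directions

    # Split the stacked directions back into per-set canonical vectors
    splits = np.cumsum([X.shape[1] for X in datasets])[:-1]
    blocks = np.split(V, splits, axis=0)
    return [W @ B for W, B in zip(whiteners, blocks)]

# Toy usage: three data sets sharing a common latent signal
rng = np.random.default_rng(0)
s = rng.standard_normal((500, 1))
Xs = [s @ rng.standard_normal((1, d)) + 0.1 * rng.standard_normal((500, d))
      for d in (4, 5, 6)]
ws = maxvar_gcca(Xs, n_components=1)
zs = [(X - X.mean(0)) @ w for X, w in zip(Xs, ws)]
print(np.corrcoef(np.hstack(zs).T))  # canonical variates should be highly correlated
```

With two data sets this reduces to classical CCA; the coupled least squares view in the paper replaces this batch eigendecomposition with recursive (RLS) updates suitable for online operation.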