IEEE Trans Neural Netw Learn Syst. 2013 Apr;24(4):529-41. doi: 10.1109/TNNLS.2012.2235460.
This paper presents a fully complex-valued relaxation network (FCRN) with its projection-based learning algorithm. The FCRN is a single-hidden-layer network with a Gaussian-like sech activation function in the hidden layer and an exponential activation function in the output layer. For a given number of hidden neurons, the input weights are assigned randomly and the output weights are estimated by minimizing a nonlinear logarithmic function (called an energy function) that explicitly contains both the magnitude and phase errors. A projection-based learning algorithm determines the optimal output weights corresponding to the minimum of the energy function by converting the nonlinear programming problem into the problem of solving a set of simultaneous linear algebraic equations. The resulting FCRN approximates the desired output more accurately and with lower computational effort. The classification ability of the FCRN is evaluated using a set of real-valued benchmark classification problems from the University of California, Irvine machine learning repository, where a circular transformation is used to map the real-valued input features to the complex domain. The FCRN is then applied to three practical problems: quadrature amplitude modulation channel equalization, adaptive beamforming, and mammogram classification. The results reported in this paper clearly indicate the superior classification/approximation performance of the FCRN.
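To make the described pipeline concrete, below is a minimal NumPy sketch of an FCRN-style model. Every specific choice in it is an assumption for illustration, not the paper's implementation: the exact form of the circular transformation, the sech hidden response sech(Wx + b) with fixed random complex input weights, and the replacement of the paper's projection-based minimization of the magnitude/phase energy function by an ordinary complex least-squares solve against the logarithm of the targets (consistent with the exponential output activation, which reduces the output-weight estimation to a set of simultaneous linear equations).

```python
# Hypothetical FCRN-style sketch (assumed details, not the paper's code):
# random complex input weights, sech hidden activation, exponential output
# activation, and output weights from a complex linear least-squares solve
# standing in for the paper's projection-based learning.
import numpy as np

rng = np.random.default_rng(0)

def circular_transform(x):
    """Illustrative circular transformation: map real features scaled to
    [0, 1] onto the complex unit circle. The paper's exact mapping may differ."""
    return np.exp(1j * 2 * np.pi * x)

def sech(z):
    return 1.0 / np.cosh(z)

class FCRNSketch:
    def __init__(self, n_in, n_hidden):
        # Input weights and biases are assigned randomly and kept fixed.
        self.W_in = (rng.standard_normal((n_hidden, n_in)) +
                     1j * rng.standard_normal((n_hidden, n_in)))
        self.b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
        self.W_out = None

    def _hidden(self, X):
        # Gaussian-like fully complex sech response of the hidden layer.
        return sech(X @ self.W_in.T + self.b)

    def fit(self, X, T):
        # With an exponential output activation y = exp(H @ W_out), matching
        # nonzero complex targets T amounts to H @ W_out ≈ log(T), i.e. a set
        # of simultaneous linear equations in the output weights. The paper's
        # energy function also weights magnitude and phase errors explicitly;
        # this plain least-squares solve is only a stand-in.
        H = self._hidden(X)
        self.W_out, *_ = np.linalg.lstsq(H, np.log(T), rcond=None)
        return self

    def predict(self, X):
        return np.exp(self._hidden(X) @ self.W_out)

# Toy usage with synthetic data (4 real features mapped to the complex domain,
# unit-magnitude complex targets).
X = circular_transform(rng.random((200, 4)))
T = np.exp(1j * 2 * np.pi * rng.random((200, 1)))
model = FCRNSketch(n_in=4, n_hidden=20).fit(X, T)
Y = model.predict(X)
```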