Pedzisz Maciej, Mandic Danilo P
Imperial College London, Department of Electrical and Electronic Engineering, Communications and Signal Processing Group, London SW7 2AZ, U.K.
Neural Comput. 2008 Apr;20(4):1042-64. doi: 10.1162/neco.2008.12-06-418.
A homomorphic feedforward network (HFFN) for nonlinear adaptive filtering is introduced. It is realized as a two-layer feedforward architecture with an exponential hidden layer and a logarithmic preprocessing step. In this way, the overall input-output relationship can be interpreted as a generalized Volterra model, or as a bank of homomorphic filters. Gradient-based learning for this architecture is derived, together with a discussion of practical issues related to the choice of optimal learning parameters and weight initialization. The performance and convergence speed are verified by analysis and extensive simulations. For rigor, the simulations are conducted on both artificial and real-life data, and the performance is compared against that of a sigmoidal feedforward network (FFN) with identical topology. The proposed HFFN proved to be a viable alternative to FFNs, especially in the critical case of online learning on small- and medium-scale data sets.
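To make the architecture concrete, the following is a minimal sketch of the forward pass implied by the abstract: a logarithmic preprocessing step, an exponential hidden layer, and a linear output, so each hidden unit computes a monomial in the inputs and the output is a generalized Volterra model. The function and variable names (`hffn_forward`, `V`, `w`) are illustrative, not taken from the paper, and inputs are assumed positive so the logarithm is defined.

```python
import math

def hffn_forward(x, V, w):
    """Sketch of an HFFN forward pass (assumed structure, not the paper's code).

    Hidden unit k computes exp(sum_j V[k][j] * log(x[j])), which equals the
    monomial prod_j x[j] ** V[k][j]; the output is a weighted sum of these
    monomials, i.e. a generalized Volterra model with real-valued exponents.
    Inputs x must be positive for the logarithmic preprocessing step.
    """
    h = [math.exp(sum(v_kj * math.log(x_j) for v_kj, x_j in zip(v_k, x)))
         for v_k in V]
    return sum(w_k * h_k for w_k, h_k in zip(w, h))

# Example: with integer exponents the network reproduces a classical
# Volterra term, here y = 2 * x1 * x2**2 = 2 * 3 * 4 = 24.
y = hffn_forward([3.0, 2.0], [[1.0, 2.0]], [2.0])
```

With real-valued exponents in `V`, the hidden units are no longer restricted to integer-order Volterra kernels, which is what makes the model a *generalized* Volterra filter.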