Himavathi S, Anitha D, Muthuramalingam A
Electrical and Electronics Engineering Department, Pondicherry Engineering College, Pondicherry 605014, India.
IEEE Trans Neural Netw. 2007 May;18(3):880-8. doi: 10.1109/TNN.2007.891626.
This paper presents a hardware implementation of multilayer feedforward neural networks (NNs) using reconfigurable field-programmable gate arrays (FPGAs). Despite improvements in FPGA densities, the numerous multipliers in an NN limit the size of the network that can be implemented on a single FPGA, making NN applications commercially unviable. The proposed implementation aims to reduce the resource requirement without significantly compromising speed, so that a larger NN can be realized on a single chip at a lower cost. The sequential processing of the layers in an NN is exploited in this paper to implement large NNs using a method of layer multiplexing. Instead of realizing the complete network, only the single largest layer is implemented, and this one layer behaves as different layers under the direction of a control block. The control block ensures proper functioning by assigning the appropriate inputs, weights, biases, and excitation function of the layer currently being computed. Multilayer networks have been implemented using the Xilinx FPGA "XCV400hq240". The concept is shown to be very effective in reducing resource requirements at the cost of a moderate speed overhead. This implementation is proposed to make NN applications viable, in terms of cost and speed, for online applications. An NN-based flux estimator is implemented on the FPGA and the results obtained are presented.
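To illustrate the layer-multiplexing idea described in the abstract, the following is a minimal behavioural sketch in Python, not the authors' HDL design: a single shared layer of multiply-accumulate resources, sized for the largest logical layer, is reused for every layer of the network while a simple control routine selects the weights, biases, and activation of the layer currently being computed. The class name LayerMultiplexedNN, the 3-4-2 topology, and the sigmoid activations are illustrative assumptions only.

# Behavioural sketch of layer multiplexing (illustrative; the paper's
# actual design is an FPGA/HDL implementation, not software).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LayerMultiplexedNN:
    """Feedforward NN evaluated by reusing one 'physical' layer.

    Only enough multiply-accumulate resources for the single largest
    layer are assumed; the control logic (modelled here as a loop)
    swaps in the weights, biases, and activation of the layer that is
    currently being computed.
    """
    def __init__(self, weights, biases, activations):
        # weights[k]: (n_out, n_in) matrix of logical layer k
        self.weights = weights
        self.biases = biases
        self.activations = activations

    def forward(self, x):
        a = np.asarray(x, dtype=float)
        # Control block: one pass of the shared layer per logical layer,
        # each time with different parameters.
        for W, b, act in zip(self.weights, self.biases, self.activations):
            a = act(W @ a + b)
        return a

# Hypothetical 3-4-2 network: the shared layer is sized for 4 neurons,
# the larger of the two logical layers.
rng = np.random.default_rng(0)
net = LayerMultiplexedNN(
    weights=[rng.standard_normal((4, 3)), rng.standard_normal((2, 4))],
    biases=[np.zeros(4), np.zeros(2)],
    activations=[sigmoid, sigmoid],
)
print(net.forward([0.5, -0.2, 0.1]))

In the hardware design, the loop above corresponds to the control block sequencing the same physical layer through successive layer computations, which is what trades a moderate increase in computation time for a large reduction in multiplier count.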