Luo Xudong, Wen Xiaohao, Zhou MengChu, Abusorrah Abdullah, Huang Lukui
IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4173-4183. doi: 10.1109/TNNLS.2021.3055991. Epub 2022 Aug 31.
This work proposes a decision tree (DT)-based method for initializing a dendritic neuron model (DNM). Neural networks keep growing larger and thus consume ever more computing resources, which creates a strong need to prune neurons that contribute little to a network's output. However, pruning low-contribution neurons may cost DNM accuracy. Our proposed method is novel because 1) it can reduce the number of dendrites in DNM while improving training efficiency without affecting accuracy and 2) it can select proper initial weights and thresholds for neurons. The Adam algorithm is used to train DNM after its initialization with our proposed DT-based method. To verify its effectiveness, we apply it to seven benchmark datasets. The results show that the decision-tree-initialized DNM is significantly better than the original DNM, k-nearest neighbor, support vector machine, back-propagation neural network, and DT classification methods. It exhibits the lowest model complexity and highest training speed without losing any accuracy. The interactions among attributes can also be observed in its dendritic neurons.
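The pipeline the abstract describes (derive split thresholds from a decision tree, use them to initialize the synaptic parameters of a dendritic neuron, then fine-tune with Adam) can be illustrated with a minimal sketch. This is not the paper's implementation: the dendritic neuron here is a single multiplicative dendrite of sigmoid synapses, the "decision tree" is reduced to a per-feature decision stump, gradients are taken numerically, and all dataset values, steepness `k`, and learning rates are assumed for illustration.

```python
import numpy as np

def stump_threshold(x, y):
    """Best single-feature split threshold by 0/1 error, a stand-in
    for a decision tree's split point (illustrative assumption)."""
    xs = np.sort(x)
    best_t, best_err = xs[0], np.inf
    for i in range(1, len(xs)):
        t = 0.5 * (xs[i - 1] + xs[i])
        pred = (x > t).astype(float)
        err = min(np.mean(pred != y), np.mean((1 - pred) != y))
        if err < best_err:
            best_err, best_t = err, t
    return best_t

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
# Toy 2-feature AND-like dataset (fabricated for the sketch).
X = rng.uniform(0, 1, size=(200, 2))
y = ((X[:, 0] > 0.6) & (X[:, 1] > 0.4)).astype(float)

# DT-inspired initialization: each synapse threshold starts at a stump split.
theta0 = np.array([stump_threshold(X[:, j], y) for j in range(2)])
k0 = np.full(2, 5.0)                   # assumed initial synapse steepness
params = np.concatenate([k0, theta0])  # trainable parameters [k, theta]

def forward(p):
    k, th = p[:2], p[2:]
    syn = sigmoid(k * (X - th))        # synaptic layer
    return np.prod(syn, axis=1)        # dendrite: multiplicative aggregation

def loss(p):
    out = np.clip(forward(p), 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(out) + (1 - y) * np.log(1 - out))

def num_grad(p, h=1e-5):
    g = np.zeros_like(p)
    for i in range(len(p)):
        d = np.zeros_like(p); d[i] = h
        g[i] = (loss(p + d) - loss(p - d)) / (2 * h)
    return g

# Plain Adam on binary cross-entropy, refining the DT-based initialization.
m = np.zeros_like(params); v = np.zeros_like(params)
beta1, beta2, lr, eps = 0.9, 0.999, 0.05, 1e-8
loss0 = loss(params)
for t in range(1, 301):
    g = num_grad(params)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    params -= lr * (m / (1 - beta1**t)) / (np.sqrt(v / (1 - beta2**t)) + eps)

print(f"BCE before training: {loss0:.3f}, after: {loss(params):.3f}")
```

Because the stump thresholds already sit near the class boundaries, Adam starts from a low loss and converges quickly, which is the intuition behind the paper's claim of higher training speed without accuracy loss.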