Moldovan Adrian, Caţaron Angel, Andonie Răzvan
Department of Electronics and Computers, Transilvania University, 500024 Braşov, Romania.
Corporate Technology, Siemens SRL, 500007 Braşov, Romania.
Entropy (Basel). 2020 Jan 16;22(1):102. doi: 10.3390/e22010102.
Current neural network architectures are increasingly hard to train because of the growing size and complexity of the datasets they use. Our objective is to design more efficient training algorithms by exploiting causal relationships inferred from neural networks. Transfer entropy (TE) was initially introduced as an information transfer measure used to quantify the statistical coherence between events (time series). It was later related to causality, even though the two are not the same. Only a few papers report applications of causality or TE in neural networks. Our contribution is an information-theoretical method for analyzing information transfer between the nodes of feedforward neural networks. The information transfer is measured by the TE of feedback neural connections. Intuitively, TE measures the relevance of a connection in the network, and the feedback amplifies this connection. We introduce a backpropagation-type training algorithm that uses TE feedback connections to improve its performance.
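For a source series Y and target series X, the standard lag-1 definition of TE is TE(Y -> X) = sum over (x_{t+1}, x_t, y_t) of p(x_{t+1}, x_t, y_t) * log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ], i.e. the extra predictive information Y provides about X beyond X's own past. As a minimal sketch of how this could be estimated for binarized node activations, here is a lag-1 plug-in (histogram) estimator in Python; the function name transfer_entropy and the binary, lag-1 setup are illustrative assumptions, not code from the paper.

    import numpy as np
    from collections import Counter

    def transfer_entropy(source, target):
        # Lag-1 plug-in estimate of TE(source -> target) in bits:
        # sum over (x_{t+1}, x_t, y_t) of
        #   p(x_{t+1}, x_t, y_t) * log2( p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) )
        triples = Counter(zip(target[1:], target[:-1], source[:-1]))
        pairs_ty = Counter(zip(target[:-1], source[:-1]))   # (x_t, y_t)
        pairs_tt = Counter(zip(target[1:], target[:-1]))    # (x_{t+1}, x_t)
        singles  = Counter(target[:-1])                     # x_t
        n = len(target) - 1
        te = 0.0
        for (x_next, x_prev, y_prev), count in triples.items():
            p_joint = count / n
            p_cond_both = count / pairs_ty[(x_prev, y_prev)]
            p_cond_self = pairs_tt[(x_next, x_prev)] / singles[x_prev]
            te += p_joint * np.log2(p_cond_both / p_cond_self)
        return te

    # A source that drives the target with lag 1 yields ~1 bit,
    # while the reverse direction yields ~0 bits:
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 10000)
    x = np.roll(y, 1)                 # x_t copies y_{t-1}
    print(transfer_entropy(y, x))     # close to 1.0
    print(transfer_entropy(x, y))     # close to 0.0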
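The abstract does not detail how the TE feedback connections enter the training algorithm. Purely as a hypothetical illustration of the idea that TE weights the relevance of a connection, and not as the authors' published method, one plausible scheme scales each weight's gradient by its connection's estimated TE:

    # Hypothetical sketch only; this per-connection scaling rule is an
    # assumption, not the algorithm from the paper. Connections with higher
    # measured information transfer receive proportionally larger updates.
    def te_scaled_step(weights, grads, te, lr=0.01):
        # weights, grads, te: arrays with one value per connection (same shape)
        return weights - lr * (1.0 + te) * grads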