

Role of Synaptic Stochasticity in Training Low-Precision Neural Networks.

Affiliations

Bocconi Institute for Data Science and Analytics, Bocconi University, Milano 20136, Italy.

Italian Institute for Genomic Medicine, Torino 10126, Italy.

Publication Information

Phys Rev Lett. 2018 Jun 29;120(26):268103. doi: 10.1103/PhysRevLett.120.268103.

Abstract

Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension that allows training of discrete deep neural networks is also investigated.
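The sketch below illustrates the general idea of the gradient-descent procedure described in the abstract, under simplifying assumptions: each binary weight w_i in {-1, +1} is parametrized by a real value theta_i through its mean m_i = tanh(theta_i), plain gradient descent is run on the theta_i, and the binary solution is recovered by taking signs at the end. The loss function, margin, and hyperparameters are illustrative choices, not the authors' exact procedure.

```python
# Minimal illustrative sketch (assumptions noted above, not the paper's exact algorithm):
# train a perceptron with binary weights via gradient descent on real parameters theta
# that define the means m = tanh(theta) of a distribution over the binary synapses.
import numpy as np

rng = np.random.default_rng(0)

N, P = 201, 120                              # input size and number of patterns
X = rng.choice([-1.0, 1.0], size=(P, N))     # random binary patterns
y = rng.choice([-1.0, 1.0], size=P)          # random binary labels

theta = 0.1 * rng.standard_normal(N)         # real-valued parameters
lr = 0.05                                    # learning rate (illustrative)

def hinge_loss_and_grad(theta):
    m = np.tanh(theta)                       # mean of the binary weights
    margins = y * (X @ m) / np.sqrt(N)       # normalized stabilities
    viol = margins < 0.5                     # hinge loss with a small margin
    loss = np.maximum(0.0, 0.5 - margins).sum()
    # gradient w.r.t. theta, through m = tanh(theta)
    grad_m = -(y[viol, None] * X[viol]).sum(axis=0) / np.sqrt(N)
    return loss, grad_m * (1.0 - m ** 2)

for step in range(2000):
    loss, grad = hinge_loss_and_grad(theta)
    theta -= lr * grad

w_binary = np.sign(theta)                    # binarized solution
errors = int(np.sum(y * (X @ w_binary) <= 0))
print(f"training errors of the binarized perceptron: {errors}/{P}")
```

Running the sketch typically yields a binary weight vector that classifies most or all of the random patterns correctly at this loading; it is meant only to show how real-valued parameters over binary synapses can be optimized with ordinary gradient descent.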

