Côté Marc-Alexandre, Larochelle Hugo
Department of Computer Science, Université de Sherbrooke, Sherbrooke, QC J1K 2R1, Canada
Neural Comput. 2016 Jul;28(7):1265-88. doi: 10.1162/NECO_a_00848. Epub 2016 May 12.
We present a mathematical construction for the restricted Boltzmann machine (RBM) that does not require specifying the number of hidden units. In fact, the hidden layer size is adaptive and can grow during training. This is obtained by first extending the RBM to be sensitive to the ordering of its hidden units. Then, with a carefully chosen definition of the energy function, we show that the limit of infinitely many hidden units is well defined. As with the RBM, approximate maximum likelihood training can be performed, resulting in an algorithm that naturally and adaptively adds trained hidden units during learning. We empirically study the behavior of this infinite RBM, showing that its performance is competitive with that of the RBM, while not requiring the tuning of a hidden layer size.
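The growth mechanism described above can be illustrated with a minimal sketch. The snippet below uses the standard binary RBM energy E(v, h) = -b·v - c·h - hᵀWv and shows that appending a fresh hidden unit (one new row of W and one new bias) leaves the energy of existing configurations unchanged when the new unit is off; this is an illustrative assumption, not the paper's iRBM construction, which additionally makes the energy order-sensitive via a per-unit penalty so that the infinite limit is well defined.

```python
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 4, 2
W = rng.normal(0.0, 0.01, size=(n_hidden, n_visible))  # hidden-to-visible weights
b = np.zeros(n_visible)  # visible biases
c = np.zeros(n_hidden)   # hidden biases

def energy(v, h, W, b, c):
    """Standard binary RBM energy: E(v, h) = -b.v - c.h - h^T W v."""
    return -b @ v - c @ h - h @ W @ v

def grow_hidden(W, c, rng):
    """Append one freshly initialized hidden unit: a new row of W and a zero bias."""
    new_row = rng.normal(0.0, 0.01, size=(1, W.shape[1]))
    return np.vstack([W, new_row]), np.append(c, 0.0)

v = np.array([1.0, 0.0, 1.0, 1.0])
h = np.array([1.0, 0.0])
e_before = energy(v, h, W, b, c)

# Grow the hidden layer by one unit; with the new unit inactive (h_new = 0),
# the energy of the existing configuration is unchanged.
W, c = grow_hidden(W, c, rng)
h2 = np.append(h, 0.0)
e_after = energy(v, h2, W, b, c)
```

In the actual iRBM, a penalty term per active hidden unit ensures that configurations using only finitely many units dominate the partition function, which is what makes training with an effectively infinite hidden layer tractable.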