Yoshua Bengio, Martin Monperrus, Hugo Larochelle
Neural Comput. 2006 Oct;18(10):2509-28. doi: 10.1162/neco.2006.18.10.2509.
We claim and present arguments to the effect that a large class of manifold learning algorithms that are essentially local and can be framed as kernel learning algorithms will suffer from the curse of dimensionality, at the dimension of the true underlying manifold. This observation invites an exploration of nonlocal manifold learning algorithms that attempt to discover shared structure in the tangent planes at different positions. A training criterion for such an algorithm is proposed, and experiments estimating a tangent plane prediction function are presented, showing its advantages with respect to local manifold learning algorithms: it is able to generalize very far from training data (on learning handwritten character image rotations), where local nonparametric methods fail.
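The tangent plane prediction function described above can be sketched with a small numerical experiment. This is a simplified illustration, not the paper's model: it fits a single linear predictor, shared across all points, that maps a point on a synthetic circle to a basis for its tangent plane, trained with a projection-based criterion (the fraction of each neighbor difference lying outside the predicted plane). All names, the linear parameterization, and the numerical-gradient trainer are our assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a 1-D manifold (unit circle) embedded in 2-D.
t = rng.uniform(0.0, 2.0 * np.pi, 80)
X = np.c_[np.cos(t), np.sin(t)]
n, D, d = X.shape[0], X.shape[1], 1  # n points, ambient dim D, manifold dim d

# k nearest neighbors (brute force).
k = 4
dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
nbrs = np.argsort(dist, axis=1)[:, 1:k + 1]

def loss(params):
    """Average fraction of each neighbor difference outside the predicted plane."""
    W = params[:D * d * D].reshape(D * d, D)
    b = params[D * d * D:]
    total = 0.0
    for i in range(n):
        B = (W @ X[i] + b).reshape(D, d)   # predicted tangent basis at X[i]
        Q, _ = np.linalg.qr(B)             # orthonormalize the predicted basis
        for j in nbrs[i]:
            diff = X[j] - X[i]
            r = diff - Q @ (Q.T @ diff)    # out-of-plane residual
            total += (r @ r) / (diff @ diff)
    return total / (n * k)

# Tiny numerical-gradient descent; the parameter count is only D*d*D + D*d = 6.
# (A real implementation would use autodiff.)
params = rng.normal(scale=0.1, size=D * d * D + D * d)
init_loss = loss(params)
eps, lr = 1e-5, 0.3
for step in range(120):
    g = np.zeros_like(params)
    for p in range(params.size):
        e = np.zeros_like(params)
        e[p] = eps
        g[p] = (loss(params + e) - loss(params - e)) / (2 * eps)
    params -= lr * g

print("initial loss:", init_loss, "final loss:", loss(params))
```

Because the predictor is a single function of position rather than a per-neighborhood estimate, it captures structure shared across the manifold; for the circle, the true tangent field is itself linear in the coordinates, so the linear sketch can represent it exactly.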