Hiroaki Sasaki, Voot Tangkaratt, Gang Niu, Masashi Sugiyama
Graduate School of Information Science, Nara Institute of Science and Technology, Nara 630-0192, Japan
Center for Advanced Intelligence Project, RIKEN, Tokyo 103-0027, Japan
Neural Comput. 2018 Feb;30(2):477-504. doi: 10.1162/neco_a_01035. Epub 2017 Nov 21.
Sufficient dimension reduction (SDR) aims to find a low-rank projection matrix of the input space such that information about the output data is maximally preserved. Among the various approaches to SDR, a promising one is based on the eigendecomposition of the outer product of the gradient of the conditional density of the output given the input. In this letter, we propose a novel estimator of the gradient of the logarithmic conditional density that directly fits a linear-in-parameter model to the true gradient under the squared loss. Thanks to this simple least-squares formulation, the solution can be computed efficiently in closed form. We then develop a new SDR method based on the proposed gradient estimator. We theoretically prove that the proposed gradient estimator, as well as the SDR solution obtained from it, achieves the optimal parametric convergence rate. Finally, we demonstrate experimentally that our SDR method compares favorably with existing approaches in both accuracy and computational efficiency on a variety of artificial and benchmark data sets.
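The abstract describes two technical steps: a closed-form least-squares estimator of the gradient of the logarithmic conditional density, and an SDR step that eigendecomposes the averaged outer product of the estimated gradients. Below is a minimal sketch of that pipeline, not the authors' implementation. It assumes Gaussian basis functions, fixed bandwidth and regularization hyperparameters (in practice these would be tuned, e.g., by cross-validation), and the decomposition grad_x log p(y|x) = grad_x log p(x,y) - grad_x log p(x), estimating each term separately, whereas the paper fits the conditional gradient directly. All function names here are illustrative.

```python
import numpy as np

def lsldg_grads(z, centers, sigma=1.0, lam=1e-3):
    """Closed-form least-squares estimate of grad_z log p(z) at the samples z.

    Fits g_j(z) = theta_j^T psi(z) to d/dz_j log p(z) under the squared loss.
    Integration by parts turns the cross term E[g_j * d_j log p] into
    -E[d_j g_j], so the empirical objective becomes
        theta_j^T G theta_j + 2 theta_j^T h_j + lam ||theta_j||^2,
    with G = mean[psi psi^T] and h_j = mean[d_j psi], minimized in closed
    form by theta_j = -(G + lam I)^{-1} h_j.
    """
    n, D = z.shape
    diff = z[:, None, :] - centers[None, :, :]                    # (n, b, D)
    psi = np.exp(-np.sum(diff ** 2, axis=2) / (2 * sigma ** 2))   # (n, b)
    A = psi.T @ psi / n + lam * np.eye(centers.shape[0])
    grads = np.empty_like(z, dtype=float)
    for j in range(D):
        dpsi_j = -(diff[:, :, j] / sigma ** 2) * psi              # d psi / d z_j
        theta_j = -np.linalg.solve(A, dpsi_j.mean(axis=0))
        grads[:, j] = psi @ theta_j
    return grads

def sdr_projection(x, y, d, sigma=1.0, lam=1e-3, n_centers=100, seed=0):
    """Estimate a d-dimensional SDR projection by eigendecomposing the
    averaged outer product of the estimated conditional-density gradients."""
    rng = np.random.default_rng(seed)
    n, dx = x.shape
    z = np.hstack([x, y.reshape(n, -1)])
    idx = rng.choice(n, size=min(n_centers, n), replace=False)
    # grad_x log p(y|x) = grad_x log p(x,y) - grad_x log p(x)
    # (assumed decomposition; the paper estimates the left-hand side directly)
    g = lsldg_grads(z, z[idx], sigma, lam)[:, :dx] - lsldg_grads(x, x[idx], sigma, lam)
    M = g.T @ g / n                                               # averaged outer product
    _, evecs = np.linalg.eigh(M)                                  # ascending eigenvalues
    return evecs[:, ::-1][:, :d]                                  # top-d eigenvectors

# Toy usage: y depends on x only through x[:, 0] + x[:, 1].
rng = np.random.default_rng(1)
x = rng.standard_normal((500, 5))
y = np.sin(x[:, 0] + x[:, 1]) + 0.1 * rng.standard_normal(500)
print(sdr_projection(x, y, d=1).ravel())  # should load mainly on coordinates 0 and 1
```

Because the objective is quadratic in the parameters, each coordinate's estimator reduces to a single regularized linear solve, which is the source of the computational efficiency claimed in the abstract.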