Shang Mingsheng, Yuan Ye, Luo Xin, Zhou MengChu
IEEE Trans Cybern. 2022 Aug;52(8):8006-8018. doi: 10.1109/TCYB.2020.3026425. Epub 2022 Jul 19.
To quantify user-item preferences, a recommender system (RS) commonly adopts a high-dimensional and sparse (HiDS) matrix. Such a matrix can be represented by a non-negative latent factor analysis model relying on a single latent factor (LF)-dependent, non-negative, and multiplicative update algorithm. However, an existing model's representation ability is limited by its specialized learning objective. To address this issue, this study proposes an α-β-divergence-generalized model that enjoys fast convergence. Its ideas are three-fold: 1) generalizing its learning objective with α-β-divergence to achieve highly accurate representation of HiDS data; 2) incorporating a generalized momentum method into parameter learning for fast convergence; and 3) implementing self-adaptation of controllable hyperparameters for excellent practicability. Empirical studies on six HiDS matrices from real RSs demonstrate that, compared with state-of-the-art LF models, the proposed one achieves significant gains in both accuracy and efficiency when estimating the massive missing data of an HiDS matrix.
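To make the three ideas concrete, below is a minimal sketch (not the authors' algorithm) of latent factor analysis on a sparse user-item matrix with an α-β-divergence loss, non-negative factors, and momentum-accelerated updates. All names and hyperparameter values (ALPHA, BETA, RANK, LR, MOMENTUM) are illustrative assumptions; the paper's multiplicative update rules, generalized momentum scheme, and hyperparameter self-adaptation are not reproduced here.

```python
# Illustrative sketch only: alpha-beta-divergence loss over the observed
# entries of an HiDS matrix, with non-negative latent factors trained by
# momentum-accelerated gradient steps and a non-negativity projection.
import numpy as np

ALPHA, BETA = 0.8, 0.6        # divergence shape parameters (alpha, beta != 0, alpha + beta != 0)
RANK, LR, MOMENTUM = 5, 1e-3, 0.9
EPS = 1e-12                   # keeps estimates and factors strictly positive

def ab_divergence(r, r_hat, a=ALPHA, b=BETA):
    """alpha-beta-divergence between an observed rating r and its estimate r_hat."""
    return -(r**a * r_hat**b
             - a / (a + b) * r**(a + b)
             - b / (a + b) * r_hat**(a + b)) / (a * b)

def grad_r_hat(r, r_hat, a=ALPHA, b=BETA):
    """Derivative of the divergence with respect to the estimate r_hat."""
    return (r_hat**(a + b - 1) - r**a * r_hat**(b - 1)) / a

def train(observed, n_users, n_items, epochs=50):
    """observed: list of (user, item, rating) triples from the sparse matrix."""
    rng = np.random.default_rng(0)
    P = rng.uniform(0.1, 1.0, (n_users, RANK))    # non-negative user factors
    Q = rng.uniform(0.1, 1.0, (n_items, RANK))    # non-negative item factors
    vP, vQ = np.zeros_like(P), np.zeros_like(Q)   # momentum buffers
    for _ in range(epochs):
        for u, i, r in observed:
            r_hat = max(P[u] @ Q[i], EPS)         # current estimate of the entry
            g = grad_r_hat(r, r_hat)              # scalar chain-rule factor
            vP[u] = MOMENTUM * vP[u] - LR * g * Q[i]
            vQ[i] = MOMENTUM * vQ[i] - LR * g * P[u]
            P[u] = np.maximum(P[u] + vP[u], EPS)  # projection keeps factors non-negative
            Q[i] = np.maximum(Q[i] + vQ[i], EPS)
    return P, Q
```

In this sketch the α-β pair controls the shape of the loss (recovering, e.g., KL- or Euclidean-like behavior at particular settings), which is the sense in which generalizing the learning objective changes how the model fits the observed entries.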