Wei Jiao, Tong Can, Wu Bingxue, He Qiang, Qi Shouliang, Yao Yudong, Teng Yueyang
IEEE Trans Neural Netw Learn Syst. 2023 Sep;34(9):5381-5391. doi: 10.1109/TNNLS.2022.3184286. Epub 2023 Sep 1.
Nonnegative matrix factorization (NMF) has been widely used to learn low-dimensional representations of data. However, NMF pays the same attention to every attribute of a data point, which inevitably leads to inaccurate representations. For example, in a human-face dataset, if an image shows a person wearing a hat, the hat should be removed, or the importance of its corresponding attributes should be decreased, during matrix factorization. This article proposes a new type of NMF called entropy-weighted NMF (EWNMF), which assigns an optimizable weight to each attribute of each data point to emphasize its importance. This is achieved by adding an entropy regularizer to the cost function and then solving the resulting problem with the Lagrange multiplier method. Experimental results on several datasets demonstrate the feasibility and effectiveness of the proposed method. The code developed in this study is available at https://github.com/Poisson-EM/Entropy-weighted-NMF.
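The abstract does not spell out the update rules, but its description (an optimizable weight for every attribute of every data point, an entropy regularizer, and a Lagrange-multiplier solution) suggests an objective of the following form. The sketch below is a plausible reading, not the authors' reference implementation (see the GitHub link above); the hyperparameter name `gamma`, the column-wise normalization of the weights, and the multiplicative updates are assumptions.

```python
import numpy as np

def ewnmf(X, rank, gamma=1.0, n_iter=200, eps=1e-10, seed=0):
    """Entropy-weighted NMF sketch: X (m x n) ~ U (m x rank) @ V (rank x n).

    Assumed objective (not verified against the paper):
        min_{U,V>=0, W}  sum_ij W_ij * (X - U @ V)_ij**2
                         + gamma * sum_ij W_ij * log(W_ij)
        s.t. each column of W sums to 1 (one weight per attribute
             of each data point).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, rank)) + eps
    V = rng.random((rank, n)) + eps
    W = np.full((m, n), 1.0 / m)  # start from uniform attribute weights

    for _ in range(n_iter):
        E = (X - U @ V) ** 2  # per-entry reconstruction error

        # Closed-form weight update from the Lagrangian of the entropy
        # term: a softmax over the attributes (rows) of each column,
        # shifted by the column minimum for numerical stability.
        W = np.exp(-(E - E.min(axis=0, keepdims=True)) / gamma)
        W /= W.sum(axis=0, keepdims=True)

        # Multiplicative updates for the weighted squared-error term,
        # analogous to the classic Lee-Seung rules with an entrywise
        # weight matrix W.
        U *= ((W * X) @ V.T) / ((W * (U @ V)) @ V.T + eps)
        V *= (U.T @ (W * X)) / (U.T @ (W * (U @ V)) + eps)

    return U, V, W

if __name__ == "__main__":
    X = np.abs(np.random.default_rng(1).random((50, 30)))
    U, V, W = ewnmf(X, rank=5, gamma=0.5)
    print("weighted error:", float((W * (X - U @ V) ** 2).sum()))
```

Under this assumed objective, the stationarity condition on the weights yields a closed-form softmax over the attributes of each data point, so attributes with large reconstruction error (e.g., the hat pixels in the face example) receive exponentially smaller weights.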