Bigdeli Siavash A, Lin Geng, Dunbar L Andrea, Portenier Tiziano, Zwicker Matthias
IEEE Trans Neural Netw Learn Syst. 2024 Dec;35(12):17730-17741. doi: 10.1109/TNNLS.2023.3308191. Epub 2024 Dec 2.
Learning probabilistic models that can estimate the density of a given set of samples, and generate samples from that density, is one of the fundamental challenges in unsupervised machine learning. We introduce a new generative model based on denoising density estimators (DDEs): scalar functions, parametrized by neural networks, that are efficiently trained to represent kernel density estimators of the data. Leveraging DDEs, our main contribution is a novel technique for obtaining generative models by minimizing the Kullback-Leibler (KL) divergence directly. We prove that our algorithm for obtaining generative models is guaranteed to converge consistently to the correct solution. Our approach requires neither a specific network architecture, as normalizing flows (NFs) do, nor ordinary differential equation (ODE) solvers, as continuous NFs do. Experimental results demonstrate substantial improvement in density estimation and competitive performance in generative-model training.
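The following minimal PyTorch sketch illustrates the two ingredients the abstract describes: a scalar network whose gradient is trained to match the score of a Gaussian-smoothed density (the infinite-sample limit of a Gaussian kernel density estimator), and a generator updated by directly descending the resulting KL objective. All names here (DDE, dde_loss, generator_step, sigma) and the specific denoising-score-matching loss are assumptions for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class DDE(nn.Module):
    """Scalar network f(x) approximating the log of a sigma-smoothed density."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Softplus(),
            nn.Linear(hidden, hidden), nn.Softplus(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def dde_loss(f, x, sigma=0.1):
    """Denoising score matching on the gradient of the scalar f.

    At the optimum, grad f(y) equals the score of the sigma-smoothed data
    density, so f recovers that log-density up to an additive constant.
    """
    noise = torch.randn_like(x) * sigma
    y = (x + noise).detach().requires_grad_(True)
    grad_f = torch.autograd.grad(f(y).sum(), y, create_graph=True)[0]
    target = -noise / sigma ** 2  # score of N(x, sigma^2 I) evaluated at y
    return ((grad_f - target) ** 2).sum(dim=-1).mean()

def generator_step(g, f_data, f_gen, opt_g, opt_fgen, z_dim,
                   batch=256, sigma=0.1):
    """One alternating update: refit the generated-density DDE, then move
    the generator along the gradient of KL(q_g || p_data)."""
    z = torch.randn(batch, z_dim)
    x_fake = g(z)

    # (1) Fit f_gen to the current generator distribution q_g.
    opt_fgen.zero_grad()
    dde_loss(f_gen, x_fake.detach(), sigma).backward()
    opt_fgen.step()

    # (2) KL(q_g || p) = E_z[log q_g(g(z)) - log p(g(z))]; the additive
    # normalization constants in both DDEs do not affect this gradient.
    opt_g.zero_grad()
    kl = (f_gen(x_fake) - f_data(x_fake)).mean()
    kl.backward()
    opt_g.step()
    return kl.item()
```

In this sketch, training would alternate dde_loss updates for f_data on real batches with generator_step calls, and sigma plays the role of the kernel bandwidth of the density estimator.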