IEEE Trans Image Process. 2017 Nov;26(11):5160-5175. doi: 10.1109/TIP.2017.2729885. Epub 2017 Jul 20.
Although different objects possess distinct class-specific features, they usually also share common patterns. This observation has been partially exploited in a recently proposed dictionary learning framework that separates the particularity and the commonality (COPAR). Inspired by this, we propose a novel method that explicitly and simultaneously learns a set of common patterns as well as class-specific features for classification, using more intuitive constraints. Our dictionary learning framework is hence characterized by both a shared dictionary and particular (class-specific) dictionaries. For the shared dictionary, we enforce a low-rank constraint, i.e., we require that its spanning subspace have low dimension and that the coefficients corresponding to this dictionary be similar across classes. For the particular dictionaries, we impose the well-known constraints stated in Fisher discrimination dictionary learning (FDDL). Furthermore, we develop new fast and accurate algorithms for the subproblems in the learning step, accelerating its convergence; these algorithms can also be applied to FDDL and its extensions. Their efficiency is verified both theoretically and experimentally by comparing their complexity and running time with those of other well-known dictionary learning methods. Experimental results on widely used image data sets establish the advantages of our method over state-of-the-art dictionary learning methods.
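The two constraints placed on the shared dictionary can be illustrated with a minimal sketch: a nuclear-norm term (the standard convex surrogate for low rank) on the shared dictionary, plus a term penalizing how far each coefficient vector over that dictionary deviates from the mean coefficient. The names (`D0`, `X0`, `eta`, `lam`) and the exact weighting are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def shared_dictionary_penalty(D0, X0, eta=1.0, lam=1.0):
    """Sketch of the two shared-dictionary terms described in the abstract:
    eta * ||D0||_*  (nuclear norm, encouraging a low-dimensional span)
    + lam * ||X0 - mean(X0)||_F^2  (encouraging similar coefficients).
    """
    nuclear = np.linalg.norm(D0, ord='nuc')        # sum of singular values
    mean_coef = X0.mean(axis=1, keepdims=True)     # mean coefficient vector
    similarity = np.linalg.norm(X0 - mean_coef, 'fro') ** 2
    return eta * nuclear + lam * similarity

rng = np.random.default_rng(0)
# A rank-1 "shared dictionary": its nuclear norm is its single singular value.
u = rng.standard_normal((8, 1))
v = rng.standard_normal((1, 5))
D0 = u @ v
# Identical coefficient columns: the similarity term vanishes.
X0 = np.tile(rng.standard_normal((5, 1)), (1, 10))
print(shared_dictionary_penalty(D0, X0))
```

With identical columns in `X0`, the penalty reduces to the nuclear norm of `D0` alone, which for the rank-1 matrix above equals the product of the norms of `u` and `v`.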