Texture classification by modeling joint distributions of local patterns with Gaussian mixtures.

Affiliation

Institute of Imaging and Computer Vision, RWTH Aachen University, Germany.

Publication information

IEEE Trans Image Process. 2010 Jun;19(6):1548-57. doi: 10.1109/TIP.2010.2042100. Epub 2010 Feb 2.

Abstract

Texture classification generally requires the analysis of patterns in local pixel neighborhoods. Statistically, the underlying processes are comprehensively described by their joint probability density functions (jPDFs). Even for small neighborhoods, however, stable estimation of jPDFs by joint histograms (jHSTs) is often infeasible, since the number of entries in the jHST exceeds by far the number of pixels in a typical texture region. Moreover, evaluation of distance functions between jHSTs is often computationally prohibitive. In practice, the number of entries in a jHST is therefore reduced either by considering only two-pixel patterns, leading to 2D-jHSTs known as cooccurrence matrices, or by quantizing the gray levels in local patterns to only two levels, yielding local binary patterns (LBPs). Both approaches result in a loss of information. We introduce here a framework for supervised texture classification which reduces or avoids this information loss. Local texture neighborhoods are first filtered by a filter bank. Without further quantization, the jPDF of the filter responses is then described parametrically by Gaussian mixture models (GMMs). We show that the parameters of the GMMs can be reliably estimated from small image regions. Moreover, distances between the thus modelled jPDFs of different texture patterns can be computed efficiently in closed form from their model parameters. We furthermore extend this texture descriptor to achieve full invariance to rotation. We evaluate the framework for different filter banks on the Brodatz texture set. We first show that combining the LBP difference filters with the GMM-based density estimator outperforms the classical LBP approach and its codebook extensions. When replacing these (rather elementary) difference filters by the wavelet frame transform (WFT), the performance of the framework on all 111 Brodatz textures exceeds that obtained more recently with the spin image and RIFT descriptors of Lazebnik et al.
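To make the abstract's size argument concrete: for a 3x3 neighborhood with 256 gray levels, the full joint histogram would need 256^9 (about 4.7 x 10^21) bins, vastly more than the number of pixels in any texture region, which is what motivates a parametric model of the filter-response jPDF. The Python sketch below illustrates only the two central ingredients and is not a reproduction of the paper's method: it fits a full-covariance Gaussian mixture to per-pixel filter responses with scikit-learn's GaussianMixture, and it compares two texture regions through the squared L2 distance between their mixtures, one distance that does admit a closed form because the integral of a product of two Gaussians is itself a Gaussian evaluation, N(mu_i; nu_j, Sigma_i + Lambda_j). The paper's actual filter banks (LBP difference filters, wavelet frame transform), its exact distance measure, and every parameter choice below are illustrative assumptions.

# Sketch only: GMM texture descriptors compared via a closed-form L2 distance.
# The filter responses are stand-ins; the paper's own filter banks are not
# implemented here.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture


def fit_texture_gmm(filter_responses, n_components=4, seed=0):
    """Fit a full-covariance GMM to per-pixel filter responses.

    filter_responses: array of shape (n_pixels, n_filters).
    """
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full",
                          random_state=seed)
    return gmm.fit(filter_responses)


def _gmm_inner_product(gmm_a, gmm_b):
    # <p, q> = sum_ij w_i v_j * integral N(x; mu_i, S_i) N(x; nu_j, T_j) dx
    #        = sum_ij w_i v_j * N(mu_i; nu_j, S_i + T_j)   (closed form)
    total = 0.0
    for w_i, mu_i, cov_i in zip(gmm_a.weights_, gmm_a.means_, gmm_a.covariances_):
        for v_j, nu_j, cov_j in zip(gmm_b.weights_, gmm_b.means_, gmm_b.covariances_):
            total += w_i * v_j * multivariate_normal.pdf(
                mu_i, mean=nu_j, cov=cov_i + cov_j)
    return total


def gmm_l2_distance(gmm_a, gmm_b):
    """Squared L2 distance ||p - q||^2 = <p,p> - 2<p,q> + <q,q>, all closed form."""
    return (_gmm_inner_product(gmm_a, gmm_a)
            - 2.0 * _gmm_inner_product(gmm_a, gmm_b)
            + _gmm_inner_product(gmm_b, gmm_b))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-ins for the filter responses of two texture patches
    # (n_pixels x n_filters).
    patch_a = rng.normal(0.0, 1.0, size=(4096, 8))
    patch_b = rng.normal(0.5, 1.2, size=(4096, 8))
    d = gmm_l2_distance(fit_texture_gmm(patch_a), fit_texture_gmm(patch_b))
    print(f"closed-form L2 distance between GMM descriptors: {d:.6f}")

A supervised classifier in this setting would assign a query region to the training texture whose GMM descriptor lies at the smallest such distance, for example with a nearest-neighbor rule; the full rotation invariance claimed in the abstract would additionally require making the filter responses or the mixture model invariant to rotations of the neighborhood.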
