

Outdoor scene image segmentation based on background recognition and perceptual organization.

Affiliations

Riverbed Technology, Sunnyvale, CA 94085, USA.

Publication Info

IEEE Trans Image Process. 2012 Mar;21(3):1007-19. doi: 10.1109/TIP.2011.2169268. Epub 2011 Sep 23.

DOI: 10.1109/TIP.2011.2169268
PMID: 21947522
Abstract

In this paper, we propose a novel outdoor scene image segmentation algorithm based on background recognition and perceptual organization. We recognize the background objects such as the sky, the ground, and vegetation based on the color and texture information. For the structurally challenging objects, which usually consist of multiple constituent parts, we developed a perceptual organization model that can capture the nonaccidental structural relationships among the constituent parts of the structured objects and, hence, group them together accordingly without depending on a priori knowledge of the specific objects. Our experimental results show that our proposed method outperformed two state-of-the-art image segmentation approaches on two challenging outdoor databases (Gould data set and Berkeley segmentation data set) and achieved accurate segmentation quality on various outdoor natural scene environments.
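The abstract describes recognizing background regions (sky, ground, vegetation) from color and texture cues before applying perceptual organization to structured objects. The following is a toy illustration of the color-cue half of that idea only: per-pixel HSV heuristics with hypothetical thresholds, not the paper's actual classifier, which also uses texture features.

```python
import numpy as np

def classify_background(hsv_pixels):
    """Toy per-pixel background labeling from HSV values.

    hsv_pixels: (N, 3) array with H in [0, 360), S and V in [0, 1].
    Returns an (N,) array of labels: 'sky', 'vegetation', 'ground', or 'other'.
    All thresholds below are illustrative assumptions, not values from the paper.
    """
    h, s, v = hsv_pixels[:, 0], hsv_pixels[:, 1], hsv_pixels[:, 2]
    labels = np.full(len(hsv_pixels), "other", dtype=object)
    # Bright, bluish hues -> sky.
    labels[(h >= 180) & (h <= 260) & (v > 0.5)] = "sky"
    # Greenish hues with some saturation -> vegetation.
    labels[(h >= 70) & (h < 170) & (s > 0.2)] = "vegetation"
    # Dull, low-saturation brownish hues -> ground.
    labels[(h < 60) & (s < 0.4) & (v < 0.6)] = "ground"
    return labels

pixels = np.array([
    [220.0, 0.5, 0.9],  # bright blue
    [120.0, 0.6, 0.5],  # green
    [30.0, 0.3, 0.4],   # dull brown
])
print(classify_background(pixels))  # ['sky' 'vegetation' 'ground']
```

A real system would combine such color evidence with texture descriptors and spatial smoothing; this sketch only shows why simple color statistics already separate the three background classes the abstract names.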

Similar Articles

1
Outdoor scene image segmentation based on background recognition and perceptual organization.
IEEE Trans Image Process. 2012 Mar;21(3):1007-19. doi: 10.1109/TIP.2011.2169268. Epub 2011 Sep 23.
2
Color texture segmentation based on the modal energy of deformable surfaces.
IEEE Trans Image Process. 2009 Jul;18(7):1613-22. doi: 10.1109/TIP.2009.2018002. Epub 2009 May 12.
3
Generalized flooding and Multicue PDE-based image segmentation.
IEEE Trans Image Process. 2008 Mar;17(3):364-76. doi: 10.1109/TIP.2007.916156.
4
Object-level image segmentation using low level cues.
IEEE Trans Image Process. 2013 Oct;22(10):4019-27. doi: 10.1109/TIP.2013.2268973. Epub 2013 Jun 14.
5
Supervised learning-based cell image segmentation for p53 immunohistochemistry.
IEEE Trans Biomed Eng. 2006 Jun;53(6):1153-63. doi: 10.1109/TBME.2006.873538.
6
Human perceptual performance with nonliteral imagery: region recognition and texture-based segmentation.
J Exp Psychol Appl. 2004 Jun;10(2):97-110. doi: 10.1037/1076-898X.10.2.97.
7
Auto-context and its application to high-level vision tasks and 3D brain image segmentation.
IEEE Trans Pattern Anal Mach Intell. 2010 Oct;32(10):1744-57. doi: 10.1109/TPAMI.2009.186.
8
Hierarchical multiple Markov chain model for unsupervised texture segmentation.
IEEE Trans Image Process. 2009 Aug;18(8):1830-43. doi: 10.1109/TIP.2009.2020534. Epub 2009 May 12.
9
Reflection symmetry-integrated image segmentation.
IEEE Trans Pattern Anal Mach Intell. 2012 Sep;34(9):1827-41. doi: 10.1109/TPAMI.2011.259.
10
A neuro-fuzzy approach for segmentation of human objects in image sequences.
IEEE Trans Syst Man Cybern B Cybern. 2003;33(3):420-37. doi: 10.1109/TSMCB.2003.811765.

Cited By

1
An Effective Image-Based Tomato Leaf Disease Segmentation Method Using MC-UNet.
Plant Phenomics. 2023 May 15;5:0049. doi: 10.34133/plantphenomics.0049. eCollection 2023.