Gorillas in our midst: sustained inattentional blindness for dynamic events.

Author information

Simons D J, Chabris C F

Affiliations

Department of Psychology, Harvard University, Cambridge, MA 02138, USA.

Publication information

Perception. 1999;28(9):1059-74.

DOI: 10.1068/p281059
PMID: 10694957

Abstract

With each eye fixation, we experience a richly detailed visual world. Yet recent work on visual integration and change detection reveals that we are surprisingly unaware of the details of our environment from one view to the next: we often do not detect large changes to objects and scenes ('change blindness'). Furthermore, without attention, we may not even perceive objects ('inattentional blindness'). Taken together, these findings suggest that we perceive and remember only those objects and details that receive focused attention. In this paper, we briefly review and discuss evidence for these cognitive forms of 'blindness'. We then present a new study that builds on classic studies of divided visual attention to examine inattentional blindness for complex objects and events in dynamic scenes. Our results suggest that the likelihood of noticing an unexpected object depends on the similarity of that object to other objects in the display and on how difficult the primary monitoring task is. Interestingly, spatial proximity of the critical unattended object to attended locations does not appear to affect detection, suggesting that observers attend to objects and events, not spatial positions. We discuss the implications of these results for visual representations and awareness of our visual environment.

Similar articles

[1] Gorillas in our midst: sustained inattentional blindness for dynamic events. Perception. 1999.
[2] How not to be seen: the contribution of similarity and selective ignoring to sustained inattentional blindness. Psychol Sci. 2001 Jan.
[3] The effects of eye movements, spatial attention, and stimulus features on inattentional blindness. Vision Res. 2004 Dec.
[4] Similarity of an unexpected object to the attended and ignored objects affects noticing in a sustained inattentional blindness task. Atten Percept Psychophys. 2023 Oct.
[5] The effects of eye movements, age, and expertise on inattentional blindness. Conscious Cogn. 2006 Sep.
[6] The role of unattended distractors in sustained inattentional blindness. Psychol Res. 2008 Jan.
[7] What's past is past: Neither perceptual preactivation nor prior motivational relevance decrease subsequent inattentional blindness. Conscious Cogn. 2018 Mar.
[8] Expectation-based blindness: Predictions about object categories gate awareness of focally attended objects. Psychon Bull Rev. 2022 Oct.
[9] Not quite so blind: Semantic processing despite inattentional blindness. J Exp Psychol Hum Percept Perform. 2016 Apr.
[10] Inattentional blindness on the full-attention trial: Are we throwing out the baby with the bathwater? Conscious Cogn. 2018 Mar.

Cited by

[1] Health Side Story: Scoping Review of Literature on Narrative Therapy for ADHD. Healthcare (Basel). 2025 May 26.
[2] Invisible Gorillas in the Mind: Internal Inattentional Blindness and the Prospect of Introspection Training. Open Mind (Camb). 2025 Apr 22.
[3] Sensitivity to visual features in inattentional blindness. Elife. 2025 May 19.
[4] Testing the top-down feedback in the central visual field using the reversed depth illusion. iScience. 2025 Mar 15.
[5] Attentional capture by abrupt onsets: Foundations and emerging issues. J Exp Psychol Hum Percept Perform. 2025 Mar.
[6] Integrative mindfulness-based infant parenting program: theoretical foundations and a novel intervention protocol. Front Psychol. 2025 Feb 7.
[7] Does familiarity-detection flip attention inward? The familiarity-flip-of-attention account of the primacy effect in memory for repetitions. Mem Cognit. 2025 Jan 7.
[8] Selective learning for sensing using shift-invariant spectrally stable undersampled networks. Sci Rep. 2024 Dec 30.
[9] The unbearable slowness of being: Why do we live at 10 bits/s? Neuron. 2025 Jan 22.
[10] Ensemble representation of animacy could be based on mid-level visual features. Atten Percept Psychophys. 2025 Feb.
