


OWLET: An automated, open-source method for infant gaze tracking using smartphone and webcam recordings.

Affiliations

Department of Population Health, New York University School of Medicine, 227 E 30th St, 7th Fl, New York, NY, 10016, USA.

Department of Child & Adolescent Psychiatry, New York University School of Medicine, New York, NY, 10016, USA.

Publication information

Behav Res Methods. 2023 Sep;55(6):3149-3163. doi: 10.3758/s13428-022-01962-w. Epub 2022 Sep 7.

DOI: 10.3758/s13428-022-01962-w
PMID: 36070130
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9450825/
Abstract

Groundbreaking insights into the origins of the human mind have been garnered through the study of eye movements in preverbal subjects who are unable to explain their thought processes. Developmental research has largely relied on in-lab testing with trained experimenters. This constraint provides a narrow window into infant cognition and impedes large-scale data collection in families from diverse socioeconomic, geographic, and cultural backgrounds. Here we introduce a new open-source methodology for automatically analyzing infant eye-tracking data collected on personal devices in the home. Using algorithms from computer vision, machine learning, and ecological psychology, we develop an online webcam-linked eye tracker (OWLET) that provides robust estimation of infants' point of gaze from smartphone and webcam recordings of infant assessments in the home. We validate OWLET in a large sample of 7-month-old infants (N = 127) tested remotely, using an established visual attention task. We show that this new method reliably estimates infants' point-of-gaze across a variety of contexts, including testing on both computers and mobile devices, and exhibits excellent external validity with parental-report measures of attention. Our platform fills a significant gap in current tools available for rapid online data collection and large-scale assessments of cognitive processes in infants. Remote assessment addresses the need for greater diversity and accessibility in human studies and may support the ecological validity of behavioral experiments. This constitutes a critical and timely advance in a core domain of developmental research and in psychological science more broadly.
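OWLET's actual pipeline combines computer vision, machine learning, and calibration steps described in the paper; the code below is not that pipeline. It is a minimal, hypothetical sketch of the simplest ingredient of webcam gaze estimation: locating a dark pupil inside an already-cropped grayscale eye patch and mapping its horizontal offset to a normalized gaze coordinate. The function name, threshold value, and synthetic test image are illustrative assumptions, not taken from OWLET.

```python
import numpy as np

def estimate_horizontal_gaze(eye_patch: np.ndarray, pupil_thresh: int = 60) -> float:
    """Estimate normalized horizontal gaze from a grayscale eye patch.

    Pixels darker than pupil_thresh are treated as the pupil. The pupil
    centroid's horizontal offset from the patch center is mapped to
    [-1, 1], where -1 is the far left edge and +1 the far right edge.
    """
    ys, xs = np.nonzero(eye_patch < pupil_thresh)
    if xs.size == 0:
        raise ValueError("no pupil pixels found below threshold")
    cx = xs.mean()                                # pupil centroid, x only
    half_width = (eye_patch.shape[1] - 1) / 2.0   # distance from center to edge
    return float((cx - half_width) / half_width)

# Synthetic 40x80 eye patch: light sclera with a dark pupil toward the right.
patch = np.full((40, 80), 200, dtype=np.uint8)
patch[15:25, 60:70] = 20  # pupil block, centroid at x = 64.5
gaze = estimate_horizontal_gaze(patch)  # positive value: looking right
```

A real system would first detect the face and eye regions in each video frame, correct for head pose, and calibrate the normalized coordinate against on-screen targets, which is where the bulk of OWLET's robustness work lies.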


Figures (Figs. 1–11):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/0c0048bab637/13428_2022_1962_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/b43487b878e5/13428_2022_1962_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/45d0dc90289a/13428_2022_1962_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/bd68142e8e0c/13428_2022_1962_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/f1c352f0fba9/13428_2022_1962_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/d22eae9f989b/13428_2022_1962_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/70ca6527e650/13428_2022_1962_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/8f8662385218/13428_2022_1962_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/c706d5b5bcf1/13428_2022_1962_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/3025185fd5d5/13428_2022_1962_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d19/9450825/30f6cb707c15/13428_2022_1962_Fig11_HTML.jpg

Similar articles

1. OWLET: An automated, open-source method for infant gaze tracking using smartphone and webcam recordings.
Behav Res Methods. 2023 Sep;55(6):3149-3163. doi: 10.3758/s13428-022-01962-w. Epub 2022 Sep 7.
2. From focus to function: Longitudinal insights into infant attention and emerging executive functions via remote webcam eye tracking.
Dev Psychol. 2025 May;61(5):957-963. doi: 10.1037/dev0001948. Epub 2025 Mar 20.
3. Exploration of factors affecting webcam-based automated gaze coding.
Behav Res Methods. 2024 Oct;56(7):7374-7390. doi: 10.3758/s13428-024-02424-1. Epub 2024 May 1.
4. Comparing Online Webcam- and Laboratory-Based Eye-Tracking for the Assessment of Infants' Audio-Visual Synchrony Perception.
Front Psychol. 2022 Jan 11;12:733933. doi: 10.3389/fpsyg.2021.733933. eCollection 2021.
5. Webcam eye tracking close to laboratory standards: Comparing a new webcam-based system and the EyeLink 1000.
Behav Res Methods. 2024 Aug;56(5):5002-5022. doi: 10.3758/s13428-023-02237-8. Epub 2023 Oct 11.
6. Validation of an open source, remote web-based eye-tracking method (WebGazer) for research in early childhood.
Infancy. 2024 Jan-Feb;29(1):31-55. doi: 10.1111/infa.12564. Epub 2023 Oct 18.
7. Deep learning models for webcam eye tracking in online experiments.
Behav Res Methods. 2024 Apr;56(4):3487-3503. doi: 10.3758/s13428-023-02190-6. Epub 2023 Aug 22.
8. Webcam-based eye tracking to detect mind wandering and comprehension errors.
Behav Res Methods. 2024 Jan;56(1):1-17. doi: 10.3758/s13428-022-02040-x. Epub 2023 Jan 10.
9. Online eye tracking and real-time sentence processing: On opportunities and efficacy for capturing psycholinguistic effects of different magnitudes and diversity.
Behav Res Methods. 2024 Apr;56(4):3504-3522. doi: 10.3758/s13428-023-02176-4. Epub 2023 Aug 1.
10. The validation of online webcam-based eye-tracking: The replication of the cascade effect, the novelty preference, and the visual world paradigm.
Behav Res Methods. 2024 Aug;56(5):4836-4849. doi: 10.3758/s13428-023-02221-2. Epub 2023 Aug 30.

Cited by

1. Catching up with iCatcher: Comparing analyses of infant eye tracking based on trained human coders and iCatcher+ automated gaze coding software.
Behav Res Methods. 2025 Apr 28;57(6):158. doi: 10.3758/s13428-025-02683-6.
2. Disease Prediction Using Machine Learning on Smartphone-Based Eye, Skin, and Voice Data: Scoping Review.
JMIR AI. 2025 Mar 25;4:e59094. doi: 10.2196/59094.
3. From focus to function: Longitudinal insights into infant attention and emerging executive functions via remote webcam eye tracking.
Dev Psychol. 2025 May;61(5):957-963. doi: 10.1037/dev0001948. Epub 2025 Mar 20.
4. NapBiome trial: Targeting gut microbiota to improve sleep rhythm and developmental and behavioural outcomes in early childhood in a birth cohort in Switzerland - a study protocol.
BMJ Open. 2025 Mar 3;15(3):e092938. doi: 10.1136/bmjopen-2024-092938.
5. The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study.
Behav Res Methods. 2025 Jan 6;57(1):46. doi: 10.3758/s13428-024-02529-7.
6. Prenatal Stress and Maternal Role in Neurodevelopment.
Annu Rev Dev Psychol. 2024 Dec;6:87-107. doi: 10.1146/annurev-devpsych-120321-011905. Epub 2024 Sep 11.
7. Impact of maternal antenatal nutrition and infection treatment interventions on Longitudinal Infant Development and Growth in rural Ethiopia: protocol of the LIDG child follow-up study.
BMJ Paediatr Open. 2024 Dec 24;8(1):e002840. doi: 10.1136/bmjpo-2024-002840.
8. Online Eye Tracking for Aphasia: A Feasibility Study Comparing Web and Lab Tracking and Implications for Clinical Use.
Brain Behav. 2024 Nov;14(11):e70112. doi: 10.1002/brb3.70112.
9. Exploration of factors affecting webcam-based automated gaze coding.
Behav Res Methods. 2024 Oct;56(7):7374-7390. doi: 10.3758/s13428-024-02424-1. Epub 2024 May 1.
10. Validation of an open source, remote web-based eye-tracking method (WebGazer) for research in early childhood.
Infancy. 2024 Jan-Feb;29(1):31-55. doi: 10.1111/infa.12564. Epub 2023 Oct 18.

References

1. iCatcher: A neural network approach for automated coding of young children's eye movements.
Infancy. 2022 Jul;27(4):765-779. doi: 10.1111/infa.12468. Epub 2022 Apr 13.
2. Comparing Online Webcam- and Laboratory-Based Eye-Tracking for the Assessment of Infants' Audio-Visual Synchrony Perception.
Front Psychol. 2022 Jan 11;12:733933. doi: 10.3389/fpsyg.2021.733933. eCollection 2021.
3. Innovative methods for remote assessment of neurobehavioral development.
Dev Cogn Neurosci. 2021 Dec;52:101015. doi: 10.1016/j.dcn.2021.101015. Epub 2021 Sep 22.
4. Towards a more inclusive and equitable developmental cognitive neuroscience.
Dev Cogn Neurosci. 2021 Dec;52:101014. doi: 10.1016/j.dcn.2021.101014. Epub 2021 Sep 20.
5. All contexts are not created equal: Social stimuli win the competition for organizing reinforcement learning in 9-month-old infants.
Dev Sci. 2021 Sep;24(5):e13088. doi: 10.1111/desc.13088. Epub 2021 Feb 24.
6. Where Infants Look Determines How They See: Eye Movements and Object Perception Performance in 3-Month-Olds.
Infancy. 2004 Sep;6(2):185-201. doi: 10.1207/s15327078in0602_3. Epub 2004 Sep 1.
7. Indexing Early Visual Memory Durability in Infancy.
Child Dev. 2021 Mar;92(2):e221-e235. doi: 10.1111/cdev.13450. Epub 2020 Aug 17.
8. Advances in Eye Tracking in Infancy Research.
Infancy. 2012 Jan;17(1):1-8. doi: 10.1111/j.1532-7078.2011.00101.x. Epub 2011 Nov 1.
9. The emergence of object-based visual attention in infancy: A role for family socioeconomic status and competing visual features.
Infancy. 2019 Sep;24(5):752-767. doi: 10.1111/infa.12309. Epub 2019 Jul 11.
10. Online Developmental Science to Foster Innovation, Access, and Impact.
Trends Cogn Sci. 2020 Sep;24(9):675-678. doi: 10.1016/j.tics.2020.06.004. Epub 2020 Jul 2.