

Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera.

Affiliations

National Engineering Research Center of Optical Instrumentation, Zhejiang University, Hangzhou 310058, China.

Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology, 76131 Karlsruhe, Germany.

Publication information

Sensors (Basel). 2021 May 20;21(10):3558. doi: 10.3390/s21103558.

DOI: 10.3390/s21103558
PMID: 34065360
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8161033/
Abstract

Scene sonification is a powerful technique to help Visually Impaired People (VIP) understand their surroundings. Existing methods usually perform sonification on the entire image of the surrounding scene acquired by a standard camera, or on a priori static obstacles detected by image-processing algorithms on the RGB image of the scene. However, if all the information in the scene is delivered to VIP simultaneously, it causes information redundancy. In fact, biological vision is more sensitive to moving objects in the scene than to static objects, which is also the motivation behind the event-based camera. In this paper, we propose a real-time sonification framework to help VIP understand the moving objects in the scene. First, we capture the events in the scene using an event-based camera and cluster them into multiple moving objects without relying on any prior knowledge. Then, sonification based on MIDI is enabled on these objects synchronously. Finally, we conduct comprehensive experiments on the scene video with sonification audio, attended by 20 VIP and 20 Sighted People (SP). The results show that our method allows both groups of participants to clearly distinguish the number, size, motion speed, and motion trajectories of multiple objects, and that it is more comfortable to hear than existing methods in terms of aesthetics.
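The abstract outlines a two-stage pipeline: cluster the event stream into moving objects without prior knowledge, then map each object to MIDI parameters. The paper's actual clustering and sound-mapping details are not given in the abstract, so the sketch below is purely illustrative: the `Event` tuple, the greedy distance-threshold clustering, and the position-to-pitch / size-to-velocity mapping are all assumptions, not the authors' method.

```python
import math
from collections import namedtuple

# An event-camera output sample: pixel position, timestamp, polarity.
Event = namedtuple("Event", "x y t polarity")

def cluster_events(events, radius=10.0):
    """Greedily group events by spatial proximity; no prior object count
    is assumed, mirroring the 'without prior knowledge' claim."""
    clusters = []
    for ev in events:
        for c in clusters:
            cx = sum(e.x for e in c) / len(c)
            cy = sum(e.y for e in c) / len(c)
            if math.hypot(ev.x - cx, ev.y - cy) <= radius:
                c.append(ev)
                break
        else:
            clusters.append([ev])  # event starts a new object
    return clusters

def sonify(cluster, width=240, height=180):
    """Map one object cluster to a MIDI (note, velocity) pair:
    horizontal position -> pitch over three octaves from C3 (note 48),
    cluster size (event count) -> loudness, capped at MIDI's max of 127."""
    cx = sum(e.x for e in cluster) / len(cluster)
    note = 48 + int(cx / width * 36)
    velocity = min(127, 40 + 2 * len(cluster))
    return note, velocity
```

For example, feeding two events near (11, 21) and one far away at (200, 90) into `cluster_events` yields two clusters, and `sonify` turns each into a note/velocity pair that a MIDI synthesizer could play per frame. A real system would additionally track clusters over time to convey motion speed and trajectory, as the experiments in the paper evaluate.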


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a777/8161033/b24e120d0a0d/sensors-21-03558-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a777/8161033/fb2c688b7a15/sensors-21-03558-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a777/8161033/79fdd761d24d/sensors-21-03558-g003.jpg

Similar articles

1. Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera.
Sensors (Basel). 2021 May 20;21(10):3558. doi: 10.3390/s21103558.
2. A Comparative Study in Real-Time Scene Sonification for Visually Impaired People.
Sensors (Basel). 2020 Jun 5;20(11):3222. doi: 10.3390/s20113222.
3. Visual Echolocation Concept for the Colorophone Sensory Substitution Device Using Virtual Reality.
Sensors (Basel). 2021 Jan 1;21(1):237. doi: 10.3390/s21010237.
4. ARAware: Assisting Visually Impaired People with Real-Time Critical Moving Object Identification.
Sensors (Basel). 2024 Jul 1;24(13):4282. doi: 10.3390/s24134282.
5. A review of sonification solutions in assistive systems for visually impaired people.
Disabil Rehabil Assist Technol. 2024 Nov;19(8):2818-2833. doi: 10.1080/17483107.2024.2326590. Epub 2024 Mar 12.
6. Deep learning based object detection and surrounding environment description for visually impaired people.
Heliyon. 2023 Jun 7;9(6):e16924. doi: 10.1016/j.heliyon.2023.e16924. eCollection 2023 Jun.
7. Dynamic Crosswalk Scene Understanding for the Visually Impaired.
IEEE Trans Neural Syst Rehabil Eng. 2021;29:1478-1486. doi: 10.1109/TNSRE.2021.3096379. Epub 2021 Jul 29.
8. An image processing approach for blind mobility facilitated through visual intracortical stimulation.
Artif Organs. 2012 Jul;36(7):616-28. doi: 10.1111/j.1525-1594.2011.01421.x. Epub 2012 Mar 16.
9. A Saccade Based Framework for Real-Time Motion Segmentation Using Event Based Vision Sensors.
Front Neurosci. 2017 Mar 3;11:83. doi: 10.3389/fnins.2017.00083. eCollection 2017.
10. Colorophone 2.0: A Wearable Color Sonification Device Generating Live Stereo-Soundscapes-Design, Implementation, and Usability Audit.
Sensors (Basel). 2021 Nov 5;21(21):7351. doi: 10.3390/s21217351.

Cited by

1. Concurrent Supra-Postural Auditory-Hand Coordination Task Affects Postural Control: Using Sonification to Explore Environmental Unpredictability in Factors Affecting Fall Risk.
Sensors (Basel). 2024 Mar 21;24(6):1994. doi: 10.3390/s24061994.
2. Illumination-Based Color Reconstruction for the Dynamic Vision Sensor.
Sensors (Basel). 2023 Oct 9;23(19):8327. doi: 10.3390/s23198327.
3. Event-Based Motion Capture System for Online Multi-Quadrotor Localization and Tracking.
Sensors (Basel). 2022 Apr 23;22(9):3240. doi: 10.3390/s22093240.
4. Augmented Humanity: A Systematic Mapping Review.
Sensors (Basel). 2022 Jan 10;22(2):514. doi: 10.3390/s22020514.

References

1. Event-Based Vision: A Survey.
IEEE Trans Pattern Anal Mach Intell. 2022 Jan;44(1):154-180. doi: 10.1109/TPAMI.2020.3008413. Epub 2021 Dec 7.
2. A Comparative Study in Real-Time Scene Sonification for Visually Impaired People.
Sensors (Basel). 2020 Jun 5;20(11):3222. doi: 10.3390/s20113222.
3. Unifying Terrain Awareness for the Visually Impaired through Real-Time Semantic Segmentation.
Sensors (Basel). 2018 May 10;18(5):1506. doi: 10.3390/s18051506.
4. Visual tracking using neuromorphic asynchronous event-based cameras.
Neural Comput. 2015 Apr;27(4):925-53. doi: 10.1162/NECO_a_00720. Epub 2015 Feb 24.
5. Asynchronous Event-Based Multikernel Algorithm for High-Speed Visual Features Tracking.
IEEE Trans Neural Netw Learn Syst. 2015 Aug;26(8):1710-20. doi: 10.1109/TNNLS.2014.2352401. Epub 2014 Sep 16.
6. A systematic review of mapping strategies for the sonification of physical quantities.
PLoS One. 2013 Dec 17;8(12):e82491. doi: 10.1371/journal.pone.0082491. eCollection 2013.
7. An experimental system for auditory image representations.
IEEE Trans Biomed Eng. 1992 Feb;39(2):112-21. doi: 10.1109/10.121642.