Visual Impairment Spatial Awareness System for Indoor Navigation and Daily Activities.

Authors

Yu Xinrui, Saniie Jafar

Affiliation

Department of Electrical and Computer Engineering, Illinois Institute of Technology, Chicago, IL 60616, USA.

Publication

J Imaging. 2025 Jan 4;11(1):9. doi: 10.3390/jimaging11010009.

DOI: 10.3390/jimaging11010009
PMID: 39852322
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11766877/
Abstract

The integration of artificial intelligence into daily life significantly enhances the autonomy and quality of life of visually impaired individuals. This paper introduces the Visual Impairment Spatial Awareness (VISA) system, designed to holistically assist visually impaired users in indoor activities through a structured, multi-level approach. At the foundational level, the system employs augmented reality (AR) markers for indoor positioning, neural networks for advanced object detection and tracking, and depth information for precise object localization. At the intermediate level, it integrates data from these technologies to aid in complex navigational tasks such as obstacle avoidance and pathfinding. The advanced level synthesizes these capabilities to enhance spatial awareness, enabling users to navigate complex environments and locate specific items. The VISA system exhibits an efficient human-machine interface (HMI), incorporating text-to-speech and speech-to-text technologies for natural and intuitive communication. Evaluations in simulated real-world environments demonstrate that the system allows users to interact naturally and with minimal effort. Our experimental results confirm that the VISA system efficiently assists visually impaired users in indoor navigation, object detection and localization, and label and text recognition, thereby significantly enhancing their spatial awareness and independence.
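At the foundational level, the abstract pairs neural-network object detection with depth information for precise object localization. A minimal sketch of that localization step, assuming a standard pinhole camera model; the intrinsics (`fx`, `fy`, `cx`, `cy`) below are illustrative defaults, not the paper's calibration:

```python
# Sketch: recover an object's 3D position in camera coordinates from the
# detector's pixel location (u, v) and the depth sensor's reading at that
# pixel, via pinhole back-projection.

def localize_object(u, v, depth_m, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project pixel (u, v) with depth Z into camera coordinates.

    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth.
    Returns (X, Y, Z) in meters.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# An object detected at the image center, 2 m away, lies on the optical axis.
print(localize_object(319.5, 239.5, 2.0))  # (0.0, 0.0, 2.0)
```

The resulting camera-frame coordinates would still need the AR-marker-based pose of the camera in the room to yield a world position, which is how the paper's positioning and localization components fit together.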

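The intermediate level integrates sensor data for obstacle avoidance and pathfinding. The paper does not specify its planner, so the following is only an illustrative sketch of the pathfinding idea: a breadth-first search over a 2D occupancy grid, with obstacle cells excluded from expansion.

```python
from collections import deque

# Sketch: shortest obstacle-free route on an occupancy grid (1 = obstacle).
# The grid, cell size, and 4-connected motion model are assumptions for
# illustration, not the VISA system's actual navigation stack.

def find_path(grid, start, goal):
    """Return the shortest 4-connected path from start to goal as a list
    of (row, col) cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}          # visited set doubling as backpointers
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:            # reconstruct path by walking backpointers
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall forces a detour around the right side
        [0, 0, 0]]
print(find_path(grid, (0, 0), (2, 0)))
```

In a real assistive system the path cells would be converted into spoken turn-by-turn instructions through the text-to-speech interface the paper describes.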

Figures 1-24 (jimaging-11-00009-g001 through g024) are available with the full text at PMC: https://pmc.ncbi.nlm.nih.gov/articles/PMC11766877/

Similar Articles

1. Visual Impairment Spatial Awareness System for Indoor Navigation and Daily Activities. J Imaging. 2025 Jan 4;11(1):9. doi: 10.3390/jimaging11010009.
2. 6-DOF Pose Estimation of a Robotic Navigation Aid by Tracking Visual and Geometric Features. IEEE Trans Autom Sci Eng. 2015 Oct;12(4):1169-1180. doi: 10.1109/TASE.2015.2469726. Epub 2015 Oct 5.
3. A Navigation and Augmented Reality System for Visually Impaired People. Sensors (Basel). 2021 Apr 28;21(9):3061. doi: 10.3390/s21093061.
4. Vision-based Mobile Indoor Assistive Navigation Aid for Blind People. IEEE Trans Mob Comput. 2019 Mar;18(3):702-714. doi: 10.1109/TMC.2018.2842751. Epub 2018 Jun 1.
5. Deep Learning-Based Positioning of Visually Impaired People in Indoor Environments. Sensors (Basel). 2020 Oct 31;20(21):6238. doi: 10.3390/s20216238.
6. An indoor navigation system for the visually impaired. Sensors (Basel). 2012;12(6):8236-58. doi: 10.3390/s120608236. Epub 2012 Jun 13.
7. Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach. IEEE Trans Haptics. 2021 Jan-Mar;14(1):109-122. doi: 10.1109/TOH.2020.2996748. Epub 2021 Mar 24.
8. Haptics-based, higher-order sensory substitution designed for object negotiation in blindness and low vision: Virtual Whiskers. Disabil Rehabil Assist Technol. 2025 Feb 21:1-20. doi: 10.1080/17483107.2025.2458112.
9. Assistive Navigation Using Deep Reinforcement Learning Guiding Robot With UWB/Voice Beacons and Semantic Feedbacks for Blind and Visually Impaired People. Front Robot AI. 2021 Jun 22;8:654132. doi: 10.3389/frobt.2021.654132. eCollection 2021.
10. Technological Advancements in Human Navigation for the Visually Impaired: A Systematic Review. Sensors (Basel). 2025 Apr 1;25(7):2213. doi: 10.3390/s25072213.

Cited By

1. Leveraging assistive technology for visually impaired people through optimal deep transfer learning based object detection model. Sci Rep. 2025 Aug 17;15(1):30113. doi: 10.1038/s41598-025-14946-5.

References

1. LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios. J Imaging. 2023 Feb 20;9(2):52. doi: 10.3390/jimaging9020052.
2. Review of Navigation Assistive Tools and Technologies for the Visually Impaired. Sensors (Basel). 2022 Oct 17;22(20):7888. doi: 10.3390/s22207888.
3. Trends in prevalence of blindness and distance and near vision impairment over 30 years: an analysis for the Global Burden of Disease Study. Lancet Glob Health. 2021 Feb;9(2):e130-e143. doi: 10.1016/S2214-109X(20)30425-3. Epub 2020 Dec 1.
4. Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach. IEEE Trans Haptics. 2021 Jan-Mar;14(1):109-122. doi: 10.1109/TOH.2020.2996748. Epub 2021 Mar 24.
5. Vision-based Mobile Indoor Assistive Navigation Aid for Blind People. IEEE Trans Mob Comput. 2019 Mar;18(3):702-714. doi: 10.1109/TMC.2018.2842751. Epub 2018 Jun 1.
6. Visual Impairment and Blindness in Adults in the United States: Demographic and Geographic Variations From 2015 to 2050. JAMA Ophthalmol. 2016 Jul 1;134(7):802-9. doi: 10.1001/jamaophthalmol.2016.1284.
7. Design, Implementation and Evaluation of an Indoor Navigation System for Visually Impaired People. Sensors (Basel). 2015 Dec 21;15(12):32168-87. doi: 10.3390/s151229912.
8. Design of a Braille Learning Application for Visually Impaired Students in Bangladesh. Assist Technol. 2015 Fall;27(3):172-82. doi: 10.1080/10400435.2015.1011758.
9. Vision should not be overlooked as an important sensory modality for finding host plants. Environ Entomol. 2011 Aug;40(4):855-63. doi: 10.1603/EN10212.
10. Ophthalmic and visual profile of guide dog owners in Scotland. Br J Ophthalmol. 1999 Apr;83(4):470-7. doi: 10.1136/bjo.83.4.470.