
Asynchronous visual event-based time-to-contact.

Affiliations

Vision Institute, Université Pierre et Marie Curie, UMR S968 Inserm, UPMC, CNRS UMR 7210, CHNO des Quinze-Vingts, Paris, France.

Vision Institute, Université Pierre et Marie Curie, UMR S968 Inserm, UPMC, CNRS UMR 7210, CHNO des Quinze-Vingts, Paris, France; iCub Facility, Istituto Italiano di Tecnologia, Genoa, Italy.

Publication

Front Neurosci. 2014 Feb 7;8:9. doi: 10.3389/fnins.2014.00009. eCollection 2014.

DOI: 10.3389/fnins.2014.00009
PMID: 24570652
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC3916774/
Abstract

Reliable and fast sensing of the environment is a fundamental requirement for autonomous mobile robotic platforms. Unfortunately, the frame-based acquisition paradigm at the basis of main stream artificial perceptive systems is limited by low temporal dynamics and redundant data flow, leading to high computational costs. Hence, conventional sensing and relative computation are obviously incompatible with the design of high speed sensor-based reactive control for mobile applications, that pose strict limits on energy consumption and computational load. This paper introduces a fast obstacle avoidance method based on the output of an asynchronous event-based time encoded imaging sensor. The proposed method relies on an event-based Time To Contact (TTC) computation based on visual event-based motion flows. The approach is event-based in the sense that every incoming event adds to the computation process thus allowing fast avoidance responses. The method is validated indoor on a mobile robot, comparing the event-based TTC with a laser range finder TTC, showing that event-based sensing offers new perspectives for mobile robotics sensing.
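The abstract's central quantity, time-to-contact computed from event-based motion flow, can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: it assumes each incoming event already carries a local optical-flow vector and that the focus of expansion (FOE) is known, and all names (`event_ttc`, `update_estimate`, the default `foe`) are hypothetical.

```python
# Per-event time-to-contact (TTC) sketch: for an approaching surface,
# TTC at a pixel is the radial distance from the FOE divided by the
# outward (radial) component of the local flow.
import math

def event_ttc(x, y, vx, vy, foe=(64.0, 64.0), eps=1e-9):
    """TTC for one event at pixel (x, y) with flow (vx, vy).
    Returns None when the event is at the FOE or the flow has no
    outward component (no approach along this ray)."""
    rx, ry = x - foe[0], y - foe[1]
    r = math.hypot(rx, ry)
    if r < eps:
        return None
    # Project the flow onto the unit radial direction.
    v_rad = (vx * rx + vy * ry) / r
    if v_rad <= eps:
        return None
    return r / v_rad  # in seconds if flow is in pixels/second

def update_estimate(prev, ttc, alpha=0.1):
    """Event-driven smoothing: every event refines a running TTC
    estimate, so the output updates at event rate rather than at a
    fixed frame rate."""
    if ttc is None:
        return prev
    return ttc if prev is None else (1 - alpha) * prev + alpha * ttc
```

The event-driven character described in the abstract lives in `update_estimate`: each event contributes incrementally, so an avoidance controller can react as soon as the smoothed TTC drops below a threshold, without waiting for a full frame.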


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/234e34dd7488/fnins-08-00009-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/ca31562e1217/fnins-08-00009-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/c721560d307a/fnins-08-00009-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/597e5700d9ff/fnins-08-00009-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/44a19ff17595/fnins-08-00009-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/cfed57fc0260/fnins-08-00009-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/cb8786dbd53c/fnins-08-00009-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/6d4d0da22c09/fnins-08-00009-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/88fe01fa0e81/fnins-08-00009-g0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50b3/3916774/c382db628b6d/fnins-08-00009-g0010.jpg

Similar articles

1. Asynchronous visual event-based time-to-contact.
Front Neurosci. 2014 Feb 7;8:9. doi: 10.3389/fnins.2014.00009. eCollection 2014.
2. Asynchronous frameless event-based optical flow.
Neural Netw. 2012 Mar;27:32-7. doi: 10.1016/j.neunet.2011.11.001. Epub 2011 Nov 25.
3. Event-Based Line Fitting and Segment Detection Using a Neuromorphic Visual Sensor.
IEEE Trans Neural Netw Learn Syst. 2018 Sep 12. doi: 10.1109/TNNLS.2018.2807983.
4. Asynchronous Event-Based Fourier Analysis.
IEEE Trans Image Process. 2017 May;26(5):2192-2202. doi: 10.1109/TIP.2017.2661702. Epub 2017 Feb 6.
5. Neuromorphic Event-Based 3D Pose Estimation.
Front Neurosci. 2016 Jan 22;9:522. doi: 10.3389/fnins.2015.00522. eCollection 2015.
6. Robotic goalie with 3 ms reaction time at 4% CPU load using event-based dynamic vision sensor.
Front Neurosci. 2013 Nov 21;7:223. doi: 10.3389/fnins.2013.00223. eCollection 2013.
7. Event-Based Robotic Grasping Detection With Neuromorphic Vision Sensor and Event-Grasping Dataset.
Front Neurorobot. 2020 Oct 8;14:51. doi: 10.3389/fnbot.2020.00051. eCollection 2020.
8. Dynamic obstacle avoidance for quadrotors with event cameras.
Sci Robot. 2020 Mar 18;5(40). doi: 10.1126/scirobotics.aaz9712.
9. Event-based visual flow.
IEEE Trans Neural Netw Learn Syst. 2014 Feb;25(2):407-17. doi: 10.1109/TNNLS.2013.2273537.
10. Bio-inspired vision based robot control using featureless estimations of time-to-contact.
Bioinspir Biomim. 2017 Jan 31;12(2):025001. doi: 10.1088/1748-3190/aa53c4.

Cited by

1. Precise Spiking Motifs in Neurobiological and Neuromorphic Data.
Brain Sci. 2022 Dec 29;13(1):68. doi: 10.3390/brainsci13010068.
2. Event-Based Sensing and Signal Processing in the Visual, Auditory, and Olfactory Domain: A Review.
Front Neural Circuits. 2021 May 31;15:610446. doi: 10.3389/fncir.2021.610446. eCollection 2021.
3. Approaching Retinal Ganglion Cell Modeling and FPGA Implementation for Robotics.

References

1. Event-based visual flow.
IEEE Trans Neural Netw Learn Syst. 2014 Feb;25(2):407-17. doi: 10.1109/TNNLS.2013.2273537.
2. A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems.
Biol Cybern. 2011 May;104(4-5):263-96. doi: 10.1007/s00422-011-0435-9. Epub 2011 May 27.
3. Collision detection in complex dynamic scenes using an LGMD-based visual neural network with feature enhancement.
Entropy (Basel). 2018 Jun 19;20(6):475. doi: 10.3390/e20060475.
4. Hough Transform Implementation For Event-Based Systems: Concepts and Challenges.
Front Comput Neurosci. 2018 Dec 21;12:103. doi: 10.3389/fncom.2018.00103. eCollection 2018.
5. A Motion-Based Feature for Event-Based Pattern Recognition.
Front Neurosci. 2017 Jan 4;10:594. doi: 10.3389/fnins.2016.00594. eCollection 2016.
6. An Event-Based Solution to the Perspective-n-Point Problem.
Front Neurosci. 2016 May 18;10:208. doi: 10.3389/fnins.2016.00208. eCollection 2016.
7. Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform.
Front Neurosci. 2016 Feb 16;10:35. doi: 10.3389/fnins.2016.00035. eCollection 2016.
8. Research topic: neuromorphic engineering systems and applications. A snapshot of neuromorphic systems engineering.
Front Neurosci. 2014 Dec 19;8:424. doi: 10.3389/fnins.2014.00424. eCollection 2014.
9. Collision detection in complex dynamic scenes using an LGMD-based visual neural network with feature enhancement.
IEEE Trans Neural Netw. 2006 May;17(3):705-16. doi: 10.1109/TNN.2006.873286.
10. Seeing what is coming: building collision-sensitive neurones.
Trends Neurosci. 1999 May;22(5):215-20. doi: 10.1016/s0166-2236(98)01332-0.
11. A theory of visual control of braking based on information about time-to-collision.
Perception. 1976;5(4):437-59. doi: 10.1068/p050437.