

Minimal perception: enabling autonomy in resource-constrained robots.

Authors

Singh Chahat Deep, He Botao, Fermüller Cornelia, Metzler Christopher, Aloimonos Yiannis

Affiliations

Perception and Robotics Group, Department of Computer Science, University of Maryland, College Park, MD, United States.

Perception, Robotics, AI and Sensing (PRAISe) Lab, Department of Mechanical Engineering, University of Colorado, Boulder, CO, United States.

Publication

Front Robot AI. 2024 Sep 18;11:1431826. doi: 10.3389/frobt.2024.1431826. eCollection 2024.

DOI: 10.3389/frobt.2024.1431826
PMID: 39360225
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11444933/
Abstract

The rapidly increasing capabilities of autonomous mobile robots promise to make them ubiquitous in the coming decade. These robots will continue to enhance efficiency and safety in novel applications such as disaster management, environmental monitoring, bridge inspection, and agricultural inspection. To operate autonomously without constant human intervention, even in remote or hazardous areas, robots must sense, process, and interpret environmental data using only onboard sensing and computation. This capability is made possible by advancements in perception algorithms, allowing these robots to rely primarily on their perception capabilities for navigation tasks. However, tiny robot autonomy is hindered mainly by sensors, memory, and computing due to size, area, weight, and power constraints. The bottleneck in these robots lies in the real-time perception in resource-constrained robots. To enable autonomy in robots of sizes that are less than 100 mm in body length, we draw inspiration from tiny organisms such as insects and hummingbirds, known for their sophisticated perception, navigation, and survival abilities despite their minimal sensor and neural system. This work aims to provide insights into designing a compact and efficient minimal perception framework for tiny autonomous robots from higher cognitive to lower sensor levels.


Figures (g001-g009):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/c4b010670c69/frobt-11-1431826-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/4e0655efd02b/frobt-11-1431826-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/63c469c7189f/frobt-11-1431826-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/7cc3a9318b5c/frobt-11-1431826-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/fde0a5ebac15/frobt-11-1431826-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/6dfa3493da56/frobt-11-1431826-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/a63156611acf/frobt-11-1431826-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/e108f17fa9b7/frobt-11-1431826-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/66d1/11444933/eae2fadebc53/frobt-11-1431826-g009.jpg

Similar articles

1. Minimal perception: enabling autonomy in resource-constrained robots.
Front Robot AI. 2024 Sep 18;11:1431826. doi: 10.3389/frobt.2024.1431826. eCollection 2024.
2. Insect-inspired AI for autonomous robots.
Sci Robot. 2022 Jun 15;7(67):eabl6334. doi: 10.1126/scirobotics.abl6334.
3. Toward Fully Automated Inspection of Critical Assets Supported by Autonomous Mobile Robots, Vision Sensors, and Artificial Intelligence.
Sensors (Basel). 2024 Jun 7;24(12):3721. doi: 10.3390/s24123721.
4. Ajna: Generalized deep uncertainty for minimal perception on parsimonious robots.
Sci Robot. 2023 Aug 16;8(81):eadd5139. doi: 10.1126/scirobotics.add5139.
5. A Sensor Fusion Method for Pose Estimation of C-Legged Robots.
Sensors (Basel). 2020 Nov 25;20(23):6741. doi: 10.3390/s20236741.
6. The Synthetic Moth: A Neuromorphic Approach toward Artificial Olfaction in Robots.
7. Visual route following for tiny autonomous robots.
Sci Robot. 2024 Jul 17;9(92):eadk0310. doi: 10.1126/scirobotics.adk0310.
8. A Literature Review on Safety Perception and Trust during Human-Robot Interaction with Autonomous Mobile Robots That Apply to Industrial Environments.
IISE Trans Occup Ergon Hum Factors. 2024 Jan-Jun;12(1-2):6-27. doi: 10.1080/24725838.2023.2283537. Epub 2024 Jan 8.
9. Reactive and Cognitive Search Strategies for Olfactory Robots.
10. Coordinated Navigation of Two Agricultural Robots in a Vineyard: A Simulation Study.
Sensors (Basel). 2022 Nov 23;22(23):9095. doi: 10.3390/s22239095.

References cited by this article

1. Microsaccade-inspired event camera for robotics.
Sci Robot. 2024 May 29;9(90):eadj8124. doi: 10.1126/scirobotics.adj8124.
2. Ajna: Generalized deep uncertainty for minimal perception on parsimonious robots.
Sci Robot. 2023 Aug 16;8(81):eadd5139. doi: 10.1126/scirobotics.add5139.
3. Active fixation as an efficient coding strategy for neuromorphic vision.
Sci Rep. 2023 May 8;13(1):7445. doi: 10.1038/s41598-023-34508-x.
4. Investigating deep optics model representation in affecting resolved all-in-focus image quality and depth estimation fidelity.
Opt Express. 2022 Sep 26;30(20):36973-36984. doi: 10.1364/OE.473084.
5. Active Mapping and Robot Exploration: A Survey.
Sensors (Basel). 2021 Apr 2;21(7):2445. doi: 10.3390/s21072445.
6. Dynamic obstacle avoidance for quadrotors with event cameras.
Sci Robot. 2020 Mar 18;5(40). doi: 10.1126/scirobotics.aaz9712.
7. Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-Shot Cross-Dataset Transfer.
IEEE Trans Pattern Anal Mach Intell. 2022 Mar;44(3):1623-1637. doi: 10.1109/TPAMI.2020.3019967. Epub 2022 Feb 3.
8. Revisiting active perception.
Auton Robots. 2018;42(2):177-196. doi: 10.1007/s10514-017-9615-3. Epub 2017 Feb 15.
9. Learned phase coded aperture for the benefit of depth of field extension.
Opt Express. 2018 Jun 11;26(12):15316-15331. doi: 10.1364/OE.26.015316.
10. The unsteady eye: an information-processing stage, not a bug.
Trends Neurosci. 2015 Apr;38(4):195-206. doi: 10.1016/j.tins.2015.01.005. Epub 2015 Feb 16.