
Suppr 超能文献



TactiGraph: An Asynchronous Graph Neural Network for Contact Angle Prediction Using Neuromorphic Vision-Based Tactile Sensing.

Authors

Sajwani Hussain, Ayyad Abdulla, Alkendi Yusra, Halwani Mohamad, Abdulrahman Yusra, Abusafieh Abdulqader, Zweiri Yahya

Affiliations

UAE National Service & Reserve Authority, Abu Dhabi, United Arab Emirates.

Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi 127788, United Arab Emirates.

Publication

Sensors (Basel). 2023 Jul 17;23(14):6451. doi: 10.3390/s23146451.

DOI: 10.3390/s23146451
PMID: 37514745
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10383597/
Abstract

Vision-based tactile sensors (VBTSs) have become the de facto method for giving robots the ability to obtain tactile feedback from their environment. Unlike other solutions to tactile sensing, VBTSs offer high spatial resolution feedback without compromising on instrumentation costs or incurring additional maintenance expenses. However, conventional cameras used in VBTS have a fixed update rate and output redundant data, leading to computational overhead. In this work, we present a neuromorphic vision-based tactile sensor (N-VBTS) that employs observations from an event-based camera for contact angle prediction. In particular, we design and develop a novel graph neural network, dubbed TactiGraph, that asynchronously operates on graphs constructed from raw N-VBTS streams exploiting their spatiotemporal correlations to perform predictions. Although conventional VBTSs use an internal illumination source, TactiGraph is reported to perform efficiently in both scenarios (with and without an internal illumination source), thus further reducing instrumentation costs. Rigorous experimental results revealed that TactiGraph achieved a mean absolute error of 0.62° in predicting the contact angle and was faster and more efficient than both conventional VBTS and other N-VBTS, with lower instrumentation costs. Specifically, N-VBTS requires only 5.5% of the computing time needed by VBTS when both are tested on the same scenario.

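The abstract describes graphs constructed from raw event-camera streams by exploiting their spatiotemporal correlations. As a minimal sketch of one plausible construction (not the authors' actual pipeline; the k-nearest-neighbour rule, `k`, and `time_scale` are illustrative assumptions), each event `(x, y, t)` can be treated as a node and connected to its nearest neighbours in a scaled space-time metric:

```python
import numpy as np

def build_event_graph(events, k=4, time_scale=1e-3):
    """Build a k-nearest-neighbour graph over events in (x, y, t) space.

    events: (N, 3) array of [x_pixel, y_pixel, t_seconds].
    time_scale converts seconds into a distance comparable to pixels;
    it is a free parameter here, not a value from the paper.
    Returns a directed edge list of shape (N * k, 2).
    """
    pts = events.astype(float).copy()
    pts[:, 2] /= time_scale  # weight the temporal axis against pixel axes
    # pairwise squared distances in the scaled (x, y, t) space
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)  # exclude self-loops
    nbrs = np.argsort(d2, axis=1)[:, :k]  # k nearest neighbours per event
    src = np.repeat(np.arange(len(pts)), k)
    return np.stack([src, nbrs.ravel()], axis=1)

# toy stream: two spatially separated bursts of three events each
ev = np.array([[10, 10, 0.000],
               [11, 10, 0.001],
               [10, 11, 0.002],
               [50, 50, 0.000],
               [51, 50, 0.001],
               [50, 51, 0.002]])
edges = build_event_graph(ev, k=2)
print(edges.shape)  # (12, 2): 6 events x 2 neighbours each
```

Because the two bursts are far apart in pixel space, every edge stays within its own burst, illustrating how spatiotemporal locality shapes the graph a GNN like TactiGraph would then operate on.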

Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/da86/10383597/b877f953ca7c/sensors-23-06451-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/da86/10383597/78e3561c75fb/sensors-23-06451-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/da86/10383597/8486ca6951cd/sensors-23-06451-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/da86/10383597/f556125f3e0c/sensors-23-06451-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/da86/10383597/5e10416aa3ac/sensors-23-06451-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/da86/10383597/80211ef11553/sensors-23-06451-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/da86/10383597/5ba713fec273/sensors-23-06451-g007.jpg

Similar articles

1
TactiGraph: An Asynchronous Graph Neural Network for Contact Angle Prediction Using Neuromorphic Vision-Based Tactile Sensing.
Sensors (Basel). 2023 Jul 17;23(14):6451. doi: 10.3390/s23146451.
2
Design and Evaluation of a Rapid Monolithic Manufacturing Technique for a Novel Vision-Based Tactile Sensor: C-Sight.
Sensors (Basel). 2024 Jul 16;24(14):4603. doi: 10.3390/s24144603.
3
Multidimensional Tactile Sensor with a Thin Compound Eye-Inspired Imaging System.
Soft Robot. 2022 Oct;9(5):861-870. doi: 10.1089/soro.2020.0202. Epub 2021 Oct 7.
4
VT-SGN: Spiking Graph Neural Network for Neuromorphic Visual-Tactile Fusion.
IEEE Trans Haptics. 2024 Sep 27;PP. doi: 10.1109/TOH.2024.3449411.
5
GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force.
Sensors (Basel). 2017 Nov 29;17(12):2762. doi: 10.3390/s17122762.
6
Event-Based Robotic Grasping Detection With Neuromorphic Vision Sensor and Event-Grasping Dataset.
Front Neurorobot. 2020 Oct 8;14:51. doi: 10.3389/fnbot.2020.00051. eCollection 2020.
7
Fully neuromorphic vision and control for autonomous drone flight.
Sci Robot. 2024 May 15;9(90):eadi0591. doi: 10.1126/scirobotics.adi0591.
8
Sim-to-Real for High-Resolution Optical Tactile Sensing: From Images to Three-Dimensional Contact Force Distributions.
Soft Robot. 2022 Oct;9(5):926-937. doi: 10.1089/soro.2020.0213. Epub 2021 Nov 25.
9
Neuromorphic Vision Based Contact-Level Classification in Robotic Grasping Applications.
Sensors (Basel). 2020 Aug 21;20(17):4724. doi: 10.3390/s20174724.
10
Transfer of Learning from Vision to Touch: A Hybrid Deep Convolutional Neural Network for Visuo-Tactile 3D Object Recognition.
Sensors (Basel). 2020 Dec 27;21(1):113. doi: 10.3390/s21010113.

Cited by

1
Vision-guided robotic system for aero-engine inspection and dynamic balancing.
Sci Rep. 2024 Dec 28;14(1):30742. doi: 10.1038/s41598-024-80540-w.

References

1
Opportunities for neuromorphic computing algorithms and applications.
Nat Comput Sci. 2022 Jan;2(1):10-19. doi: 10.1038/s43588-021-00184-y. Epub 2022 Jan 31.
2
Elastomer-Based Visuotactile Sensor for Normality of Robotic Manufacturing Systems.
Polymers (Basel). 2022 Nov 24;14(23):5097. doi: 10.3390/polym14235097.
3
Proprioception and Exteroception of a Soft Robotic Finger Using Neuromorphic Vision-Based Sensing.
Soft Robot. 2023 Jun;10(3):467-481. doi: 10.1089/soro.2022.0030. Epub 2022 Oct 14.
4
Neuromorphic Tactile Edge Orientation Classification in an Unsupervised Spiking Neural Network.
Sensors (Basel). 2022 Sep 15;22(18):6998. doi: 10.3390/s22186998.
5
Neuromorphic Camera Denoising Using Graph Neural Network-Driven Transformers.
IEEE Trans Neural Netw Learn Syst. 2024 Mar;35(3):4110-4124. doi: 10.1109/TNNLS.2022.3201830. Epub 2024 Feb 29.
6
HiVTac: A High-Speed Vision-Based Tactile Sensor for Precise and Real-Time Force Reconstruction with Fewer Markers.
Sensors (Basel). 2022 May 31;22(11):4196. doi: 10.3390/s22114196.
7
Artificial SA-I, RA-I and RA-II/vibrotactile afferents for tactile sensing of texture.
J R Soc Interface. 2022 Apr;19(189):20210603. doi: 10.1098/rsif.2021.0603. Epub 2022 Apr 6.
8
Low Cost and Latency Event Camera Background Activity Denoising.
IEEE Trans Pattern Anal Mach Intell. 2023 Jan;45(1):785-795. doi: 10.1109/TPAMI.2022.3152999. Epub 2022 Dec 5.
9
Detecting the normal-direction in automated aircraft manufacturing based on adaptive alignment.
Sci Prog. 2020 Oct-Dec;103(4):36850420981212. doi: 10.1177/0036850420981212.
10
Neuromorphic Vision Based Contact-Level Classification in Robotic Grasping Applications.
Sensors (Basel). 2020 Aug 21;20(17):4724. doi: 10.3390/s20174724.