
Model and Method for Providing Resilience to Resource-Constrained AI-Systems

Authors

Moskalenko Viacheslav, Kharchenko Vyacheslav, Semenov Serhii

Affiliations

Department of Computer Science, Sumy State University, 116, Kharkivska Str., 40007 Sumy, Ukraine.

Department of Computer Systems, Networks and Cybersecurity, National Aerospace University "KhAI", 17, Chkalov Str., 61070 Kharkiv, Ukraine.

Publication

Sensors (Basel). 2024 Sep 13;24(18):5951. doi: 10.3390/s24185951.

Abstract

Artificial intelligence technologies are becoming increasingly prevalent in resource-constrained, safety-critical embedded systems. Numerous methods exist to enhance the resilience of AI systems against disruptive influences. However, when resources are limited, ensuring cost-effective resilience becomes crucial. A promising approach for reducing the resource consumption of AI systems during test-time involves applying the concepts and methods of dynamic neural networks. Nevertheless, the resilience of dynamic neural networks against various disturbances remains underexplored. This paper proposes a model architecture and training method that integrate dynamic neural networks with a focus on resilience. Compared to conventional training methods, the proposed approach yields a 24% increase in the resilience of convolutional networks and a 19.7% increase in the resilience of visual transformers under fault injections. Additionally, it results in a 16.9% increase in the resilience of convolutional network ResNet-110 and a 21.6% increase in the resilience of visual transformer DeiT-S under adversarial attacks, while saving more than 30% of computational resources. Meta-training the neural network model improves resilience to task changes by an average of 22%, while achieving the same level of resource savings.
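The abstract's core mechanism is test-time resource saving via dynamic neural networks, which skip later computation when an intermediate classifier is already confident. As a minimal sketch of that general early-exit idea (not the paper's actual architecture or training method; all layer sizes, weights, and the confidence threshold below are illustrative assumptions), a NumPy toy network with one early-exit head might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights for a 2-stage network with an early-exit head.
# All shapes and the threshold are illustrative, not from the paper.
W1 = rng.normal(size=(8, 16)); b1 = np.zeros(16)
W_exit = rng.normal(size=(16, 3))           # early-exit classifier
W2 = rng.normal(size=(16, 16)); b2 = np.zeros(16)
W_final = rng.normal(size=(16, 3))          # final classifier

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def early_exit_forward(x, threshold=0.7):
    """Return (class probabilities, exited_early).

    Stage 2 runs only when the early-exit head is not confident
    enough -- this conditional skipping is where dynamic (early-exit)
    networks save test-time compute."""
    h1 = np.maximum(x @ W1 + b1, 0.0)       # stage 1 (ReLU MLP layer)
    p_exit = softmax(h1 @ W_exit)
    if p_exit.max() >= threshold:
        return p_exit, True                  # confident: skip stage 2
    h2 = np.maximum(h1 @ W2 + b2, 0.0)      # stage 2
    return softmax(h2 @ W_final), False

x = rng.normal(size=8)
probs, exited = early_exit_forward(x)
```

In the paper's setting, the interesting question is how such conditional computation interacts with fault injections and adversarial attacks, which is what the proposed training method addresses.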


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0d7f/11436138/4d668e9afdd3/sensors-24-05951-g001.jpg
