

A Distributed Swarm Optimizer With Adaptive Communication for Large-Scale Optimization.

Author Information

Yang Qiang, Chen Wei-Neng, Gu Tianlong, Zhang Huaxiang, Yuan Huaqiang, Kwong Sam, Zhang Jun

Publication Information

IEEE Trans Cybern. 2020 Jul;50(7):3393-3408. doi: 10.1109/TCYB.2019.2904543. Epub 2019 Apr 9.

Abstract

Large-scale optimization with high dimensionality and high computational cost has become ubiquitous. To tackle such challenging problems efficiently, devising distributed evolutionary computation algorithms is imperative. To this end, this paper proposes a distributed swarm optimizer based on a special master-slave model. Specifically, in this distributed optimizer, the master is mainly responsible for communication with the slaves, while each slave iterates a swarm to traverse the solution space. An asynchronous and adaptive communication strategy based on a request-response mechanism is devised to let the slaves communicate with the master efficiently. In particular, the communication between the master and each slave is adaptively triggered during the iteration. To help the slaves search the space efficiently, an elite-guided learning strategy is designed that uses elite particles in the current swarm, together with the historically best solutions found by different slaves, to guide the update of particles. Together, this distributed optimizer asynchronously iterates multiple swarms to collaboratively seek the optimum in parallel. Extensive experiments on a widely used large-scale benchmark set substantiate that the distributed optimizer can: 1) achieve competitive effectiveness in terms of solution quality compared with state-of-the-art large-scale methods; 2) accelerate execution in comparison with the sequential algorithm, obtaining almost linear speedup as the number of cores increases; and 3) preserve good scalability for solving higher-dimensional problems.
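The elite-guided learning strategy described above can be illustrated with a short sketch. The paper's exact update equations are not given in the abstract, so the function below is a hypothetical rendition: each non-elite particle is pulled toward a randomly chosen elite of its own swarm and toward a historical best solution from the cross-slave archive, with illustrative coefficient `phi` and a randomized inertia term. All names (`elite_guided_update`, `archive`, `elite_frac`) are assumptions for illustration, not the authors' API.

```python
import numpy as np

rng = np.random.default_rng(0)

def elite_guided_update(positions, velocities, fitness, archive,
                        elite_frac=0.2, phi=1.49):
    """Hypothetical sketch of elite-guided learning: each non-elite
    particle learns from a random elite in the current swarm and from a
    historically best solution shared by another slave (the archive)."""
    n, d = positions.shape
    order = np.argsort(fitness)            # smaller fitness = better
    n_elite = max(1, int(elite_frac * n))
    elites = positions[order[:n_elite]]    # elite particles are kept as-is
    for i in order[n_elite:]:              # update only non-elite particles
        e = elites[rng.integers(n_elite)]          # elite exemplar
        h = archive[rng.integers(len(archive))]    # cross-slave historical best
        r1, r2 = rng.random(d), rng.random(d)
        velocities[i] = (rng.random(d) * velocities[i]
                         + phi * r1 * (e - positions[i])
                         + phi * r2 * (h - positions[i]))
        positions[i] += velocities[i]
    return positions, velocities
```

In a full distributed run, each slave would apply an update like this to its own swarm between asynchronous request-response exchanges with the master, which maintains the shared archive of historical bests.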

