Martin Charles E., Reggia James A.
HRL Laboratories, LLC, 3011 Malibu Canyon Road, Malibu, CA 90265, USA.
Department of Computer Science, University of Maryland, College Park, MD 20742, USA.
Comput Intell Neurosci. 2015;2015:642429. doi: 10.1155/2015/642429. Epub 2015 Aug 4.
Optimizing a neural network's topology is a difficult problem for at least two reasons: the topology space is discrete, and the quality of any given topology must be assessed by assigning many different sets of weights to its connections. These two characteristics tend to produce very "rough" objective functions. Here we demonstrate how self-assembly (SA) and particle swarm optimization (PSO) can be integrated to provide a novel and effective means of concurrently optimizing a neural network's weights and topology. Combining SA and PSO addresses two key challenges. First, it creates a more integrated representation of neural network weights and topology, yielding a single, continuous search domain that permits "smoother" objective functions. Second, it extends the traditional focus of self-assembly, from the growth of predefined target structures, to functional self-assembly, in which growth is driven by optimality criteria defined in terms of the performance of emerging structures on predefined computational problems. Our model incorporates a new way of viewing PSO that involves a population of growing, interacting networks, as opposed to particles. The effectiveness of our method for optimizing echo state network weights and topologies is demonstrated through its performance on a number of challenging benchmark problems.
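The abstract's central algorithmic idea, searching a single continuous domain that determines both a network's weights and its topology, can be made concrete with a small sketch. The Python code below is an illustration only, not the authors' SA+PSO model: it replaces their growth-based self-assembly with a simple thresholded encoding in which each PSO particle is a flat vector of candidate reservoir weights and near-zero entries are pruned away, so every particle simultaneously specifies a topology and its weights. The reservoir size, pruning threshold, PSO coefficients, and the toy sine-prediction task are all assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_RES = 30      # reservoir size (illustrative choice)
THRESH = 0.1    # entries with |w| < THRESH count as absent connections

def esn_error(flat_w, w_in, u, y_target, washout=50):
    """One-step-ahead prediction MSE of an ESN whose reservoir matrix is
    decoded from a flat continuous vector; topology emerges by pruning
    near-zero weights, so one continuous vector fixes weights AND topology."""
    W = flat_w.reshape(N_RES, N_RES).copy()
    W[np.abs(W) < THRESH] = 0.0                    # continuous values -> topology
    radius = np.max(np.abs(np.linalg.eigvals(W)))
    if radius > 1e-12:
        W *= 0.9 / radius                          # enforce the echo state property
    x, states = np.zeros(N_RES), []
    for t in range(len(u)):
        x = np.tanh(W @ x + w_in * u[t])
        states.append(x.copy())
    X, y = np.array(states)[washout:], y_target[washout:]
    w_out, *_ = np.linalg.lstsq(X, y, rcond=None)  # linear readout by least squares
    return float(np.mean((X @ w_out - y) ** 2))

# Toy task (an assumption, standing in for the paper's benchmarks):
# predict the next sample of a sine wave from the current one.
t = np.linspace(0.0, 20.0 * np.pi, 600)
u, y_target = np.sin(t), np.sin(t + (t[1] - t[0]))
w_in = rng.uniform(-0.5, 0.5, N_RES)               # fixed random input weights

# Standard global-best PSO over the flat weight vector: the single,
# continuous search domain the abstract argues for.
DIM, N_PART = N_RES * N_RES, 20
pos = rng.uniform(-0.5, 0.5, (N_PART, DIM))
vel = np.zeros((N_PART, DIM))
pbest = pos.copy()
pbest_f = np.array([esn_error(p, w_in, u, y_target) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(30):
    r1, r2 = rng.random((2, N_PART, DIM))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([esn_error(p, w_in, u, y_target) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print(f"best one-step prediction MSE: {pbest_f.min():.6f}")
```

The point of the thresholded encoding is that small, smooth moves of a particle can add or remove connections as weights cross the threshold, so topology changes without ever leaving the continuous domain; the paper's functional self-assembly plays this role in the actual model.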