Ihme Matthias, Marsden Alison L, Pitsch Heinz
Department of Mechanical Engineering, Stanford University, Stanford, CA 94305, USA.
Neural Comput. 2008 Feb;20(2):573-601. doi: 10.1162/neco.2007.08-06-316.
A pattern search optimization method is applied to the generation of optimal artificial neural networks (ANNs). Optimization is performed using a mixed-variable extension of the generalized pattern search method. This method offers the advantage that categorical variables, such as neural transfer functions and nodal connectivities, can be used as parameters in the optimization. When used together with a surrogate, the resulting algorithm is highly efficient for expensive objective functions. Results demonstrate the effectiveness of this method in optimizing an ANN for the number of neurons, the type of transfer function, and the connectivity among neurons. The optimization method is applied to a chemistry approximation of practical relevance. In this application, temperature and a chemical source term are approximated as functions of two independent parameters using optimal ANNs. Comparison of the performance of optimal ANNs with conventional tabulation methods demonstrates equivalent accuracy with considerable savings in memory storage. The architecture of the optimal ANN for the approximation of the chemical source term consists of a fully connected feedforward network having four nonlinear hidden layers and 117 synaptic weights. An equivalent representation of the chemical source term using tabulation techniques would require a 500 x 500 grid point discretization of the parameter space.
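To make the idea concrete, the following is a minimal illustrative sketch of a mixed-variable pattern search over ANN design parameters: discrete steps in the neuron count combined with categorical switches of the transfer function, accepting improving poll points and refining the mesh on failure. It is not the authors' implementation or their surrogate-assisted algorithm; the helper names (ann_error, poll_neighbors), the synthetic two-parameter target function, and the use of scikit-learn's MLPRegressor as a stand-in ANN are assumptions made for illustration only.

```python
# Illustrative mixed-variable pattern search over ANN hyperparameters (not the paper's code).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic smooth target of two independent parameters, standing in for the
# temperature / chemical source-term mapping approximated in the paper.
X = rng.uniform(0.0, 1.0, size=(2000, 2))
y = np.sin(3 * np.pi * X[:, 0]) * np.exp(-2.0 * X[:, 1])
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

ACTIVATIONS = ["tanh", "logistic", "relu"]   # categorical design variable

def ann_error(neurons, activation):
    """Validation MSE of an MLP with `neurons` units in each of two hidden layers."""
    net = MLPRegressor(hidden_layer_sizes=(neurons, neurons),
                       activation=activation, max_iter=2000, random_state=0)
    net.fit(X_train, y_train)
    return float(np.mean((net.predict(X_val) - y_val) ** 2))

def poll_neighbors(neurons, activation, step):
    """Mixed-variable poll set: integer steps in the neuron count plus
    switches of the categorical transfer function."""
    for n in (neurons + step, max(1, neurons - step)):
        yield n, activation
    for a in ACTIVATIONS:
        if a != activation:
            yield neurons, a

# Basic pattern search loop: accept improving poll points, otherwise shrink the mesh.
neurons, activation, step = 8, "tanh", 4
best = ann_error(neurons, activation)
while step >= 1:
    improved = False
    for n, a in poll_neighbors(neurons, activation, step):
        err = ann_error(n, a)
        if err < best:
            neurons, activation, best = n, a, err
            improved = True
            break
    if not improved:
        step //= 2   # mesh refinement when the poll fails
print(f"best design: {neurons} neurons/layer, {activation}, val MSE {best:.3e}")
```

In the paper, each objective evaluation (training and validating a candidate ANN) is expensive, which is why a surrogate model is used to rank poll points before the true objective is evaluated; the sketch above omits that step for brevity.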