Otwinowski Jakub, LaMont Colin H, Nourmohammad Armita
Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany.
Physics Department, University of Washington, Seattle, WA 98195, USA.
Entropy (Basel). 2020 Aug 31;22(9):967. doi: 10.3390/e22090967.
Evolutionary algorithms, inspired by natural evolution, aim to optimize difficult objective functions without computing derivatives. Here we detail the relationship between classical population genetics of quantitative traits and evolutionary optimization, and formulate a new evolutionary algorithm. Optimization of a continuous objective function is analogous to searching for high fitness phenotypes on a fitness landscape. We describe how natural selection moves a population along the non-Euclidean gradient that is induced by the population on the fitness landscape (the natural gradient). We show how selection is related to Newton's method in optimization under quadratic fitness landscapes, and how selection increases fitness at the cost of reducing diversity. We describe the generation of new phenotypes and introduce an operator that recombines the whole population to generate variants. Finally, we introduce a proof-of-principle algorithm that combines natural selection, our recombination operator, and an adaptive method to increase selection and find the optimum. The algorithm is extremely simple in implementation; it has no matrix inversion or factorization, does not require storing a covariance matrix, and may form the basis of more general model-based optimization algorithms with natural gradient updates.
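The abstract does not give the reference implementation, but a minimal sketch of how its ingredients could fit together is shown below, assuming Boltzmann-weighted resampling for selection, a coordinate-wise recombination over the whole population, and a crude rule that raises the selection strength when progress stalls. The function names (select, recombine, optimize), the mutation scale, and the quadratic test objective are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def select(pop, fitness, beta):
    """Resample the population with Boltzmann weights exp(beta * fitness).

    High-fitness phenotypes are duplicated and low-fitness ones are lost,
    which raises mean fitness at the cost of diversity (assumed selection rule).
    """
    w = np.exp(beta * (fitness - fitness.max()))  # subtract max for numerical stability
    w /= w.sum()
    idx = rng.choice(len(pop), size=len(pop), p=w)
    return pop[idx]

def recombine(pop):
    """Whole-population recombination: each coordinate of every offspring is
    drawn from a uniformly random member of the parent population (assumed form)."""
    n, d = pop.shape
    parents = rng.integers(0, n, size=(n, d))
    return pop[parents, np.arange(d)]

def optimize(objective, x0, pop_size=64, sigma=0.5, beta=1.0, iters=200):
    """Proof-of-principle loop: selection, recombination, small mutations,
    and an adaptive increase of the selection strength beta when fitness stalls."""
    d = len(x0)
    pop = x0 + sigma * rng.standard_normal((pop_size, d))
    best = -np.inf
    for _ in range(iters):
        fitness = np.array([objective(x) for x in pop])
        if fitness.max() <= best:
            beta *= 1.5  # no improvement: select more strongly (assumed adaptive rule)
        best = max(best, fitness.max())
        pop = select(pop, fitness, beta)
        pop = recombine(pop)
        pop += 0.01 * sigma * rng.standard_normal(pop.shape)  # fresh variation
    return pop[np.argmax([objective(x) for x in pop])]

# Usage: maximize fitness = -||x||^2, a quadratic landscape with its optimum at 0.
x_best = optimize(lambda x: -np.sum(x**2), x0=np.ones(5))
print(x_best)
```

Note that, consistent with the abstract's claims, this sketch involves no matrix inversion or factorization and never stores a covariance matrix; the population itself carries the search distribution.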