Luis Antonio Aguirre, Rafael A. M. Lopes, Gleison F. V. Amaral, Christophe Letellier
Programa de Pós-Graduação em Engenharia Elétrica, Universidade Federal de Minas Gerais, Avenida Antônio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais, Brazil.
Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Feb;69(2 Pt 2):026701. doi: 10.1103/PhysRevE.69.026701. Epub 2004 Feb 9.
This paper addresses the training of network models from data produced by systems with symmetry properties. It is argued that although general networks are global approximators, in practice some properties, such as symmetry, are very hard to learn from data. In order to guarantee that the final network will be symmetrical, constraints are developed for two types of models, namely, the multilayer perceptron (MLP) network and the radial basis function (RBF) network. In global modeling problems it becomes crucial to impose conditions for symmetry in order to stand a chance of reproducing symmetry-related phenomena. Sufficient conditions are given for MLP and RBF networks to have a set of fixed points that is symmetrical with respect to the origin of the phase space. In the case of MLP networks, such conditions reduce to the absence of bias parameters and the requirement of odd activation functions. This turns out to be important from a dynamical point of view, since some phenomena are only observed in the context of symmetry, which is not a structurally stable property. The results are illustrated using benchmark systems that display symmetry, such as the Duffing-Ueda oscillator and the Lorenz system.
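A minimal numerical sketch (not taken from the paper's experiments) of the MLP condition stated above: with all bias parameters removed and an odd activation function such as tanh, a one-hidden-layer network defines an odd map, f(-x) = -f(x), so its set of fixed points is symmetrical with respect to the origin. The layer sizes and random weights below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative shapes: 3 inputs, 10 hidden units, 3 outputs; no bias vectors.
W1 = rng.standard_normal((10, 3))   # input  -> hidden weights
W2 = rng.standard_normal((3, 10))   # hidden -> output weights

def mlp(x):
    """Bias-free one-hidden-layer MLP with an odd activation (tanh)."""
    return W2 @ np.tanh(W1 @ x)

x = rng.standard_normal(3)
# tanh is odd and the linear maps have no bias, so the whole map is odd:
print(np.allclose(mlp(-x), -mlp(x)))  # True
```

Adding a bias term to either layer breaks this identity, which is why the sufficient conditions require its absence.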