Timescale Separation in Recurrent Neural Networks

Thomas Flynn
Graduate Center, City University of New York, New York, NY 10016, U.S.A.
Neural Comput. 2015 Jun;27(6):1321-44. doi: 10.1162/NECO_a_00740. Epub 2015 Mar 31.
Supervised learning in recurrent neural networks involves two processes: the neural activity dynamics, from which sensitivities of the error to the parameters (gradients) are estimated, and the learning dynamics induced on the connection parameters by these estimates. A problem such algorithms must address is how to balance the relative rates of these two processes, so that accurate sensitivity estimates are obtained while synaptic modification still proceeds at a rate sufficient for learning. We show how to calculate a sufficient timescale separation between these two processes for a class of contracting neural networks.
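To make the two-timescale structure concrete, below is a minimal Python sketch (not the paper's algorithm or its analysis): the state x relaxes quickly toward the fixed point of a contracting tanh rate network, while the weights W move much more slowly under a crude one-step gradient estimate. The network model, the learning rule, the spectral-norm projection that keeps the network contracting, and all parameter values are illustrative assumptions introduced here.

import numpy as np

rng = np.random.default_rng(0)
n = 5

# Weights scaled so the network is contracting: since |tanh'| <= 1, keeping
# the spectral norm of W below 1 makes x -> tanh(W x + u) a contraction.
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)
u = rng.standard_normal(n)                 # constant external input
target = np.tanh(rng.standard_normal(n))   # reachable supervised target

x = np.zeros(n)
dt = 0.1     # fast process: state-relaxation step size
eps = 1e-3   # slow process: learning rate, kept well below dt

for step in range(20000):
    # Fast dynamics: activity relaxes toward the fixed point x* = tanh(W x* + u).
    x += dt * (-x + np.tanh(W @ x + u))

    # Slow dynamics: a crude one-step sensitivity estimate standing in for
    # exact gradient dynamics; it treats the current (possibly unconverged)
    # state as if it were the fixed point.
    y = np.tanh(W @ x + u)
    grad = np.outer((1.0 - y ** 2) * (y - target), x)
    W -= eps * grad

    # Project back into the contracting class if the update left it.
    s = np.linalg.norm(W, 2)
    if s >= 0.95:
        W *= 0.95 / s

print("fixed-point error:", np.linalg.norm(x - target))

In this sketch the separation is imposed by hand through the ratio eps/dt: if eps is raised toward dt, the state no longer tracks the fixed point and the sensitivity estimates degrade. Quantifying how large that separation must be for a class of contracting networks is what the paper addresses.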