Jarne Cecilia
Departamento de Ciencia y Tecnologia de la Universidad Nacional de Quilmes, Bernal, Quilmes, Buenos Aires, Argentina.
CONICET, Buenos Aires, Argentina.
Front Syst Neurosci. 2024 Mar 27;18:1269190. doi: 10.3389/fnsys.2024.1269190. eCollection 2024.
Training neural networks to perform different tasks is relevant across various disciplines. In particular, Recurrent Neural Networks (RNNs) are of great interest in computational neuroscience. Open-source machine-learning frameworks such as TensorFlow and Keras have produced significant changes in the development of the technologies we currently use. This work contributes by comprehensively investigating and describing the application of RNNs to temporal processing through a study of a 3-bit flip-flop memory implementation. We delve into the entire modeling process, encompassing the equations, task parametrization, and software development. The obtained networks are meticulously analyzed to elucidate their dynamics, aided by an array of visualization and analysis tools. Moreover, the provided code is versatile enough to facilitate the modeling of diverse tasks and systems. Furthermore, we show how memory states can be efficiently stored at the vertices of a cube in the dimensionally reduced space, supplementing previous results with a distinct approach.
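To make the task concrete, the 3-bit flip-flop memory described above can be sketched as a supervised sequence problem: each of three input channels occasionally receives a +1 or -1 pulse, and the corresponding output channel must hold the sign of its most recent pulse until the next one arrives. The function below is an illustrative NumPy sketch of such a trial generator, not the paper's actual code; the function name, pulse probability, and initial-state convention are assumptions for illustration.

```python
import numpy as np

def flipflop_task(n_steps=100, n_bits=3, p_pulse=0.05, seed=0):
    """Generate one trial of a 3-bit flip-flop task (illustrative sketch).

    Inputs: sparse +1/-1 pulses on each of n_bits channels (0 otherwise).
    Targets: each channel holds the sign of its most recent pulse.
    """
    rng = np.random.default_rng(seed)
    pulse_mask = rng.random((n_steps, n_bits)) < p_pulse          # where pulses occur
    signs = rng.choice([-1.0, 1.0], size=(n_steps, n_bits))       # pulse polarity
    inputs = np.where(pulse_mask, signs, 0.0)

    targets = np.zeros_like(inputs)
    state = np.ones(n_bits)  # assumed initial memory state: all channels at +1
    for t in range(n_steps):
        # a pulse overwrites the stored bit; otherwise the state is held
        state = np.where(inputs[t] != 0.0, inputs[t], state)
        targets[t] = state
    return inputs, targets

inputs, targets = flipflop_task()
```

Because the targets take only the values +1 and -1 on each channel, the eight possible memory states naturally correspond to the vertices of a cube, which is the geometric picture the abstract refers to in the dimensionally reduced space.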