Bauwens Ian, Harkhoe Krishan, Bienstman Peter, Verschaffelt Guy, Van der Sande Guy
Applied Physics Research Group, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium.
Photonics Research Group, Department of Information Technology, Ghent University-IMEC, Technologiepark Zwijnaarde 126, 9052 Ghent, Belgium.
Nanophotonics. 2022 Oct 18;12(5):949-961. doi: 10.1515/nanoph-2022-0399. eCollection 2023 Mar.
Photonic reservoir computing has been demonstrated to be able to solve various complex problems. Although training a reservoir computing system is much simpler than training other neural network approaches, it still requires considerable resources, which becomes an issue when retraining is required. Transfer learning is a technique that allows us to re-use information between tasks, thereby reducing the cost of retraining. We propose transfer learning as a viable technique to compensate for the unavoidable parameter drift in experimental setups. Compensating for this parameter drift usually requires retraining the system, which is time- and energy-consuming. Based on numerical studies of a delay-based reservoir computing system with semiconductor lasers, we investigate the use of transfer learning to mitigate these parameter fluctuations. Additionally, we demonstrate that transfer learning applied to two slightly different tasks allows us to reduce the number of input samples required for training the second task, thus reducing the amount of retraining.
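To illustrate the idea described in the abstract, the following is a minimal sketch, not the authors' laser-based model: a generic echo-state-style reservoir with a ridge-regression readout, where the readout weights trained on task A are reused as a warm start for a related task B that has far fewer training samples. The helper names (`reservoir_states`, `fit_readout`), the moving-average tasks, and the blending heuristic are illustrative assumptions, not the paper's benchmarks or method details.

```python
# Sketch of transfer learning between two reservoir computing tasks.
# Assumption: a simple tanh echo-state reservoir stands in for the
# delay-based semiconductor-laser reservoir studied in the paper.
import numpy as np

rng = np.random.default_rng(0)

def reservoir_states(u, n_nodes=100, spectral_radius=0.9, input_scale=0.5):
    """Drive a random recurrent reservoir with input u and collect its states."""
    W = rng.normal(size=(n_nodes, n_nodes))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    w_in = rng.uniform(-input_scale, input_scale, size=n_nodes)
    x = np.zeros(n_nodes)
    states = np.empty((len(u), n_nodes))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + w_in * ut)
        states[t] = x
    return states

def fit_readout(X, y, reg=1e-6, w_init=None, blend=0.0):
    """Ridge-regression readout; optionally blend the new solution with
    w_init (a crude warm start standing in for transferred task-A weights)."""
    w = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)
    if w_init is not None:
        w = blend * w_init + (1.0 - blend) * w
    return w

# Task A: plenty of training data; task B: a related task with few samples.
u = rng.uniform(0, 0.5, size=3000)
y_a = np.convolve(u, np.ones(3) / 3, mode="same")   # short moving average
y_b = np.convolve(u, np.ones(5) / 5, mode="same")   # slightly longer memory
X = reservoir_states(u)

w_a = fit_readout(X[:2000], y_a[:2000])              # full training on task A
w_b = fit_readout(X[:200], y_b[:200],                # few samples for task B,
                  w_init=w_a, blend=0.5)             # warm-started from task A

err = np.mean((X[2000:] @ w_b - y_b[2000:]) ** 2)
print(f"task-B test MSE with transferred readout: {err:.2e}")
```

In this toy setup, the transferred readout lets task B be trained with an order of magnitude fewer samples than task A; the same reasoning is what the abstract proposes for compensating parameter drift, where the "second task" is the drifted version of the original one.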