Dehghani Mohammad Mehdi, Mehrany Khashayar, Memarian Mohammad
Opt Express. 2024 Oct 21;32(22):39160-39176. doi: 10.1364/OE.531165.
Optical neural networks (ONNs) are custom optical circuits promising a breakthrough in low-power, parallelized, and high-speed hardware for the growing demands of artificial intelligence applications. All-optical implementation of ONNs has proven burdensome, chiefly due to the lack of optical devices that can emulate the neurons' nonlinear activation function, forcing hybrid optical-electronic implementations. Moreover, ONNs suffer from a large footprint compared to their electronic (CMOS-based) counterparts. Utilizing virtual optical neurons in the time or frequency domain can reduce the number of required physical neurons, but an all-optical activation function is still needed, especially in deep networks whose layers each comprise multiple neurons. Here we propose an all-optical, multi-wavelength-channel rectified linear unit (ReLU) activation function that leverages χ nonlinearity across more than 100 wavelength channels simultaneously. Our design significantly reduces the footprint of ONNs by consolidating all of the nonlinear activation functions present in each layer of an ONN into a single broadband physical device. This enables the realization of all-optical, low-footprint ONNs with multiple layers made of several virtual neurons whose outputs are computed by a single ReLU activation function. We demonstrate this by simulating a 16-channel ReLU function in a realistic ONN and performing a multi-class classification task with a validation accuracy of 98.05%.
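The architectural idea in the abstract — many virtual neurons, each carried on its own wavelength channel, sharing a single broadband ReLU device — can be sketched numerically. The snippet below is a conceptual illustration only, not the paper's χ-nonlinearity model: the linear layer `W`, the channel count, and the random inputs are all placeholder assumptions.

```python
import numpy as np

def shared_relu(pre_activations):
    """Element-wise ReLU across all wavelength channels, emulating a single
    broadband nonlinear device that serves every virtual neuron in a layer."""
    return np.maximum(pre_activations, 0.0)

rng = np.random.default_rng(0)
n_channels = 16                                # 16 channels, as in the demo
W = rng.normal(size=(n_channels, n_channels))  # placeholder linear (weighting) stage
x = rng.normal(size=n_channels)                # input encoded on 16 wavelength channels

# One physical ReLU element computes all 16 activations at once.
y = shared_relu(W @ x)
print(y.shape)  # → (16,)
```

In a multi-layer network the same `shared_relu` call would be reused after every linear stage, which is the footprint saving the abstract describes: one nonlinear device instead of one per neuron.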