
All-optical nonlinear activation function based on stimulated Brillouin scattering.

Authors

Grigorii Slinkov, Steven Becker, Dirk Englund, Birgit Stiller

Affiliations

Max-Planck-Institute for the Science of Light, Staudtstr. 2, 91058 Erlangen, Germany.

Department of Physics, Friedrich-Alexander-Universität Erlangen-Nürnberg, Staudtstr. 7, 91058 Erlangen, Germany.

Publication

Nanophotonics. 2025 Feb 14;14(16):2711-2722. doi: 10.1515/nanoph-2024-0513. eCollection 2025 Aug.

Abstract

Optical neural networks have demonstrated their potential to overcome the computational bottleneck of modern digital electronics. However, their development towards high-performing computing alternatives is hindered by one of the optical neural networks' key components: the activation function. Most of the reported activation functions rely on opto-electronic conversion, sacrificing the unique advantages of photonics, such as resource-efficient coherent and frequency-multiplexed information encoding. Here, we experimentally demonstrate a photonic nonlinear activation function based on stimulated Brillouin scattering. It is coherent and frequency selective and can be tuned all-optically to take LeakyReLU, Sigmoid, and Quadratic shape. Our design compensates for the insertion loss automatically by providing net gain as high as 20 dB, paving the way for deep optical neural networks.
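The abstract names three activation shapes the device can be tuned to (LeakyReLU, Sigmoid, Quadratic) and a net gain of up to 20 dB. As a minimal sketch, the code below shows only the mathematical shapes of those activation functions and the standard decibel-to-linear-power conversion; it does not model the stimulated Brillouin scattering physics, and the LeakyReLU slope is a hypothetical choice for illustration.

```python
import math

def leaky_relu(x, alpha=0.1):
    """LeakyReLU: identity for x > 0, small slope alpha (hypothetical value) otherwise."""
    return x if x > 0 else alpha * x

def sigmoid(x):
    """Sigmoid: smooth saturation between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

def quadratic(x):
    """Quadratic: output proportional to the square of the input."""
    return x * x

def db_to_linear(db):
    """Convert a power gain in dB to a linear factor: 20 dB -> 100x."""
    return 10.0 ** (db / 10.0)

print(leaky_relu(2.0), leaky_relu(-1.0))  # 2.0 -0.1
print(sigmoid(0.0))                       # 0.5
print(db_to_linear(20.0))                 # 100.0
```

The 20 dB figure in the abstract thus corresponds to a 100-fold increase in optical power, which is what lets the activation stage compensate insertion loss in a deep network.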


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5840/12338876/417a9581b058/j_nanoph-2024-0513_fig_001.jpg
