Ozcan Aydin, Coudert François-Xavier, Rogge Sven M J, Heydenrych Greta, Fan Dong, Sarikas Antonios P, Keskin Seda, Maurin Guillaume, Froudakis George E, Wuttke Stefan, Erucar Ilknur
TUBİTAK Marmara Research Center, Materials Technologies, Gebze, Kocaeli 41470, Türkiye.
Gebze Technical University, Gebze, Kocaeli 41400, Türkiye.
J Am Chem Soc. 2025 Jul 9;147(27):23367-23380. doi: 10.1021/jacs.5c08214. Epub 2025 Jun 24.
After the development of the famous "Transformer" network architecture and the meteoric rise of artificial intelligence (AI)-powered chatbots, large language models (LLMs) have become an indispensable part of our daily activities. In this rapidly evolving era, "all we need is attention," as the title of Google's famous Transformer paper [Vaswani et al., 30] implies: we need to focus on and give "attention" to what we have at hand, then consider what we can do next. What can LLMs offer for immediate, short-term adoption? Currently, the most common applications in metal-organic framework (MOF) research are automating literature reviews and data extraction to accelerate the materials discovery process. In this Perspective, we discuss the latest developments in machine-learning and deep-learning research on MOF materials and reflect on how their use has evolved within the LLM domain. Finally, we explore prospective benefits for accelerating and automating materials development research.