Natal Jordão, Ávila Ivonete, Tsukahara Victor Batista, Pinheiro Marcelo, Maciel Carlos Dias
Signal Processing Laboratory, Department of Electrical and Computing Engineering, University of São Paulo (USP), São Carlos 13566-590, Brazil.
Laboratory of Combustion and Carbon Capture, Department of Energy, School of Engineering, State University of São Paulo (Unesp), São Carlos 3566-590, Brazil.
Entropy (Basel). 2021 Oct 14;23(10):1340. doi: 10.3390/e23101340.
Entropy is a concept that emerged in the 19th century, during the Industrial Revolution, when it was associated with the heat harnessed by a thermal machine to perform work. However, the 20th century brought an unprecedented scientific revolution driven by one of its most essential innovations, information theory, which also encompasses a concept of entropy. This naturally raises the following question: "what is the difference, if any, between the concepts of entropy in each field of knowledge?" Misconceptions persist, as there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly described as "disorder", but this is a poor analogy, since "order" is a subjective human concept and "disorder" cannot always be inferred from entropy. This paper therefore presents a historical background on the evolution of the term "entropy" and provides mathematical evidence and logical arguments regarding its interconnection across various scientific areas, with the objective of offering a theoretical review and reference material for a broad audience.
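For reference, the two quantities the abstract contrasts can be stated side by side using their standard textbook forms (these formulas are well-known results and are not reproduced from the paper itself): the Gibbs entropy of statistical thermodynamics and the Shannon entropy of information theory, which differ only by Boltzmann's constant and the choice of logarithm base.

S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs entropy, J/K)}

H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon entropy, bits)}

Here p_i is the probability of microstate (or symbol) i, so the formal correspondence between the two definitions is a rescaling by k_B \ln 2, which is one standard framing of the interconnection the paper examines.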