When the Prussian physicist Rudolf Clausius defined entropy in 1865, the idea of disorder was nowhere to be found. Clausius was seeking to explain mathematically the workings of energy in the Carnot heat engine, an optimised model of the heat engine (on which the original diesel engine was based) proposed four decades earlier by the French engineer Sadi Carnot. To express the unusable heat lost, he defined entropy (etymologically, a transformation of energy content), which measures how spontaneously a hot body gives up heat to a cold body as the system tends to equilibrium, unless interfered with to prevent it. This is why “entropy in a thermodynamic sense is an energy divided by a temperature,” as chemist-physicist Emil Roduner, professor emeritus at the University of Stuttgart (Germany), summarises to OpenMind. This spontaneous behaviour of a system is the basic foundation of the second law of thermodynamics, as intuited by Clausius years before his definition of entropy.

The formulation of entropy

In coining the term, the physicist ended his work by summarising the first two laws of thermodynamics in this way: “The energy of the universe is constant,” and “The entropy of the universe tends to a maximum.” But Clausius’ choice of term has not made the concept any easier to understand. As physicist and Nobel laureate Leon Cooper wrote in 1968, by choosing the term entropy “rather than extracting a name from the body of the current language (say: lost heat), he succeeded in coining a word that meant the same thing to everybody: nothing.” In 1904, thermodynamics mathematician George H. Bryan wrote in Nature that entropy is “that most difficult of all physical conceptions.” Years after Clausius’ definition, the Austrian physicist and philosopher Ludwig Boltzmann introduced the current formulation of entropy, giving it a statistical meaning that relates the microstates of matter (atoms, molecules) to the macrostate (observable) of the system.
There are certain words that can embellish any speech or quotation. Entropy is one of them: we find it in the unintelligible phrases of some famous spiritual guru, and even in self-help advice and motivational coaching. And, of course, we all know what entropy means: disorder. If we don’t get our house in order, we are told, entropy will eat us alive. Except that, in reality, it won’t: physicists are constantly explaining that no, entropy does not mean disorder. And yet, with this approximate meaning, it has almost become part of the lexicon of everyday life. But what does entropy actually mean? The Austrian Ludwig Boltzmann introduced the current formulation of entropy, giving it a statistical meaning, understood as a probability distribution over the different possible microstates.

So why is “disorder” a misleading interpretation of entropy? There are two main problems:

1. Disorder is a subjective description that has no rigorous definition.
2. Perhaps more importantly, disorder is usually taken to be a comment on the arrangement of a single microstate, yet entropy is defined in terms of all of the microstates for an energy configuration.

Frank Lambert has written many articles on interpreting the meaning of entropy.

So, if entropy is not disorder, what is it? The formal definition offered by Ludwig Boltzmann (and later written on his tombstone) is S = kB ln W, where S is the entropy of the system in a particular energy configuration (or macrostate), kB = 1.380×10−23 J/K is Boltzmann’s constant, and ln W is the natural logarithm of the number of microstates W for that macrostate. While this is the formal definition, it doesn’t provide us with much physical intuition. It helps to think about entropy as a measure of the spread of energy.

Essential to the Boltzmann definition of entropy is that energy is quantized. Our analysis works because there is a finite number of microstates for each energy configuration. If each bond of our Einstein solid could have any value of energy (such as 1.25 or 345.8461 quanta), then there would be an infinite number of microstates for any energy configuration. The fact that energy comes in integer multiples of a quantum makes the number of microstates countable. This is yet another piece of evidence that our universe is governed by quantum, rather than classical, physics. To see this in more detail, consider reading this paper and performing some of the interactive demonstrations therein.

Thomas Moore has developed a web application that models our two Einstein solid system. It allows the user to explore how, as the size of the system increases, the probability of the most likely energy configuration increases relative to other, less likely configurations.
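The counting behind these ideas is easy to sketch. For an Einstein solid of N oscillators sharing q indivisible quanta, the standard “stars and bars” result gives W(N, q) = C(q + N − 1, q) microstates, and Boltzmann’s formula turns that count into an entropy. The short Python sketch below is illustrative only (the function names and the two-solid demo are my own, not taken from Thomas Moore’s application), but it shows both the finite microstate count and the sharpening of the most likely macrostate as the system grows:

```python
from math import comb, log

kB = 1.380649e-23  # Boltzmann's constant, J/K

def multiplicity(N, q):
    """Microstate count W for an Einstein solid: N oscillators sharing
    q indivisible energy quanta ("stars and bars" counting)."""
    return comb(q + N - 1, q)

def entropy(N, q):
    """Boltzmann entropy S = kB * ln(W), in J/K."""
    return kB * log(multiplicity(N, q))

def macrostate_probs(NA, NB, q_total):
    """For two Einstein solids A and B sharing q_total quanta, return the
    probability of each macrostate (qA quanta in solid A), assuming every
    microstate of the combined system is equally likely."""
    weights = [multiplicity(NA, qA) * multiplicity(NB, q_total - qA)
               for qA in range(q_total + 1)]
    total = sum(weights)
    return [w / total for w in weights]

# The equal-sharing macrostate is always the most likely, and doubling the
# system size makes it stand out more against the alternatives:
p_small = macrostate_probs(2, 2, 4)  # peak at qA = 2
p_large = macrostate_probs(4, 4, 8)  # peak at qA = 4, relatively sharper
```

Note that the counting only works because `q` is an integer: if arbitrary fractions of a quantum were allowed, `multiplicity` would have no finite value to return, which is the quantization point made above.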