Entropy
From The Art and Popular Culture Encyclopedia
In 1930, Gilbert Newton Lewis gave a simple explanation: "Gain in entropy always means loss of information, and nothing more."
Thermodynamics and Information
The term entropy is used in thermodynamics, information theory and mathematics. In classical thermodynamics, entropy is a measure of the amount of energy in a physical system which cannot be used to do mechanical work. Here, the dimension of entropy is energy [joule] divided by temperature [kelvin].
However, if a measuring system is used in which temperature is expressed as thermal energy, entropy becomes a dimensionless factor. To compute, in such a measuring system, the share of a resource (e.g. energy or information) that is unavailable to any determined or determinable operation, the measure of quantity of that resource (e.g. in units of [joule] or the pseudo-unit [bit]) is multiplied by that factor.
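The relationship above can be sketched numerically. The following is a minimal illustration, not a definitive treatment: the function names and the Landauer-limit example are chosen here for illustration; the only facts assumed are the Boltzmann constant and the standard conversion S / (k_B ln 2) between entropy in joules per kelvin and entropy in bits.

```python
import math

# Boltzmann constant in J/K (CODATA 2018 exact value)
K_B = 1.380649e-23

def entropy_to_bits(s_joules_per_kelvin: float) -> float:
    """Convert thermodynamic entropy (J/K) into the dimensionless
    information measure in bits: S / (k_B * ln 2)."""
    return s_joules_per_kelvin / (K_B * math.log(2))

def unavailable_energy(s_joules_per_kelvin: float, temperature_k: float) -> float:
    """Energy (J) unavailable for mechanical work at temperature T,
    taken as T * S for the entropy share in question."""
    return temperature_k * s_joules_per_kelvin

# Illustrative check: the entropy of one bit is k_B * ln 2,
# so converting it back should give exactly 1 bit.
one_bit = K_B * math.log(2)                 # ~9.57e-24 J/K
print(entropy_to_bits(one_bit))             # → 1.0
print(unavailable_energy(one_bit, 300.0))   # ~2.87e-21 J at room temperature
```

The second print shows why one bit of entropy corresponds to a tiny but nonzero amount of energy rendered unavailable at a given temperature, which is the sense in which the dimensionless factor multiplies the quantity of the resource.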
Disorder
Explaining entropy using the term "disorder" often leads to confusion, because disorder itself can be understood in too many different ways.