Entropy
From The Art and Popular Culture Encyclopedia
Revision as of 19:53, 27 February 2010
In 1930, Gilbert Newton Lewis gave a simple explanation: "Gain in entropy always means loss of information, and nothing more."
Thermodynamics and Information
The term entropy is used in thermodynamics, information theory and mathematics. In classical thermodynamics, entropy is a measure of the amount of energy in a physical system which cannot be used to do mechanical work. Here, the dimension of entropy is energy [joule] divided by temperature [kelvin].
However, if a measuring system is used in which temperature is expressed as thermal energy, then entropy is a dimensionless factor. To compute, in such a measuring system, the share of a resource (e.g. energy or information) which is unavailable to any determined or determinable operation, the measure of quantity of that resource (e.g. in the unit [joule] or the pseudo-unit [bit]) is multiplied by that factor. "Degree of unavailability" is a translation of "entropy" into plain English.
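One way to picture such a dimensionless factor is to take the normalized Shannon entropy of a source, H divided by the maximum possible H, and multiply it by the quantity of the resource in bits. This is a minimal illustrative sketch, not the article's formal definition; the names and the choice of normalized Shannon entropy as the "degree of unavailability" are assumptions made here for illustration.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def unavailability_factor(probs):
    """Dimensionless factor in [0, 1]: actual entropy over maximum entropy.

    Maximum entropy is reached by the uniform distribution over the
    same number of outcomes, i.e. log2(n) bits.
    """
    h_max = math.log2(len(probs))
    return shannon_entropy(probs) / h_max

# A biased source: one symbol dominates, so entropy is below maximum.
probs = [0.7, 0.1, 0.1, 0.1]
factor = unavailability_factor(probs)

# Share of a 1000-bit resource treated as "unavailable" under this reading:
unavailable_bits = 1000 * factor
```

For a uniform distribution the factor is exactly 1 (everything is "unavailable" in this sense), and it shrinks as the distribution becomes more predictable.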
Disorder
Explaining entropy using the term "disorder" often leads to confusion, because disorder itself can be understood in too many different ways.
Significance of Entropy
The availability (and unavailability) of energy at certain locations and times is significant to the organisms in the biosphere. Therefore, understanding entropy is important to humans. The biosphere is an open system, but this openness is bounded. Thus, the consumption of resources (i.e. increasing unavailability of energy and information) within the system can be compensated by the export of entropy (i.e. the import of availability of energy and information) only to a limited degree. If consumption exceeds that limit, "global warming" may be only one of the various possible effects of increased entropy. From a human viewpoint, the general effect of increasing entropy production beyond the limits of entropy compensation is the decreasing predictability of the processes within the biosphere. Practically, this means that the availability of resources becomes less secure.
Redundancy
Redundancy (based on the information-theory definition of redundancy in ISO 2382-16) is the gap between the maximum entropy which a system can handle and the entropy which the system is actually experiencing.
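The gap described above can be sketched numerically: absolute redundancy as the difference between maximum and actual entropy, and relative redundancy as that difference over the maximum. This is a sketch of the usual information-theoretic reading, not a verbatim transcription of ISO 2382-16; the function names are chosen here for illustration.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Absolute redundancy in bits: maximum entropy minus actual entropy."""
    h_max = math.log2(len(probs))  # uniform distribution maximizes entropy
    return h_max - shannon_entropy(probs)

def relative_redundancy(probs):
    """Relative redundancy in [0, 1]: 1 - H / H_max."""
    return redundancy(probs) / math.log2(len(probs))

# A source with four symbols of unequal probability:
# H = 1.75 bits, H_max = 2 bits, so 0.25 bits of redundancy.
r = redundancy([0.5, 0.25, 0.125, 0.125])
```

A uniform source has zero redundancy (no gap between actual and maximum entropy); the more skewed the distribution, the larger the gap.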
References
- Arnheim, Rudolf (1974): Entropy and Art - An Essay on Disorder and Order. (Interesting, but always take note of the point of view from which the term "disorder" is used.)
- Ben-Naim, Arieh (2008): A Farewell to Entropy: Statistical Thermodynamics Based on Information.