Entropy  

From The Art and Popular Culture Encyclopedia

(Difference between revisions)
Revision as of 09:05, 1 June 2012
Jahsonic (Talk | contribs)

Current revision
Jahsonic (Talk | contribs)

Line 1: Line 1:
 +{| class="toccolours" style="float: left; margin-left: 1em; margin-right: 2em; font-size: 85%; background:#c6dbf7; color:black; width:30em; max-width: 40%;" cellspacing="5"
 +| style="text-align: left;" |
 +"It was not easy for a person brought up in the ways of classical [[thermodynamics]] to come around to the idea that gain of [[entropy]] eventually is nothing more nor less than [[loss of information]]."--[[Gilbert Newton Lewis]] in a letter to [[Irving Langmuir]], 5 Aug 1930
 +|}
{{Template}}
-In 1930, [[Gilbert Newton Lewis]] gave a simple explanation: ''"[[Gain in entropy always means loss of information, and nothing more]]."'' 
-== Thermodynamics and Information ==
 +'''Entropy''' is a scientific concept as well as a measurable physical property that is most commonly associated with a state of [[disorder]], [[randomness]], or [[uncertainty]]. The term and the concept are used in diverse fields, from [[classical thermodynamics]], where it was first recognized, to the microscopic description of nature in [[statistical physics]], and to the principles of [[information theory]]. It has found far-ranging applications in [[chemistry]] and [[physics]], in biological systems and their relation to life, in [[cosmology]], [[economics]], [[sociology]], [[Atmospheric science|weather science]], [[climate change]], and [[information system]]s including the transmission of information in [[telecommunication]].
-The term '''entropy''' is used in thermodynamics, information theory and mathematics. In classical thermodynamics, entropy is a [[measure]] of the [[amount]] of [[energy]] in a [[physical]] [[system]] which cannot be used to do [[mechanical work]]. Here, the dimension of entropy is [[energy]] <nowiki>[</nowiki>[[Joule]]<nowiki>]</nowiki> divided by temperature <nowiki>[</nowiki>[[Kelvin]]<nowiki>]</nowiki>.
 +== See also ==
 +* [[Autocatalytic reactions and order creation]]
 +* [[Boltzmann entropy]] – a type of Gibbs entropy, which neglects internal statistical correlations in the overall particle distribution
 +* [[Brownian ratchet]]
 +* [[Clausius–Duhem inequality]]
 +* [[Configuration entropy]]
 +* [[Conformational entropy]] – associated with the physical arrangement of a [[polymer]] chain that assumes a compact or [[globular protein|globular]] state in solution
 +* [[Departure function]]
 +* [[Enthalpy]]
 +* [[Entropic explosion]] – an explosion in which the reactants expand without releasing much heat
 +* [[Entropic force]]
 +* [[Entropy unit]]
 +* [[Entropic value at risk]]
 +* [[Entropy (information theory)]]
 +* [[Entropy (computing)]]
 +* [[Entropy (statistical thermodynamics)]]
 +* [[Entropy and life]]
 +* [[Entropy (order and disorder)]]
 +* [[Entropy of mixing]] – the change in the entropy when two different [[chemical substance]]s or [[component (thermodynamics)|components]] are mixed
 +* [[Entropy rate]]
 +* [[Entropy production]]
 +* [[Extropianism#Extropy|Extropy]]
 +* [[Free entropy]] – a thermodynamic potential analogous to free energy
 +* [[Geometrical frustration]]
 +* [[Gibbs entropy]] – a precise definition of entropy
 +* [[Harmonic entropy]]
 +* [[Heat death of the universe]]
 +* [[Info-metrics]]
 +* [[Laws of thermodynamics]]
 +* [[Loop entropy]] – the entropy lost upon bringing together two residues of a polymer within a prescribed distance
 +* [[Multiplicity function]]
 +* [[Negentropy]] (negative entropy)
 +* [[Orders of magnitude (entropy)]]
 +* [[Phase space#Thermodynamics and statistical mechanics|Phase space]]
 +* [[Principle of maximum entropy]]
 +* [[Residual entropy]] – the entropy present after a substance is cooled arbitrarily close to [[absolute zero]]
 +* [[Sackur–Tetrode equation|Sackur–Tetrode entropy]] – the entropy of a monatomic classical ideal gas determined via quantum considerations
 +* [[Standard molar entropy]] – the entropy in one mole of substance under [[standard temperature and pressure|STP]]
 +* [[Stirling's formula]]
 +* [[Thermodynamic databases for pure substances]]
 +* [[Thermodynamic potential]]
 +* [[Thermodynamic equilibrium]]
 +* [[Tsallis entropy]] – a generalization of the Boltzmann and Gibbs definitions
-However, if a measuring system is used where ''temperature'' has been replaced by ''thermal energy'', then entropy is a dimensionless factor. In order to compute, in such a measuring system, the share of a resource (e.g. energy or information) which is unavailable to any determined or determinable operation, the measure of quantity of that resource (e.g. using the unit <nowiki>[</nowiki>Joule<nowiki>]</nowiki> or the pseudo-unit <nowiki>[</nowiki>bit<nowiki>]</nowiki>) is multiplied by that factor. "Degree of unavailability" is the translation of "entropy" into plain English.
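
To make the two unit conventions concrete (entropy as energy divided by temperature, versus entropy as a dimensionless factor once temperature is expressed as a thermal energy), here is a minimal numerical sketch; the heat and temperature figures are invented for illustration, only the unit bookkeeping follows the text above:

<syntaxhighlight lang="python">
# Entropy change of an isothermal, reversible heat transfer, in the two
# unit conventions discussed above. Figures are illustrative only.

K_B = 1.380649e-23  # Boltzmann constant [J/K], exact since the 2019 SI

def entropy_change(heat_joule, temp_kelvin):
    """Classical (Clausius) entropy change dS = dQ/T:
    dimension [Joule] divided by [Kelvin] = [J/K]."""
    return heat_joule / temp_kelvin

def entropy_change_dimensionless(heat_joule, temp_kelvin):
    """The same quantity when temperature is replaced by the thermal
    energy k_B*T [J]: the joules cancel and entropy is a pure number."""
    return heat_joule / (K_B * temp_kelvin)

q, t = 1000.0, 300.0                       # 1 kJ transferred at 300 K
print(entropy_change(q, t))                # ~3.33 J/K
print(entropy_change_dimensionless(q, t))  # ~2.4e23, dimensionless
</syntaxhighlight>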
- 
-== Disorder == 
-Explaining entropy using the term "[[disorder]]" often leads to confusion, because disorder itself can be understood in too many different ways. Example: if entropy is computed based on the distribution of resources in a space, high concentration stands for minimum entropy and a completely even distribution stands for maximum entropy. Depending on personal views, either state could be seen as a desirable state of high order.
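
A minimal sketch of this example, assuming Shannon entropy over a discrete grid of spatial cells; the two distributions are invented for illustration:

<syntaxhighlight lang="python">
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * ln p) of a discrete distribution,
    in nats; zero-probability cells contribute nothing."""
    return -sum(p * math.log(p) for p in probabilities if p > 0)

# A resource spread over 8 spatial cells.
concentrated = [1.0] + [0.0] * 7      # everything in one cell
uniform      = [1.0 / 8] * 8          # completely even distribution

print(shannon_entropy(concentrated))  # 0.0          -> minimum entropy
print(shannon_entropy(uniform))       # ln(8) ~ 2.08 -> maximum entropy
</syntaxhighlight>

Whether the concentrated state (everything gathered in one place) or the uniform state (everything evenly spread) counts as "orderly" is exactly the ambiguity the paragraph above warns about; the entropy values themselves are unambiguous.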
- 
-== Significance of Entropy == 
-The availability (and unavailability) of energy at certain locations and times is significant to the organisms in the biosphere. Therefore, understanding entropy is important to humans. The biosphere is an open system, but this openness is ''bounded''. Thus, the consumption of resources (i.e. the increasing unavailability of energy and information) within the system can be compensated by the export of entropy (i.e. the import of availability of energy and information) only to a limited degree. If consumption exceeds that limit, "[[global warming]]" ''may'' be only one of the various possible effects of increased entropy. From a human viewpoint, the general effect of increasing entropy production beyond the limits of entropy compensation is the ''decreasing predictability of the processes within the biosphere''. Practically this means: the availability of resources becomes less secure.
- 
-== Redundancy == 
-Redundancy (based on the information-theoretic definition of redundancy in ISO 2382-16) is the gap between the maximum entropy which a system can handle and the entropy which the system is actually experiencing. In the econometrics of welfare economics, the ''Theil inequality measure'' is such a redundancy, as it expresses the distance between the entropy of a theoretically even resource distribution and the entropy of a measured resource distribution.
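
A minimal sketch of this reading of the Theil measure, with invented income figures: the index equals the entropy ln(N) of a theoretically even distribution minus the Shannon entropy of the measured shares, i.e. a redundancy in exactly the sense defined above:

<syntaxhighlight lang="python">
import math

def theil_index(amounts):
    """Theil inequality measure computed as a redundancy: the entropy
    ln(N) of a perfectly even distribution minus the Shannon entropy
    of the measured shares."""
    total = sum(amounts)
    shares = [a / total for a in amounts]
    actual_entropy = -sum(s * math.log(s) for s in shares if s > 0)
    max_entropy = math.log(len(amounts))
    return max_entropy - actual_entropy

print(theil_index([10, 10, 10, 10]))  # 0.0   -> even distribution, no redundancy
print(theil_index([97, 1, 1, 1]))     # ~1.22 -> high inequality
</syntaxhighlight>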
- 
-== Links == 
-*Mark Dow: ''[http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html The connection between thermodynamic entropy and information]'' 
- 
-== References == 
-*[[Arnheim, Rudolf]] (1971, 1983): ''[[Entropy and Art - An Essay on Disorder and Order]]'' (Interesting, but take careful note of the point of view from which the term "disorder" is used.)
-*[http://www.ariehbennaim.com/books/index.html Ben-Naim, Arieh] (2008): ''A Farewell to Entropy: Statistical Thermodynamics Based on Information'' 
{{GFDL}}
