Entropy  

In 1930, Gilbert Newton Lewis gave a simple explanation: "Gain in entropy always means loss of information, and nothing more."
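
Read in information-theoretic terms, Lewis's remark says that entropy measures missing information. A minimal sketch of that reading (using Shannon entropy in bits, not any particular physical system): a fully determined state carries no missing information, while equal uncertainty over four states means two bits of information have been lost.

```python
import math

def shannon_entropy(probs):
    """Missing information in bits: H = sum of -p * log2(p) over p > 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Microstate fully known: no missing information, zero entropy.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0

# After losing track of which of four equally likely states holds,
# entropy has grown by 2 bits -- exactly the information lost.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```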

Thermodynamics and Information

The term entropy is used in thermodynamics, information theory and mathematics. In classical thermodynamics, entropy is a measure of the amount of energy in a physical system which cannot be used to do mechanical work. Here, the dimension of entropy is energy [Joule] divided by temperature [Kelvin].
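
To make those units concrete, a small worked example with illustrative values: for a reversible transfer of heat Q at constant absolute temperature T, the entropy change is ΔS = Q/T, in joules per kelvin.

```python
# Entropy change for a reversible transfer of heat Q at constant
# absolute temperature T: delta_S = Q / T, with units of J/K.
Q = 1000.0  # heat absorbed, in Joules (illustrative value)
T = 300.0   # absolute temperature, in Kelvins
delta_S = Q / T
print(f"delta S = {delta_S:.2f} J/K")  # delta S = 3.33 J/K
```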

However, in a system of measurement that uses thermal energy instead of temperature, entropy is a dimensionless factor. In such a system, to compute the share of a resource (e.g. energy or information) which is not available to any determined or determinable operation, the quantity of that resource (e.g. in the unit [Joule] or the pseudo-unit [bit]) is multiplied by that factor.
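
One way to read this, sketched here with Shannon entropy rather than a thermodynamic quantity (the source distribution and resource size below are hypothetical): normalizing entropy by its maximum gives a dimensionless factor between 0 and 1, and multiplying a resource quantity by that factor gives the share tied up in uncertainty.

```python
import math

def normalized_entropy(probs):
    """Shannon entropy divided by its maximum, log2(len(probs)):
    a dimensionless factor between 0 and 1."""
    h = sum(-p * math.log2(p) for p in probs if p > 0)
    return h / math.log2(len(probs))

# Hypothetical 4-symbol source and resource size.
factor = normalized_entropy([0.7, 0.1, 0.1, 0.1])
resource_bits = 1_000_000  # resource quantity in the pseudo-unit [bit]
print(f"factor = {factor:.3f}")
print(f"share  = {factor * resource_bits:,.0f} bits")
```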

Disorder

Explaining entropy using the term "disorder" often leads to confusion, because disorder itself can be understood in too many different ways.

Links

  • Mark Dow: The connection between thermodynamic entropy and information (http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html)

References

  • Ben-Naim, Arieh (2008): A Farewell to Entropy: Statistical Thermodynamics Based on Information (http://www.ariehbennaim.com/books/index.html)




Unless indicated otherwise, the text in this article is either based on Wikipedia article "Entropy" or another language Wikipedia page thereof used under the terms of the GNU Free Documentation License; or on research by Jahsonic and friends. See Art and Popular Culture's copyright notice.
