Entropy  

From The Art and Popular Culture Encyclopedia

{{Template}}
In [[statistical mechanics]], '''entropy''' is an [[Intensive and extensive properties#Extensive properties|extensive property]] of a [[thermodynamic system]]. It is closely related to the number {{math|Ω}} of microscopic configurations (known as [[Microstate (statistical mechanics)|microstates]]) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Entropy expresses the number {{math|Ω}} of different configurations that a system defined by macroscopic variables could assume.

In 1930, [[Gilbert Newton Lewis]] gave a simple explanation: ''"Gain in '''entropy''' always means loss of information, and nothing more."''
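Quantitatively, the microstate count {{math|Ω}} enters through Boltzmann's entropy formula, the standard relation of statistical mechanics, stated here for reference:

<math>S = k_\mathrm{B} \ln \Omega</math>

where {{math|''k''<sub>B</sub>}} = 1.380649 × 10<sup>−23</sup> J/K is the [[Boltzmann constant]]. Since the logarithm of a count is dimensionless, all of the units of entropy reside in the constant {{math|''k''<sub>B</sub>}}, a point taken up again below.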
== Thermodynamics and Information ==
The term '''entropy''' is used in thermodynamics, information theory and mathematics. In classical thermodynamics, entropy is a [[measure]] of the [[amount]] of [[energy]] in a [[physical]] [[system]] which cannot be used to do [[mechanical work]]. Here, the dimension of entropy is [[energy]] <nowiki>[</nowiki>[[Joule]]<nowiki>]</nowiki> divided by temperature <nowiki>[</nowiki>[[Kelvin]]<nowiki>]</nowiki>.

However, if a system of measurement is used that measures thermal energy instead of temperature, entropy becomes a dimensionless factor. To compute, in such a system, the share of a resource (e.g. energy or information) which is unavailable to any determined or determinable operation, the quantity of that resource (e.g. in <nowiki>[</nowiki>Joule<nowiki>]</nowiki> or the pseudo-unit <nowiki>[</nowiki>bit<nowiki>]</nowiki>) is multiplied by that factor. "Degree of unavailability" is a translation of "entropy" into plain English.
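As a rough illustration of the two conventions just described, the following Python sketch computes the entropy of the same microstate count both in <nowiki>[</nowiki>Joule<nowiki>]</nowiki> per <nowiki>[</nowiki>Kelvin<nowiki>]</nowiki> and in the dimensionless pseudo-unit <nowiki>[</nowiki>bit<nowiki>]</nowiki>; the system of 2<sup>10</sup> equally likely microstates is a hypothetical example, not taken from the article:

<pre>
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def entropy_joules_per_kelvin(omega: int) -> float:
    """Classical convention: S = k_B * ln(Omega), dimension [Joule/Kelvin]."""
    return K_B * math.log(omega)

def entropy_bits(omega: int) -> float:
    """Dimensionless convention: S = log2(Omega), pseudo-unit [bit]."""
    return math.log2(omega)

omega = 2 ** 10  # hypothetical system with 1024 equally likely microstates
print(entropy_joules_per_kelvin(omega))  # ~9.57e-23 J/K
print(entropy_bits(omega))               # 10.0 bit
</pre>

The only difference between the two functions is the constant in front of the logarithm, which is what the paragraph above means by entropy becoming a dimensionless factor.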
== Disorder ==
Explaining entropy using the term "[[disorder]]" often leads to confusion, because disorder itself can be understood in too many different ways.

== Significance of Entropy ==
The availability (and unavailability) of energy at certain locations and times is significant to the organisms in the biosphere; therefore, understanding entropy is important to humans. The biosphere is an open system, but this openness is ''bounded''. Thus, the consumption of resources (i.e. the increasing unavailability of energy and information) within the system can be compensated by the export of entropy (i.e. the import of availability of energy and information) only to a limited degree. If consumption exceeds that limit, "[[global warming]]" may be only one of the various effects of increased entropy. From a human viewpoint, the general effect of increasing entropy production beyond the limits of entropy compensation is the ''decreasing predictability of the processes within the biosphere''. In practice, this means that the availability of resources becomes less secure.

== Redundancy ==
Redundancy (following the definition of redundancy in ISO 2382-16) is the gap between the maximum entropy which a system can handle and the entropy which the system is actually experiencing.
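A minimal sketch of this definition, assuming the usual information-theoretic formulation in which a source with {{math|''n''}} symbols has a maximum entropy of log<sub>2</sub> {{math|''n''}} bits per symbol; the four-symbol distribution is a hypothetical example:

<pre>
import math

def shannon_entropy_bits(probs):
    """Actual entropy H = -sum(p * log2(p)) of a symbol distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy_bits(probs):
    """Gap between the maximum entropy log2(n) and the actual entropy, in bits."""
    h_max = math.log2(len(probs))
    return h_max - shannon_entropy_bits(probs)

# Hypothetical four-symbol source, far from uniform and hence redundant.
probs = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy_bits(probs))  # ~1.357 bits per symbol
print(redundancy_bits(probs))       # ~0.643 bits (the maximum is 2 bits)
</pre>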
== See also ==
* [[Autocatalytic reactions and order creation]]
* [[Brownian ratchet]]
* [[Clausius–Duhem inequality]]
* [[Configuration entropy]]
* [[Departure function]]
* [[Enthalpy]]
* [[Entropic force]]
* [[Entropic value at risk]]
* [[Entropy (information theory)]]
* [[Entropy (computing)]]
* [[Entropy and life]]
* [[Entropy (order and disorder)]]
* [[Entropy rate]]
* [[Entropy production]]
* [[Extropianism#Extropy|Extropy]]
* [[Geometrical frustration]]
* [[Harmonic entropy]]
* [[Heat death of the universe]]
* [[Info-metrics]]
* [[Laws of thermodynamics]]
* [[Multiplicity function]]
* [[Negentropy]] (negative entropy)
* [[Orders of magnitude (entropy)]]
* [[Phase space#Thermodynamics and statistical mechanics|Phase space]]
* [[Principle of maximum entropy]]
* [[Stirling's formula]]
* [[Thermodynamic databases for pure substances]]
* [[Thermodynamic potential]]
* [[Thermodynamic equilibrium]]
* Wavelet entropy
== Links ==
* Mark Dow: ''[http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html The connection between thermodynamic entropy and information]''

== References ==
* [http://www.ariehbennaim.com/books/index.html Ben-Naim, Arieh] (2008). ''A Farewell to Entropy: Statistical Thermodynamics Based on Information''.
{{GFDL}}
