Entropy  

Entropy is a thermodynamic property that can be used to determine the energy not available for work in a thermodynamic process, such as in energy conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work, entropy accumulates in the system and is then carried away in the form of waste heat.
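
As a concrete illustration of that theoretical maximum, the sketch below computes the Carnot efficiency of an idealized heat engine; the reservoir temperatures are assumed values chosen only for the example.

    # Carnot efficiency of an ideal heat engine between two reservoirs.
    # The temperatures are assumed values, chosen only for illustration.
    T_hot = 600.0    # kelvin, hot reservoir
    T_cold = 300.0   # kelvin, cold reservoir

    eta_max = 1.0 - T_cold / T_hot
    print(f"theoretical maximum efficiency = {eta_max:.0%}")  # 50%

    # The remaining 50% of the input heat cannot be converted to work;
    # it leaves the engine as waste heat.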

In classical thermodynamics, the concept of entropy is defined phenomenologically by the second law of thermodynamics, which states that the entropy of an isolated system always increases or remains constant. Thus, entropy is also a measure of the tendency of a process, such as a chemical reaction, to proceed spontaneously in a particular direction. It follows that thermal energy always flows spontaneously from regions of higher temperature to regions of lower temperature, in the form of heat. These processes reduce the state of order of the initial systems, and therefore entropy is an expression of disorder or randomness. This is the basis of the modern microscopic interpretation of entropy in statistical mechanics, where entropy is defined as the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. The second law is then a consequence of this definition and the fundamental postulate of statistical mechanics.
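
The microscopic definition can be made concrete with the Gibbs formula S = -k_B Σ p_i ln p_i, which measures the missing information about the exact microstate. The Python sketch below uses made-up probabilities for a four-state toy system; with k = k_B the result is thermodynamic entropy in J/K, and with k = 1/ln 2 the same sum is read as information in bits.

    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K

    def gibbs_entropy(probabilities, k=K_B):
        """S = -k * sum(p * ln p) over the microstate probabilities."""
        return -k * sum(p * math.log(p) for p in probabilities if p > 0)

    # Toy four-state system with illustrative probabilities.
    uniform = [0.25, 0.25, 0.25, 0.25]   # maximal missing information
    peaked  = [0.97, 0.01, 0.01, 0.01]   # state almost fully specified

    print(gibbs_entropy(uniform))                     # larger entropy, in J/K
    print(gibbs_entropy(peaked))                      # smaller entropy, in J/K
    print(gibbs_entropy(uniform, k=1 / math.log(2)))  # same distribution: 2.0 bits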

Thermodynamic entropy has the dimension of energy divided by temperature; its unit in the International System of Units is the joule per kelvin (J/K).
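
A worked example of the classical (Clausius) relation ΔS = Q/T shows where this unit comes from; the heat quantity and reservoir temperatures below are assumed numbers.

    # Entropy change when 1000 J of heat flows from a hot to a cold reservoir.
    Q = 1000.0       # joules, assumed amount of heat transferred
    T_hot = 500.0    # kelvin
    T_cold = 300.0   # kelvin

    dS_hot = -Q / T_hot     # hot reservoir loses entropy:  -2.00 J/K
    dS_cold = Q / T_cold    # cold reservoir gains entropy: +3.33 J/K

    print(f"total entropy change = {dS_hot + dS_cold:+.2f} J/K")  # +1.33 J/K, > 0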

The term entropy was coined in 1865 by Rudolf Clausius based on the Greek εντροπία [entropía], a turning toward, from εν- [en-] (in) and τροπή [tropē] (turn, conversion).

Thermodynamics and Information

The term entropy is used in thermodynamics, information theory and mathematics. In classical thermodynamics, entropy is a measure of the amount of energy in a physical system which cannot be used to do mechanical work. Here, the dimension of entropy is energy (joule) divided by temperature (kelvin).

However, if a system of measurement is used in which temperature has been replaced by thermal energy, then entropy is a dimensionless factor. In such a system, the share of a resource (e.g. energy or information) that is not available to any determined or determinable operation is computed by multiplying the measured quantity of that resource (e.g. in joules, or in the pseudo-unit bit) by that factor. "Degree of unavailability" is the translation of "entropy" into plain English.
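
One possible reading of this paragraph, sketched below under the assumption that the dimensionless factor is the entropy of the resource's distribution normalized by its maximum possible value: multiplying a resource quantity by this factor then gives the share treated as unavailable. This is an interpretive sketch, not a standard definition.

    import math

    def unavailability_factor(distribution):
        """Dimensionless factor in [0, 1]: entropy of the distribution's shares,
        normalized by the maximum entropy ln(N). Interpretive sketch only."""
        total = sum(distribution)
        shares = [x / total for x in distribution if x > 0]
        h = -sum(s * math.log(s) for s in shares)
        return h / math.log(len(distribution))

    # Hypothetical example: 100 J spread unevenly over four compartments.
    factor = unavailability_factor([70.0, 10.0, 10.0, 10.0])   # ~0.68
    print(100.0 * factor)   # joules counted as unavailable under this reading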

Disorder

Explaining entropy using the term "disorder" often leads to confusion, because disorder itself can be understood in too many different ways. Example: if entropy is computed from the distribution of a resource in a space, high concentration corresponds to minimum entropy and a completely even distribution corresponds to maximum entropy. Depending on personal views, both states could be seen as desirable states of high order.
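
The contrast can be shown numerically with Shannon entropy as the measure: full concentration of a resource in one cell gives zero entropy, while a perfectly even spread over all cells gives the maximum, ln(N). The cell contents below are made up for illustration.

    import math

    def entropy(distribution):
        """Shannon entropy (in nats) of how a resource is spread over cells."""
        total = sum(distribution)
        return -sum((x / total) * math.log(x / total) for x in distribution if x > 0)

    concentrated = [8.0, 0, 0, 0, 0, 0, 0, 0]   # everything in one cell
    even = [1.0] * 8                            # perfectly even spread

    print(entropy(concentrated))   # 0.0          -> minimum entropy
    print(entropy(even))           # ln(8) ~ 2.08 -> maximum entropy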

Significance of Entropy

The availability (and unavailability) of energy at certain locations and times is significant to the organisms in the biosphere. Understanding entropy is therefore important to humans. The biosphere is an open system, but this openness is bounded. Thus, the consumption of resources (i.e. the increasing unavailability of energy and information) within the system can be compensated by the export of entropy (i.e. the import of availability of energy and information) only to a limited degree. If consumption exceeds this limit, "global warming" may be only one of the various effects of increased entropy. From a human viewpoint, the general effect of increasing entropy production beyond the limits of entropy compensation is the decreasing predictability of processes within the biosphere. Practically this means: the availability of resources becomes less secure.

Redundancy

Redundancy (based on the definition of redundancy in information theory, ISO 2382-16) is the gap between the maximum entropy which a system can attain and the entropy which the system actually exhibits. In the econometrics of welfare economics, the Theil inequality measure is a redundancy, as it expresses the distance between the entropy of a theoretically even resource distribution and the entropy of the measured resource distribution.
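
For N observed shares, the Theil T index can indeed be written as a redundancy: the maximum entropy ln(N) of a perfectly even distribution minus the entropy of the observed shares. The incomes in the sketch below are invented for illustration.

    import math

    def theil_index(values):
        """Theil T index as a redundancy: ln(N) minus the entropy of the shares."""
        total = sum(values)
        shares = [v / total for v in values]
        h = -sum(s * math.log(s) for s in shares if s > 0)
        return math.log(len(values)) - h

    print(theil_index([25, 25, 25, 25]))   # 0.0:   perfectly even, no redundancy
    print(theil_index([1, 1, 1, 97]))      # ~1.22: concentrated, high redundancy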
