Entropy

Entropy is a thermodynamic property that measures the amount of a system’s thermal energy per unit temperature that is unavailable for doing useful work. Perhaps the most familiar manifestation of entropy is that, following the laws of thermodynamics, the entropy of an isolated system never decreases, and in heat transfer heat energy flows from higher-temperature components to lower-temperature components. In thermally isolated systems, entropy runs in one direction only; the process is not reversible. One can measure the entropy of a system to determine the energy that is not available for work in a thermodynamic process, such as energy conversion in engines or other machines. Such processes and devices can be driven only by the convertible part of the energy, and they therefore have a theoretical maximum efficiency when converting energy to work. During this work, entropy accumulates in the system, and the energy rendered unavailable is dissipated in the form of waste heat.
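As a brief illustration of the classical definition sketched above (the notation is the conventional one and is introduced here only for illustration), the entropy change of a system along a reversible path is the reversibly exchanged heat divided by the absolute temperature at which the exchange takes place:

    % Clausius definition of an entropy change along a reversible path:
    % \delta Q_{\mathrm{rev}} is the heat exchanged reversibly, T the absolute temperature.
    dS = \frac{\delta Q_{\mathrm{rev}}}{T},
    \qquad
    \Delta S = \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T}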

In classical thermodynamics, the concept of entropy is defined phenomenologically by the second law of thermodynamics, which states that the entropy of an isolated system always increases or remains constant. Thus, entropy is also a measure of the tendency of a process, such as a chemical reaction, to proceed spontaneously in a particular direction. It determines that thermal energy always flows spontaneously, in the form of heat, from regions of higher temperature to regions of lower temperature. Such processes reduce the order of the initial systems, and entropy is therefore an expression of disorder or randomness. This is the basis of the modern microscopic interpretation of entropy in statistical mechanics, where entropy is defined as the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. The second law is then a consequence of this definition and the fundamental postulate of statistical mechanics.
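As a sketch of the statistical-mechanical definition mentioned above (standard notation, introduced only for illustration), entropy can be written in terms of the number of microstates W compatible with the system's thermodynamic specification, or of the probabilities p_i of those microstates:

    % Boltzmann form: W is the number of microstates consistent with the macrostate.
    S = k_B \ln W
    % Gibbs form: p_i is the probability of microstate i; it reduces to the Boltzmann
    % form when all W microstates are equally likely (p_i = 1/W).
    S = -k_B \sum_i p_i \ln p_i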

Thermodynamic entropy has the dimension of energy divided by temperature and is measured in joules per kelvin (J/K) in the International System of Units.
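As a hypothetical worked example of the unit (the numbers are chosen only for illustration): if Q = 300 J of heat is transferred reversibly into a system held at a constant temperature of T = 300 K, its entropy increases by

    \Delta S = \frac{Q}{T} = \frac{300\ \mathrm{J}}{300\ \mathrm{K}} = 1\ \mathrm{J/K}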

The term entropy was coined in 1865 by Rudolf Clausius based on the Greek εντροπία, a turning toward, from εν- (in) and τροπή (turn, conversion).


Famous quotes containing the word entropy:

    Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.
    Václav Havel (b. 1936)