Role in the Statistical Definition of Entropy
Further information: Entropy (statistical thermodynamics)

In statistical mechanics, the entropy S of an isolated system at thermodynamic equilibrium is defined as the Boltzmann constant k times the natural logarithm of W, the number of distinct microscopic states available to the system given the macroscopic constraints (such as a fixed total energy E):

S = k \ln W
This equation, which relates the microscopic details, or microstates, of the system (via W) to its macroscopic state (via the entropy S), is the central idea of statistical mechanics. Such is its importance that it is inscribed on Boltzmann's tombstone.
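To make the formula concrete, here is a minimal Python sketch (not part of the original article) that evaluates S = k ln W numerically; the two-state-spin toy model is an assumed example, and k is taken at its exact SI value.

```python
import math

# Boltzmann constant, exact by the 2019 SI redefinition
k_B = 1.380649e-23  # J/K

def boltzmann_entropy(W: float) -> float:
    """Statistical entropy S = k ln W for W equally probable microstates."""
    return k_B * math.log(W)

# A small system with W = 100 accessible microstates
print(f"S for W = 100: {boltzmann_entropy(100):.3e} J/K")

# Assumed toy model (not from the article): N independent two-state spins
# have W = 2**N microstates, so S = N k ln 2.  Computing N * k * ln 2
# directly avoids forming the astronomically large integer 2**N.
N = 1_000_000
S_spins = N * k_B * math.log(2)
print(f"S for {N:,} two-state spins: {S_spins:.3e} J/K")
```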
The constant of proportionality k serves to make the statistical-mechanical entropy equal to the classical thermodynamic entropy of Clausius:

\Delta S = \int \frac{\mathrm{d}Q}{T}
One could choose instead a rescaled, dimensionless entropy in microscopic terms such that

S' = \ln W, \qquad \Delta S' = \int \frac{\mathrm{d}Q}{kT}
This is a rather more natural form, and this rescaled entropy corresponds exactly to Shannon's subsequent information entropy.
The characteristic energy kT is thus the heat required to increase the rescaled entropy by one nat.
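The connection between kT, the rescaled entropy, and Shannon's entropy measured in nats can be checked numerically. The following Python sketch is illustrative only; the temperature of 300 K and the value W = 1024 are assumed for the example.

```python
import math

k_B = 1.380649e-23  # J/K, exact by the 2019 SI redefinition

T = 300.0            # K, an assumed room-temperature value
kT = k_B * T
print(f"kT at {T:.0f} K: {kT:.3e} J")   # about 4.14e-21 J

# Rescaled (dimensionless) entropy S' = S / k = ln W, measured in nats.
# Reversibly adding heat dQ at temperature T changes S' by dQ / (kT),
# so adding exactly kT of heat raises S' by one nat.
dQ = kT
dS_prime = dQ / (k_B * T)
print(f"Rescaled-entropy increase from adding kT of heat: {dS_prime:.1f} nat")

# The same dimensionless quantity is Shannon's entropy (in nats) of a
# uniform distribution over W microstates: H = ln W.
W = 1024
H_nats = math.log(W)
print(f"H for W = {W} equally likely microstates: {H_nats:.3f} nat")
```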