Gibbs Entropy Formula
The macroscopic state of the system is defined by a distribution on the microstates that are accessible to the system in the course of its thermal fluctuations, so the entropy is defined over two different levels of description of the given system. The entropy is given by the Gibbs entropy formula, named after J. Willard Gibbs. For a classical system (i.e., a collection of classical particles) with a discrete set of microstates, if $E_i$ is the energy of microstate $i$, and $p_i$ is the probability that it occurs during the system's fluctuations, then the entropy of the system is

$$S = -k_\mathrm{B} \sum_i p_i \ln p_i$$
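As a concrete illustration (added here, not part of the original article), a minimal Python sketch of this formula might look as follows; the function name `gibbs_entropy` and the example values are hypothetical choices:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K (exact in the SI)

def gibbs_entropy(probs, k=K_B):
    """Gibbs entropy S = -k sum_i p_i ln p_i for a discrete distribution."""
    # Terms with p_i == 0 contribute nothing, since p ln p -> 0 as p -> 0.
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over W microstates gives S = k_B ln W,
# recovering the Boltzmann entropy as a special case.
W = 4
print(gibbs_entropy([1.0 / W] * W))   # equals K_B * math.log(W)
```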
Entropy changes for systems in a canonical state
A system with a well-defined temperature, i.e., one in thermal equilibrium with a thermal reservoir, has a probability of being in a microstate $i$ given by Boltzmann's distribution:

$$p_i = \frac{e^{-E_i / k_\mathrm{B} T}}{Z}, \qquad Z = \sum_j e^{-E_j / k_\mathrm{B} T}$$
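A short Python sketch of this distribution (an illustration added here, not from the source; the energies and temperature are arbitrary):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_distribution(energies, T, k=K_B):
    """Canonical probabilities p_i = exp(-E_i / kT) / Z."""
    # Shifting all energies by the minimum avoids overflow/underflow
    # without changing the probabilities (the shift cancels in the ratio).
    e0 = min(energies)
    weights = [math.exp(-(E - e0) / (k * T)) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

# A two-level system with a 1e-21 J gap at room temperature.
print(boltzmann_distribution([0.0, 1.0e-21], 300.0))   # ~[0.56, 0.44]
```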
Changes in the entropy caused by changes in the external constraints are then given by:

$$dS = -k_\mathrm{B} \sum_i (\ln p_i + 1)\, dp_i = -k_\mathrm{B} \sum_i \ln p_i\, dp_i = -k_\mathrm{B} \sum_i \left(-\frac{E_i}{k_\mathrm{B} T} - \ln Z\right) dp_i = \frac{1}{T} \sum_i E_i\, dp_i = \frac{1}{T} \left( \sum_i d(E_i p_i) - \sum_i (dE_i)\, p_i \right)$$
where we have twice used the conservation of probability, $\sum_i dp_i = 0$: first to drop the constant $+1$ term, and then to eliminate the $\ln Z$ term.
Now, $\sum_i d(E_i p_i) = d\langle E \rangle$ is the change in the expectation value of the total energy of the system.
If the changes are sufficiently slow, so that the occupation probabilities $p_i$ are undisturbed while the energy levels shift slowly (and reversibly), then $\sum_i (dE_i)\, p_i$ is the expectation value of the work done on the system through this reversible process, $\delta w_\mathrm{rev}$.
But from the first law of thermodynamics, $dE = \delta w + \delta q$. Therefore,

$$dS = \frac{\delta q_\mathrm{rev}}{T}$$
In the thermodynamic limit, the fluctuations of the macroscopic quantities about their average values become negligible; so this reproduces the definition of entropy from classical thermodynamics, given above.
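As a sanity check (added here as an illustration, not from the source), the identity $dS = \delta q_\mathrm{rev}/T$ can be verified numerically for a hypothetical two-level system whose energy levels are held fixed, so that no work is done and all energy exchanged during a small temperature step is heat:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def canonical_probs(energies, T):
    """Boltzmann probabilities at temperature T."""
    w = [math.exp(-E / (K_B * T)) for E in energies]
    Z = sum(w)
    return [x / Z for x in w]

def entropy(p):
    """Gibbs entropy S = -k_B sum_i p_i ln p_i."""
    return -K_B * sum(x * math.log(x) for x in p if x > 0)

def mean_energy(energies, p):
    return sum(E * x for E, x in zip(energies, p))

# Fixed spectrum: dE_i = 0, so no work is done and all energy
# exchanged during a small temperature step is heat.
E = [0.0, 1.0e-21]       # illustrative energies, in joules
T, dT = 300.0, 1.0e-4    # a small, reversible temperature step

p_lo, p_hi = canonical_probs(E, T), canonical_probs(E, T + dT)
dS = entropy(p_hi) - entropy(p_lo)
dq = mean_energy(E, p_hi) - mean_energy(E, p_lo)  # here, delta q = d<E>

print(dS, dq / T)   # the two values agree to several significant figures
```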
The quantity $k_\mathrm{B}$ is a physical constant known as Boltzmann's constant (exactly $1.380649 \times 10^{-23}\ \mathrm{J/K}$ in the SI), which, like the entropy, has units of heat capacity. The logarithm is dimensionless.
This definition remains valid even when the system is far away from equilibrium. Other definitions assume that the system is in thermal equilibrium, either as an isolated system or as a system in exchange with its surroundings. The set of microstates over which the sum is carried out is called a statistical ensemble. Each statistical ensemble (micro-canonical, canonical, grand-canonical, etc.) describes a different configuration of the system's exchanges with the outside, from a completely isolated system to a system that can exchange one or more quantities with a reservoir, such as energy, volume or molecules. In every ensemble, the equilibrium configuration of the system is dictated by the maximization of the entropy of the union of the system and its reservoir, according to the second law of thermodynamics (see the statistical mechanics article).
Neglecting correlations (or, more generally, statistical dependencies) between the states of individual particles will lead to an overestimate of the entropy, as the sketch below illustrates. Such correlations occur in any system of interacting particles, that is, in all systems more complex than an ideal gas.
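A minimal Python illustration of this overestimate (added here, not from the source), using two perfectly correlated two-state particles and the dimensionless ($k_\mathrm{B}$-free) form of the entropy:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i ln p_i in nats (k_B omitted)."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Two perfectly correlated two-state particles: both are in state 0 or
# both in state 1, each with probability 1/2.
joint = [0.5, 0.0, 0.0, 0.5]   # p(00), p(01), p(10), p(11)
marginal = [0.5, 0.5]          # distribution of each particle alone

H_joint = shannon_entropy(joint)               # ln 2  ~ 0.693
H_independent = 2 * shannon_entropy(marginal)  # 2 ln 2 ~ 1.386

# Treating the particles as independent overestimates the entropy.
assert H_independent > H_joint
```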
This $S$ is almost universally called simply the entropy. It can also be called the statistical entropy or the thermodynamic entropy without changing the meaning. Note that the above expression of the statistical entropy is a discretized version of the Shannon entropy. The von Neumann entropy formula, $S = -k_\mathrm{B}\,\mathrm{Tr}(\rho \ln \rho)$, is an extension of the Gibbs entropy formula to the quantum mechanical case.
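A brief Python sketch of the quantum case (an illustration added here, not from the source), computing $-\mathrm{Tr}(\rho \ln \rho)$ in units of $k_\mathrm{B}$ from the eigenvalues of a density matrix:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S = -Tr(rho ln rho), in units of k_B."""
    # For Hermitian rho, Tr f(rho) equals sum_i f(lambda_i) over the
    # eigenvalues; eigenvalues at (numerical) zero contribute nothing.
    lams = np.linalg.eigvalsh(rho)
    return sum(-l * np.log(l) for l in lams if l > 1e-15)

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # a pure state
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit

print(von_neumann_entropy(pure))    # 0.0: pure states carry no entropy
print(von_neumann_entropy(mixed))   # ln 2, matching the Gibbs formula
                                    # for two equally probable microstates
```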
It has been shown that the Gibbs entropy is numerically equal to the experimental entropy.