Explanation
The concept of thermodynamic entropy arises from the second law of thermodynamics, which uses entropy to quantify a system's capacity for change (for example, heat flows spontaneously from a region of higher temperature to one of lower temperature) and to determine whether a thermodynamic process may occur.
Entropy has two definitions: first, a macroscopic one, in which a system's change in entropy is given by the heat flowing reversibly into the system divided by its temperature; and second, a microscopic one from statistical mechanics, in which entropy is proportional to the natural logarithm of the number of microstates of the system.
Following the formalism of Clausius, the first definition can be stated mathematically as:

dS = δq/T

where dS is the change in entropy, δq is the heat added to the system reversibly, and T is the absolute temperature. If the temperature is allowed to vary, the equation must be integrated over the temperature path. This definition of entropy does not allow the determination of an absolute value, only of differences.
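To make the integration over a temperature path concrete, here is a minimal Python sketch. The function name and the assumption of a constant heat capacity C are illustrative choices, not part of the original text; under that assumption δq = C dT, so the integral of dS = δq/T gives ΔS = C ln(T_final/T_initial):

```python
import math

def entropy_change(heat_capacity, t_initial, t_final):
    # For reversible heating with constant heat capacity C,
    # delta_q = C dT, so integrating dS = delta_q / T from
    # t_initial to t_final gives Delta S = C * ln(t_final / t_initial).
    return heat_capacity * math.log(t_final / t_initial)

# Heating 1 kg of water (C is roughly 4184 J/K) from 300 K to 350 K:
print(entropy_change(4184.0, 300.0, 350.0))  # about 645 J/K
```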
The second definition of entropy comes from statistical mechanics. The entropy of a particular macrostate is defined to be the Boltzmann constant times the natural logarithm of the number of microstates corresponding to that macrostate, or mathematically:

S = kB ln Ω

where S is the entropy, kB is the Boltzmann constant, and Ω is the number of microstates.
The macrostate of a system is what we know about it macroscopically, for example, the temperature, pressure, and volume of a gas in a box. For each set of values of temperature, pressure, and volume, there are many arrangements of the molecules that produce those same values; the number of such arrangements is the number of microstates.
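To see the microstate count in action, consider a toy scenario (our illustration, not from the text): the macrostate is "50 of 100 gas molecules in the left half of a box", and the microstates are the ways of choosing which molecules those are. A short Python sketch applies S = kB ln Ω to that count:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the SI)

def boltzmann_entropy(n_microstates):
    # S = kB * ln(Omega)
    return K_B * math.log(n_microstates)

# Macrostate: 50 of 100 molecules in the left half of the box.
# The number of microstates is the number of ways to choose which
# 50 molecules sit on the left: Omega = C(100, 50).
omega = math.comb(100, 50)
print(boltzmann_entropy(omega))  # about 9.2e-22 J/K
```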
The concept of energy is related to the first law of thermodynamics, which deals with the conservation of energy and under which a loss of heat results in a decrease in the internal energy of the thermodynamic system. Thermodynamic entropy provides a comparative measure of this decrease in the internal energy of the system and the corresponding increase in the internal energy of the surroundings at a given temperature. A simple and more concrete visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of spontaneous process: how much energy has flowed, or how widely it has become spread out, at a specific temperature.
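The dispersal picture can be made quantitative with a small sketch. Assuming two reservoirs large enough that their temperatures stay fixed while heat q flows between them (an idealization chosen for illustration), the total entropy change is q/T_cold - q/T_hot, which is positive whenever heat flows from hot to cold:

```python
def total_entropy_change(q, t_hot, t_cold):
    # The hot reservoir loses entropy q / t_hot; the cold reservoir
    # gains q / t_cold. Since t_cold < t_hot, the net change is positive,
    # consistent with the second law.
    return q / t_cold - q / t_hot

# 1000 J of heat flowing from a body at 400 K to one at 300 K:
print(total_entropy_change(1000.0, 400.0, 300.0))  # about +0.833 J/K
```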
The concept of entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. Information entropy takes the mathematical concepts of statistical thermodynamics into areas of probability theory unconnected with heat and energy.
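As a brief illustration of that connection, the following sketch computes Shannon's information entropy for a discrete probability distribution; the function and the example distributions are our own choices for illustration:

```python
import math

def shannon_entropy(probabilities):
    # H = -sum(p * log2(p)), measured in bits; outcomes with
    # probability 0 contribute nothing and are skipped.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin flip
print(shannon_entropy([0.9, 0.1]))  # about 0.47 bits: a biased coin
```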