Introduction To Entropy - Explanation

The concept of thermodynamic entropy arises from the second law of thermodynamics, which uses entropy to quantify a system's capacity for spontaneous change (for example, heat flows from a region of higher temperature to one of lower temperature) and to determine whether a given thermodynamic process may occur.

Entropy is defined by two descriptions: first, macroscopically, as a relationship between the heat flow into a system and the temperature at which that heat is added, and second, on a microscopic level, as the Boltzmann constant times the natural logarithm of the number of microstates of the system.

Following the formalism of Clausius, the first definition can be mathematically stated as:

$$dS = \frac{\delta q}{T}$$

where $dS$ is the change in entropy, $\delta q$ is the heat added to the system reversibly, and $T$ is the absolute temperature. If the temperature is allowed to vary, the equation must be integrated over the temperature path. This definition of entropy does not allow the determination of an absolute value, only of differences.
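
As a worked illustration (a sketch under an assumed constant heat capacity $C$, not a case discussed above), integrating the Clausius relation from $T_1$ to $T_2$ gives:

$$\Delta S = \int_{T_1}^{T_2} \frac{C\,dT}{T} = C \ln\frac{T_2}{T_1}$$

For example, reversibly heating 1 kg of water ($C \approx 4186\ \mathrm{J/K}$) from 300 K to 310 K increases its entropy by roughly $4186 \times \ln(310/300) \approx 137\ \mathrm{J/K}$.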

The second definition of entropy comes from statistical mechanics. The entropy of a particular macrostate is defined to be the Boltzmann constant times the natural logarithm of the number of microstates corresponding to that macrostate, or mathematically:

$$S = k_B \ln \Omega$$

where $S$ is the entropy, $k_B$ is the Boltzmann constant, and $\Omega$ is the number of microstates.

The macrostate of a system is what we know about it macroscopically, for example the temperature, pressure, and volume of a gas in a box. For each set of values of temperature, pressure, and volume there are many arrangements of molecules that result in those values; the number of such arrangements is the number of microstates, $\Omega$.
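
To make this concrete, here is a minimal sketch in Python (the two-state "coin" system and the numbers are illustrative assumptions, not from the text above) that counts the microstates of a simple macrostate and evaluates Boltzmann's formula:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """Entropy S = k_B * ln(omega) of a macrostate with omega microstates."""
    return k_B * log(omega)

# Illustrative macrostate: N two-state particles ("coins"), n of them "up".
# Every distinct arrangement with exactly n "up" is one microstate.
N, n = 100, 50
omega = comb(N, n)  # binomial coefficient: number of such arrangements
print(f"Omega = {omega:.4e} microstates")
print(f"S = {boltzmann_entropy(omega):.4e} J/K")
```

The binomial coefficient counts every distinct arrangement of 50 "up" particles among 100; macrostates near an even split can be realized in the most ways and therefore carry the highest entropy.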

The concept of energy is tied to the first law of thermodynamics, which expresses the conservation of energy: heat lost by a thermodynamic system produces a corresponding decrease in its internal energy. Thermodynamic entropy provides a comparative measure of this decrease in the system's internal energy and the corresponding increase in the internal energy of the surroundings at a given temperature. A simple and more concrete visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of spontaneous process: how much energy has flowed, or how widely it has become spread out, at a specific temperature.
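
A minimal numerical sketch makes the dispersal picture quantitative (the heat quantity and reservoir temperatures are assumed for illustration): when heat $Q$ leaves a reservoir at $T_\mathrm{hot}$ and enters one at $T_\mathrm{cold}$, the hot side's entropy falls by $Q/T_\mathrm{hot}$ while the cold side's rises by the larger amount $Q/T_\mathrm{cold}$, so the total change is positive and the flow is spontaneous:

```python
def entropy_change_of_transfer(q, t_hot, t_cold):
    """Total entropy change (J/K) when heat q (J) flows from a reservoir
    at t_hot (K) to a reservoir at t_cold (K)."""
    return q / t_cold - q / t_hot

# Illustrative numbers: 100 J flowing from 400 K to 300 K.
ds = entropy_change_of_transfer(100.0, 400.0, 300.0)
print(f"Total entropy change: {ds:+.4f} J/K")  # +0.0833 J/K, so spontaneous
```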

The term entropy has been extended to describe several related phenomena, depending on the field and the context in which it is used. Information entropy carries the mathematical concepts of statistical thermodynamics into areas of probability theory unconnected with heat and energy.
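
As a brief sketch of that parallel (the probability distributions below are assumed examples), Shannon's information entropy $H = -\sum_i p_i \log_2 p_i$ mirrors the statistical-mechanical formula, with a probability distribution in place of a microstate count:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.469 bits
```

A fair coin maximizes the entropy of a two-outcome distribution, just as the evenly split macrostate in the earlier sketch has the most microstates.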
