Entropy (arrow of time) - Correlations

Correlations

An important difference between the past and the future is that in any system (such as a gas of particles) the initial conditions are usually such that its different parts are uncorrelated, but as the system evolves and its different parts interact with each other, they become correlated. For example, when dealing with a gas of particles, it is always assumed that the initial conditions are such that there is no correlation between the states of different particles (i.e. the locations and speeds of the different particles are completely random, subject only to conforming with the macrostate of the system). This is closely related to the Second Law of Thermodynamics.
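
To make the assumption of uncorrelated initial conditions concrete, here is a minimal Python sketch that samples such a microstate: each particle's location and speed is drawn independently of every other particle, subject only to the macrostate (the particle number, box size, and temperature scale are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n = 1000          # number of particles (illustrative choice)
box = 1.0         # box side length (arbitrary units)
kT_over_m = 1.0   # temperature scale kT/m (arbitrary units)

# Uncorrelated initial conditions: locations uniform over the box,
# velocity components Maxwell-Boltzmann (Gaussian) distributed,
# each particle sampled independently of all the others.
locations = rng.uniform(0.0, box, size=(n, 2))
velocities = rng.normal(0.0, np.sqrt(kT_over_m), size=(n, 2))

# The joint distribution over all particles is a product of identical
# single-particle distributions: no correlations between particles.
```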

Take for example (experiment A) a closed box which is, at the beginning, half-filled with ideal gas. As time passes, the gas expands to fill the whole box, so that the final state is a box full of gas. This is an irreversible process: if the box is full at the beginning (experiment B), it does not become only half-full later, except in the very unlikely situation where the gas particles have very special locations and speeds. But this is precisely because we always assume that the initial conditions are such that the particles have random locations and speeds. This is not correct for the final conditions of the system, because the particles have interacted with each other, so that their locations and speeds have become dependent on each other, i.e. correlated.

This can be understood if we look at experiment A backwards in time, which we will call experiment C: now we begin with a box full of gas, but the particles do not have random locations and speeds; rather, their locations and speeds are so particular that after some time they all move to one half of the box, which is the final state of the system (this is the initial state of experiment A, because now we are looking at the same experiment backwards). The interactions between particles now do not create correlations between the particles, but in fact make their states (at least seemingly) random, "canceling" the pre-existing correlations. The only difference between experiment C (which defies the Second Law of Thermodynamics) and experiment B (which obeys it) is that in the former the particles are uncorrelated at the end, while in the latter the particles are uncorrelated at the beginning.
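
As a numerical illustration of experiments A and C, the sketch below evolves non-interacting particles in a box (so there are no collisions, but the wall reflections are still time-reversible; all parameters are arbitrary illustrative choices). Starting uncorrelated in the left half, the gas spreads over the whole box; reversing every velocity then produces exactly the kind of "special", correlated full-box state of experiment C, which re-contracts into the left half:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n, box, steps, dt = 500, 1.0, 2000, 1e-3

# Experiment A: uncorrelated random states in the left half of the box.
x = rng.uniform(0.0, box / 2, size=(n, 2))
v = rng.normal(0.0, 1.0, size=(n, 2))

def step(x, v, dt):
    """Free flight plus specular (time-reversible) wall reflections."""
    x = x + v * dt
    over, under = x > box, x < 0.0
    x = np.where(over, 2 * box - x, np.where(under, -x, x))
    v = np.where(over | under, -v, v)
    return x, v

for _ in range(steps):
    x, v = step(x, v, dt)
print("left-half fraction after expansion:", np.mean(x[:, 0] < box / 2))

# Experiment C: flip every velocity. The resulting full-box state looks
# random but is highly correlated, and the gas returns to the left half.
v = -v
for _ in range(steps):
    x, v = step(x, v, dt)
print("left-half fraction after reversal: ", np.mean(x[:, 0] < box / 2))
```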

In fact, if all the microscopic physical processes are reversible (see discussion below), then the Second Law of Thermodynamics can be proven for any isolated system of particles with initial conditions in which the particle states are uncorrelated. To do this one must acknowledge the difference between the measured entropy of a system - which depends only on its macrostate (its volume, temperature, etc.) - and its information entropy (closely related to its Kolmogorov complexity), which is the amount of information (number of computer bits) needed to describe the exact microstate of the system. The measured entropy is independent of correlations between particles in the system, because they do not affect its macrostate, but the information entropy does depend on them, because correlations lower the randomness of the system and thus lower the amount of information needed to describe it. Therefore, in the absence of such correlations the two entropies are identical, but otherwise the information entropy will be smaller than the measured entropy, and the difference can be used as a measure of the amount of correlation.
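
A small worked example, assuming two correlated binary degrees of freedom in place of gas particles, shows the gap between the two entropies. Here the sum of the marginal entropies plays the role of the measured (correlation-blind) entropy, the joint Shannon entropy plays the role of the information entropy, and their difference is exactly the mutual information:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability array (zeros ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy joint distribution of two correlated binary "particles":
# they tend to be found in the same state.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

p_a = joint.sum(axis=1)   # marginal of particle A
p_b = joint.sum(axis=0)   # marginal of particle B

H_a, H_b = shannon_entropy(p_a), shannon_entropy(p_b)
H_joint = shannon_entropy(joint.ravel())

print("sum of marginal entropies:  ", H_a + H_b)        # 2.000 bits
print("joint (information) entropy:", H_joint)          # ~1.722 bits
print("mutual information:         ", H_a + H_b - H_joint)  # ~0.278 bits
```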

Now, by Liouville's theorem, time-reversal symmetry of all microscopic processes implies that the amount of information needed to describe the exact microstate of an isolated system (its information-theoretic joint entropy) is constant in time. This joint entropy is equal to the marginal entropy (the entropy assuming no correlations) plus the entropy of correlation (the mutual entropy, which is the negative of the mutual information). If we assume no correlations between the particles initially, then the joint entropy is just the marginal entropy, which is just the initial thermodynamic entropy of the system divided by Boltzmann's constant. However, if these are indeed the initial conditions (and this is a crucial assumption), then correlations form with time. In other words, the mutual entropy decreases (the mutual information increases), and for a time which is not too long the correlations (mutual information) between particles only increase with time. Therefore, since the joint entropy is constant, the marginal entropy - to which the thermodynamic entropy is proportional - must also increase with time. (Note that "not too long" in this context is relative to the time needed, in a classical version of the system, for it to pass through all its possible microstates - a time which can be roughly estimated as τe^S, where τ is the time between particle collisions and S is the system's entropy in units of Boltzmann's constant, so that it is dimensionless. In any practical case this time is huge compared to everything else.)

Note that the correlation between particles is not a fully objective quantity: one cannot measure the mutual entropy itself, only its change, and only assuming one can measure microstates. Thermodynamics is restricted to the case where microstates cannot be distinguished, which means that only the marginal entropy, proportional to the thermodynamic entropy, can be measured, and it, in a practical sense, always increases.
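
The whole argument can be checked numerically with any reversible (bijective) dynamics on a discrete state space. The sketch below uses the Arnold cat map as a stand-in for reversible microscopic dynamics - an illustrative choice, not part of the argument above. Starting from an uncorrelated product state, the joint entropy is exactly conserved step by step, while the marginal entropy (the analogue of the thermodynamic entropy) increases as mutual information builds up:

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

N = 64
p = np.zeros((N, N))
p[:8, :8] = 1.0 / 64.0   # uncorrelated product state: 3 bits per marginal

xs, ys = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")

for t in range(6):
    H_joint = shannon_entropy(p.ravel())
    H_x = shannon_entropy(p.sum(axis=1))
    H_y = shannon_entropy(p.sum(axis=0))
    print(f"t={t}: joint={H_joint:.3f}  marginals={H_x + H_y:.3f}  "
          f"mutual info={H_x + H_y - H_joint:.3f} bits")
    # One step of the Arnold cat map, a reversible (bijective) dynamics
    # on the N x N grid; it merely permutes the probability cells, so
    # the joint entropy is exactly conserved while correlations grow.
    new_p = np.zeros_like(p)
    new_p[(2 * xs + ys) % N, (xs + ys) % N] = p
    p = new_p
```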
