Loop Entropy

Loop entropy is the entropy lost upon bringing together two residues of a polymer within a prescribed distance. For a single loop, the entropy varies logarithmically with the number of residues N in the loop:


\Delta S = \alpha k_{B} \ln N

where k_{B} is Boltzmann's constant and \alpha is a coefficient that depends on the properties of the polymer. This entropy formula corresponds to a power-law distribution, P(N) \propto N^{-\alpha}, for the probability of the two residues coming into contact.
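The logarithmic entropy law and its power-law contact probability can be sketched numerically. The snippet below is a minimal illustration, not a specific published model; the value \alpha = 3/2 is an assumption corresponding to an ideal Gaussian chain, and k_B is set to 1 so entropies are in units of k_B.

```python
import math

def loop_entropy_loss(N, alpha=1.5, k_B=1.0):
    """Entropy lost on closing a loop of N residues: dS = alpha * k_B * ln N.

    alpha = 1.5 is the ideal-chain (Gaussian) value, used here purely for
    illustration; real polymers have different coefficients.
    """
    return alpha * k_B * math.log(N)

def contact_probability(N, alpha=1.5):
    """Relative contact probability implied by the entropy cost:
    P ~ exp(-dS / k_B) = N**(-alpha), i.e. a power law in loop length."""
    return N ** (-alpha)

# Hallmark of logarithmic scaling: doubling the loop length adds a
# constant alpha * ln(2) to the entropy loss, regardless of N.
extra = loop_entropy_loss(200) - loop_entropy_loss(100)
```

Note that because the entropy grows only logarithmically, the free-energy cost of closing even very long loops rises slowly, while the contact probability falls off as a power law rather than exponentially.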

The loop entropy may also vary with the position of the contacting residues. Residues near the ends of the polymer are more likely to contact (quantitatively, they incur a lower \Delta S) than those in the middle (i.e., far from the ends), primarily due to excluded volume effects.

See also: Wang–Uhlenbeck entropy
