Gaussian Adaptation - A Theorem of Efficiency For Random Search


The efficiency of Gaussian adaptation relies on the theory of information due to Claude E. Shannon (see information content). When an event occurs with probability P, the information −log(P) may be achieved. For instance, if the mean fitness is P, the information gained for each individual selected for survival will be −log(P) on average, while the work/time needed to get that information is proportional to 1/P. Thus, if efficiency, E, is defined as information divided by the work/time needed to get it, we have:

E = −P log(P).

This function attains its maximum when P = 1/e ≈ 0.37. The same result has been obtained by Gaines with a different method.
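As a quick numerical check (a minimal sketch, not part of the original argument), the following Python snippet evaluates E(P) = −P log(P) on a grid; both the maximizing P and the maximum value come out at 1/e ≈ 0.368.

    import numpy as np

    # Evaluate the efficiency E(P) = -P * log(P) on a fine grid and locate
    # its maximum, which should fall at P = 1/e with value 1/e.
    P = np.linspace(1e-9, 1.0, 1_000_000)
    E = -P * np.log(P)

    i = np.argmax(E)
    print(f"argmax:  P = {P[i]:.4f}   (1/e = {1/np.e:.4f})")
    print(f"maximum: E = {E[i]:.4f}   (1/e = {1/np.e:.4f})")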

E = 0 if P = 0, for a process with infinite mutation rate, and if P = 1, for a process with mutation rate = 0 (provided that the process is alive). This measure of efficiency is valid for a large class of random search processes, provided that certain conditions are met:

1 The search should be statistically independent and equally efficient in different parameter directions. This condition may be approximately fulfilled when the moment matrix of the Gaussian has been adapted for maximum average information to some region of acceptability, because linear transformations of the whole process do not affect efficiency.

2 All individuals have equal cost and the derivative dE/dP at P = 1 is < 0.

Then, the following theorem may be proved:

All measures of efficiency that satisfy the conditions above are asymptotically proportional to −P log(P/q) as the number of dimensions increases, and attain their maximum at P = q exp(−1) (Kjellström, 1996 and 1999).
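Assuming the natural logarithm, the maximizing P follows by setting the derivative to zero:

dE/dP = −log(P/q) − 1 = 0   ⇒   log(P/q) = −1   ⇒   P = q exp(−1).

With q = 1 this reduces to the unconstrained maximum P = 1/e given above.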

A plot of a possible efficiency function for a random search process such as Gaussian adaptation makes the trade-off visible: at the left, where P = 0, the process is maximally chaotic, while at the right, where P = 1, there is perfect order.

In an example by Rechenberg (1971, 1973), a random walk is pushed through a corridor, maximizing the parameter x1. In this case the region of acceptability is defined as a (n − 1)-dimensional interval in the parameters x2, x3, ..., xn, but an x1-value below the last accepted one will never be accepted. Since P can never exceed 0.5 in this case, the maximum speed towards higher x1-values is reached for P = 0.5/e ≈ 0.18, in agreement with the findings of Rechenberg.
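The corridor example can be illustrated with a crude Monte Carlo sketch (an illustration under assumed parameters, not Rechenberg's original analysis): an elitist Gaussian random walk in n = 20 dimensions accepts a trial point only if x2, ..., xn stay inside the corridor and x1 does not decrease. Sweeping the step size sigma, the progress rate in x1 should peak roughly where the measured acceptance probability P is near 0.5/e ≈ 0.18; the dimension, corridor half-width and step sizes below are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def corridor_run(sigma, n=20, half_width=1.0, trials=40_000):
        """Estimate the acceptance probability P and the mean gain in x1 per
        trial for an elitist Gaussian random walk in the corridor |xi| <= half_width,
        i = 2..n, where a step is accepted only if x1 does not decrease."""
        x = np.zeros(n)                      # start on the corridor axis
        accepted, progress = 0, 0.0
        for _ in range(trials):
            y = x + rng.normal(0.0, sigma, n)
            # Accept only if the point stays inside the corridor and x1 advances.
            if np.all(np.abs(y[1:]) <= half_width) and y[0] >= x[0]:
                progress += y[0] - x[0]
                x = y
                accepted += 1
        return accepted / trials, progress / trials

    for sigma in (0.05, 0.10, 0.15, 0.20, 0.30):
        P, rate = corridor_run(sigma)
        print(f"sigma = {sigma:.2f}   P = {P:.3f}   progress per trial = {rate:.4f}")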

A point of view that may also be of interest in this context is that no definition of information (other than that sampled points inside some region of acceptability give information about the extension of the region) is needed for the proof of the theorem. Because the formula may nevertheless be interpreted as information divided by the work needed to get that information, this is also an indication that −log(P) is a good candidate for a measure of information.
