Computer Simulation of Gaussian Adaptation

Thus far the theory only considers mean values of continuous distributions, corresponding to an infinite number of individuals. In reality, however, the number of individuals is always limited, which gives rise to uncertainty in the estimation of m and M (the moment matrix of the Gaussian). This may also affect the efficiency of the process. Unfortunately, very little is known about this, at least theoretically.

The implementation of Gaussian (normal) adaptation on a computer is a fairly simple task. The adaptation of m may be done with one sample (individual) at a time, for example

m(i + 1) = (1 – a) m(i) + ax

where x is a sample that passes the acceptance test, and a < 1 is a suitable constant chosen so that 1/a represents the number of individuals in the population.
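
A minimal NumPy sketch of this running-mean update (the value a = 0.01 and the idea of feeding it only pass samples are illustrative assumptions) could look like:

    import numpy as np

    def update_mean(m, x, a=0.01):
        # Running-mean update m(i+1) = (1 - a) m(i) + a x, applied only to
        # samples x that pass the acceptance test. With a = 0.01, the value
        # 1/a corresponds to an effective population of about 100 individuals.
        return (1.0 - a) * m + a * x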

M may in principle be updated after every step y leading to a feasible point x = m + y, according to

M(i + 1) = (1 – 2b) M(i) + 2b yyᵀ,

where yᵀ is the transpose of y and b << 1 is another suitable constant. In order to guarantee a suitable increase of average information, y should be normally distributed with moment matrix μ²M, where the scalar μ > 1 is used to increase average information (information entropy, disorder, diversity) at a suitable rate. But M will never be used in the calculations. Instead we use the matrix W defined by WWᵀ = M.

Thus, we have y = Wg, where g is normally distributed with the moment matrix μ²U, and U is the unit matrix. W and Wᵀ may be updated by the formulas

W = (1 – b)W + b ygᵀ and Wᵀ = (1 – b)Wᵀ + b gyᵀ

because multiplication gives

M = (1 – 2b)M + 2b yyᵀ,

where terms including b² have been neglected. Thus, M will be indirectly adapted with good approximation. In practice it will suffice to update W only

W(i + 1) = (1 – b)W(i) + b ygᵀ.

This is the formula used in a simple 2-dimensional model of a brain satisfying the Hebbian rule of associative learning; see the next section (Kjellström, 1996 and 1999).
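
A minimal NumPy sketch of the whole adaptation loop, assuming a user-supplied acceptance test criterion(x) and illustrative values for the constants a, b and mu, might look like this:

    import numpy as np

    def gaussian_adaptation(criterion, m, W, a=0.01, b=0.005, mu=1.05, steps=10000):
        # criterion(x) -> True if x is a feasible (pass) point.
        # m is the centre of the Gaussian; W is a factor with W W^T = M.
        n = m.size
        for _ in range(steps):
            g = mu * np.random.standard_normal(n)       # g ~ N(0, mu^2 U)
            y = W @ g                                    # y ~ N(0, mu^2 M)
            x = m + y
            if criterion(x):                             # only pass samples adapt m and W
                m = (1.0 - a) * m + a * x                # m(i+1) = (1 - a) m(i) + a x
                W = (1.0 - b) * W + b * np.outer(y, g)   # W(i+1) = (1 - b) W(i) + b y g^T
                # W W^T then follows M(i+1) = (1 - 2b) M(i) + 2b y y^T,
                # up to neglected terms of order b^2.
        return m, W

    # Toy usage: the feasible region is the inside of the unit circle.
    m0 = np.array([0.5, 0.0])
    W0 = 0.2 * np.eye(2)
    m_final, W_final = gaussian_adaptation(lambda x: float(np.linalg.norm(x)) < 1.0, m0, W0)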

The figure below illustrates the effect of increased average information in a Gaussian p.d.f. used to climb a mountain crest (the two lines represent the contour line). Both the red and the green cluster have equal mean fitness, about 65%, but the green cluster has a much higher average information, making the green process much more efficient. The effect of this adaptation is not very salient in a 2-dimensional case, but in a high-dimensional case the efficiency of the search process may be increased by many orders of magnitude.
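
For orientation, a standard result from information theory suggests why the gain grows with dimension: the average information (entropy) of an n-dimensional Gaussian with moment matrix M is

H = ½ ln[(2πe)ⁿ det M],

so replacing M by μ²M increases H by n ln μ. The benefit of keeping the distribution broad therefore scales with the number of dimensions n, consistent with the remark above about high-dimensional cases.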

