Hebbian Theory - Generalization and Stability

Hebb's rule is often generalized as

$$\Delta w_i = \eta \, x_i \, y,$$

or: the change in the $i$-th synaptic weight $w_i$ is equal to a learning rate $\eta$ times the $i$-th input $x_i$ times the postsynaptic response $y$. Often cited is the case of a linear neuron,

$$y = \sum_j w_j x_j,$$

and the previous section's simplification takes both the learning rate and the input weights to be 1. This version of the rule is clearly unstable: in any network with a dominant signal, the synaptic weights will increase or decrease exponentially. In fact, it can be shown that Hebb's rule is unstable for any neuron model. Network models of neurons therefore usually employ other learning theories, such as BCM theory, Oja's rule, or the Generalized Hebbian Algorithm.
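
To make the instability concrete in the linear case, one can average the update over the input distribution (a standard sketch, writing $C = \langle \mathbf{x}\mathbf{x}^{\top} \rangle$ for the input correlation matrix):

$$\langle \Delta \mathbf{w} \rangle = \eta \, \langle \mathbf{x} \, y \rangle = \eta \, \langle \mathbf{x}\mathbf{x}^{\top} \rangle \mathbf{w} = \eta \, C \, \mathbf{w}.$$

Since $C$ is positive semi-definite, the component of $\mathbf{w}$ along the dominant eigenvector of $C$ grows on average by a factor of $1 + \eta \lambda_{\max}$ per update, i.e. exponentially, and nothing in the rule bounds the weights.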
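
The same point can be seen numerically. Below is a minimal NumPy sketch (the toy input distribution and all parameter values are assumptions chosen for illustration) contrasting the plain Hebbian update with Oja's rule, which adds a decay term $-\eta \, y^2 w_i$ that keeps the weight vector bounded:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup (assumed for illustration): 10 inputs whose standard
    # deviations rise from 1 to 2, so one direction dominates the
    # input correlation matrix C.
    n_inputs, n_steps, eta = 10, 500, 0.01
    scales = np.linspace(1.0, 2.0, n_inputs)

    w_hebb = rng.normal(scale=0.1, size=n_inputs)
    w_oja = w_hebb.copy()

    for _ in range(n_steps):
        x = scales * rng.normal(size=n_inputs)

        # Plain Hebbian update on a linear neuron: dw_i = eta * x_i * y.
        y = w_hebb @ x
        w_hebb += eta * x * y

        # Oja's rule: dw_i = eta * y * (x_i - y * w_i); the decay term
        # pulls the weight vector toward unit length.
        y = w_oja @ x
        w_oja += eta * y * (x - y * w_oja)

    print("Hebb |w| =", np.linalg.norm(w_hebb))  # grows without bound
    print("Oja  |w| =", np.linalg.norm(w_oja))   # settles near 1

After a few hundred updates the Hebbian weight norm has grown by several orders of magnitude, while Oja's rule settles near unit norm along the dominant input direction; the decay term is the only difference between the two updates.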
