Generalization and Stability
Hebb's rule is often generalized as

Δw_i = η x_i y,

that is, the change in the i-th synaptic weight w_i equals a learning rate η times the i-th input x_i times the postsynaptic response y. Often cited is the case of a linear neuron,

y = Σ_j w_j x_j,

and the previous section's simplification takes both the learning rate and the input weights to be 1. This version of the rule is clearly unstable: in any network with a dominant signal, the synaptic weights grow or shrink exponentially. In fact, it can be shown that for any neuron model, Hebb's rule is unstable. Therefore, network models of neurons usually employ other learning theories such as BCM theory, Oja's rule, or the Generalized Hebbian Algorithm.
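The instability, and one standard fix, can be seen in a short numerical sketch. The learning rate, input pattern, and iteration counts below are illustrative choices, not values from the text: a linear neuron is driven repeatedly by one fixed input, once with plain Hebbian updates and once with Oja's rule, which adds a decay term -η y² w that normalizes the weight vector.

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 0.1                      # learning rate (assumed value)
x = np.array([1.0, 0.5])       # one fixed, repeatedly presented input
w_hebb = rng.normal(size=2) * 0.1   # small random initial weights
w_oja = w_hebb.copy()

hebb_norms, oja_norms = [], []
for _ in range(200):
    # Plain Hebb: dw_i = eta * x_i * y.  The component of w along x is
    # multiplied by (1 + eta * |x|^2) each step, so |w| grows exponentially.
    y = w_hebb @ x
    w_hebb += eta * x * y
    hebb_norms.append(np.linalg.norm(w_hebb))

    # Oja's rule: dw_i = eta * y * (x_i - y * w_i).  The extra -eta*y^2*w
    # term keeps |w| bounded; w converges to the unit vector along x.
    y = w_oja @ x
    w_oja += eta * y * (x - y * w_oja)
    oja_norms.append(np.linalg.norm(w_oja))

print(hebb_norms[-1])  # huge: unbounded Hebbian growth
print(oja_norms[-1])   # close to 1: Oja's rule self-normalizes
```

Running the sketch shows the Hebbian weight norm exploding while the Oja weight norm settles near 1, which is exactly why stabilized variants such as Oja's rule or BCM theory are preferred in network models.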