Architecture
The basic architecture consists of three types of neuron layers: input, hidden, and output. In feed-forward networks, signals flow strictly in one direction, from the input units to the output units. The processing can extend over multiple layers of units, but no feedback connections are present.
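As a rough illustration of the feed-forward case, the sketch below (Python with NumPy; the layer sizes, random weights, and sigmoid activation are arbitrary choices for the example, not taken from the text) propagates a signal from an input layer through one hidden layer to an output layer, with no feedback connections.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Arbitrary layer sizes for illustration: 3 input, 4 hidden, 2 output units.
    rng = np.random.default_rng(0)
    W_hidden = rng.normal(size=(4, 3))   # input -> hidden weights
    W_output = rng.normal(size=(2, 4))   # hidden -> output weights

    def feed_forward(x):
        # Signals flow strictly forward: input -> hidden -> output.
        h = sigmoid(W_hidden @ x)        # hidden-layer activations
        y = sigmoid(W_output @ h)        # output-layer activations
        return y

    print(feed_forward(np.array([1.0, 0.5, -0.3])))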
Recurrent networks, by contrast, do contain feedback connections, and the dynamical properties of the network become important. In some cases, the activation values of the units undergo a relaxation process, so that the network evolves to a stable state in which the activations no longer change. In other applications, the changes in the activation values of the output neurons over time are themselves significant, so that this dynamical behavior constitutes the output of the network. Other neural network architectures include adaptive resonance theory maps and competitive networks.
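To make the relaxation idea concrete, the following sketch uses a Hopfield-style recurrent network as one possible example (the symmetric weights, the sign activation, and the stored pattern are illustrative assumptions, not specified in the text). Activations are repeatedly fed back through the network until a stable state is reached in which they no longer change.

    import numpy as np

    # Illustrative Hopfield-style recurrent network: symmetric weights, +/-1 units.
    pattern = np.array([1, -1, 1, -1, 1])        # an assumed stored pattern
    W = np.outer(pattern, pattern).astype(float)
    np.fill_diagonal(W, 0.0)                     # no self-connections

    def relax(state, max_steps=100):
        # Feedback loop: each step feeds the activations back through the weights
        # until a fixed point is reached, i.e. the activations stop changing.
        state = state.copy()
        for _ in range(max_steps):
            new_state = np.where(W @ state >= 0, 1, -1)
            if np.array_equal(new_state, state):
                break                            # stable state reached
            state = new_state
        return state

    noisy = np.array([1, 1, 1, -1, 1])           # corrupted version of the pattern
    print(relax(noisy))                          # relaxes back to the stored pattern

In this example the network's answer is the stable state itself; in the other kind of application mentioned above, one would instead read off the trajectory of the output activations over time.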