Neural Network - Architecture

Architecture

The basic architecture consists of three types of neuron layers: input, hidden, and output. In feed-forward networks, the signal flows strictly from the input units to the output units. The data processing can extend over multiple layers of units, but no feedback connections are present.

Recurrent networks, by contrast, do contain feedback connections, and the dynamical properties of the network become important. In some cases, the activation values of the units undergo a relaxation process, so that the network evolves to a stable state in which these activations no longer change.
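To make the feed-forward signal flow concrete, the sketch below passes an input vector through one hidden layer to the output layer. The layer sizes, random weights, and logistic activation are illustrative assumptions rather than details from the text (Python with NumPy):

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Weights connecting input (3 units) -> hidden (4 units) -> output (2 units).
W_hidden = rng.normal(size=(4, 3))
W_output = rng.normal(size=(2, 4))

def forward(x):
    # The signal flows strictly from input to output; there are no feedback connections.
    hidden = sigmoid(W_hidden @ x)
    return sigmoid(W_output @ hidden)

print(forward(np.array([0.5, -0.2, 0.8])))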

In other applications, the change of the activation values of the output neurons over time is itself significant, so that this dynamical behavior constitutes the output of the network. Other neural network architectures include adaptive resonance theory (ART) maps and competitive networks.
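The relaxation process described above can be illustrated with a small Hopfield-style recurrent network, in which the units are repeatedly updated through their feedback connections until the activations stop changing. The stored pattern, the Hebbian weights, and the synchronous sign-update rule are illustrative assumptions (Python with NumPy):

import numpy as np

pattern = np.array([1, -1, 1, -1])            # pattern stored in the weights
W = np.outer(pattern, pattern).astype(float)  # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                      # no self-connections

state = np.array([1, 1, 1, -1])               # noisy starting activations

for step in range(10):
    new_state = np.sign(W @ state)
    new_state[new_state == 0] = 1             # break ties consistently
    if np.array_equal(new_state, state):      # stable state: activations no longer change
        break
    state = new_state

print(state)  # relaxes to the stored pattern [1, -1, 1, -1]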
